Seven most interesting search marketing news stories of the week

Welcome to our weekly round-up of all the latest news and research from around the world of search marketing and beyond.

This week we have a bountiful collection of news, a heaving trove of stats and a swollen haul of insight from the last seven days.

These adjectives will make more sense in about one headline’s time.

Google AI is improving its conversational skills with… romance novels

Yep.

my fair viking

According to Buzzfeed – YES, that’s where we get our intel from – for the past few months, Google has been feeding text from romance novels into its AI engine, because it has determined that “parsing the text of romance novels could be a great way of enhancing the company’s technology with some of the personality and conversational skills it lacks.”

Buzzfeed also reports that the plan seems to be working: “Google’s research team recently got the AI to write sentences that resemble those in the books.”

So expect your next Google search for a well reviewed local restaurant to include at least 12 synonyms for ‘throbbing’.

Google will launch a redesigned AdWords on May 24th

And you can watch the launch live, if that’s the sort of thing you like doing with your time.

AdWords is being redesigned for the mobile-first market and will fall aesthetically into line with Google’s recently launched 360 suite.

Here’s a sneak peek:

Redesigned AdWords

You can get an early demo during the Google Performance Summit livestream on May 24th at 9:00am PT/12:00pm ET, which you can sign up for here.

Seats are being booked up fast though, so hurry.

Jk, it’s on the internet. You just need to nudge the cat off the sofa.

Half of SEOs are either unaware of AMP or only have a “passing awareness”

As Rebecca Sentance reported this week, SEOs have been slow to implement Google’s accelerated mobile pages (AMP) in the two months since its official launch, despite the promise that AMP is an important ranking signal.

A survey, carried out by SEO PowerSuite, looked at awareness and uptake of AMP among 385 SEO professionals in North America and Europe. Of the respondents surveyed, less than a quarter (23%) had implemented AMP for their mobile sites.

Although general awareness of Accelerated Mobile Pages was high – 75% of the SEO professionals surveyed were aware of AMP – 21% said they were only aware of it “in passing.”

A column graph showing awareness of AMP among SEOs surveyed, with 21% of SEOs aware of AMP "in passing", 35% "have done SOME research" into AMP, 18% "have done A LOT of research" into AMP, while 25% are "not aware" of AMP.

Of those SEOs who hadn’t yet begun to implement AMP on their mobile sites, only 29% said they would do so in the next six months, and 5% of respondents said they had no intention of supporting AMP on their mobile sites whatsoever.

180% increase in websites being hacked in 2015

This week we reported on Google’s fight against webspam in 2015, revealing the following info:

  • An algorithmic update helped reduce the amount of webspam in search results, impacting 5% of queries.
  • Google sent more than 4.3 million messages to webmasters notifying them of manual actions it had imposed on sites affected by spam.
  • Google saw a 33% increase in the number of sites that went through a spam clean-up “towards a successful reconsideration process.”

Most worrying of all was the massive 180% increase in hacking from 2014. If you haven’t already, it’s time to seriously think about the security of your website.

Google is moving all blogspot domain blogs to HTTPS

This week, Google introduced an HTTPS version for every blogspot domain blog, meaning that visitors can access any blogspot blog over an encrypted channel.

https

Google has also removed the HTTPS Availability setting, and all blogs will automatically have an HTTPS version enabled. So you don’t have to do a thing, and you may even get a little traffic boost, as Google uses HTTPS as a ranking signal.

Google has taken action against sneaky mobile redirects

To tackle the trend of websites redirecting mobile users to spammy, unrelated domains, Google has begun issuing manual penalties against sites that sneakily redirect users in this way.

sneaky mobile redirects

If your site has been affected, Google offers help on getting rid of these redirects to clean up your site and hopefully avoid further action.

Moz has introduced a new, free-to-use keyword research tool

The new Keyword Explorer, launched by Moz this week, can take you all the way through the keyword research process. It offers a variety of useful metrics, including an estimate of the relative CTR of organic results, and it surfaces suggestions from almost all of the popular sources used by SEOs.

And best of all, you can run 2 free searches per day without logging in, another five with a free community account, and if you’re a Pro subscriber you already have full access.

keyword explorer

from Search Engine Watch https://searchenginewatch.com/2016/05/06/seven-most-interesting-search-marketing-news-stories-of-the-week/
via Auto Feed

Do bounce rates affect a site’s search engine ranking?

The bounce rate debate continues…

Bounce rates and how they affect a website’s ranking on Google has been discussed, dissected, and dismembered over and over again.

As fully transcribed on this site, a conversation between Rand Fishkin, CEO of Moz, and Andrey Lipattsev, Google’s search quality senior strategist, led to a surprising discussion on click and bounce rates affecting search rankings.

Rand stated that he had recently been running a few experimental tests with crowds of 500 to a couple of thousand people.

Everyone participating was prompted to take out their cellphones, laptops, and digital what-have-yous and perform a specific search. Once the search listing appeared, he had everyone in the crowd click one of the listings at the bottom of the results page and then click away from that site. He then monitored the results over the next few days.

Rand found a whole bunch of inconsistencies. In a little more than half of the experiments, the ranking did change on the search engine results page (SERP), and in a little less than half of the experiments, the rankings did not change.

This raises the question:

Do bounce rates affect a site’s search engine ranking? If so, how much?

Lipattsev believes that for each individual search query in the experiment, the generated interest regarding those specific searches impacts the rankings change rather than just the clicks and bounces.

He said that if a certain topic is gaining a substantial amount of searches and an increase in social media mentions, Google would pay more attention to that rather than a site getting more clicks.

Lipattsev says that it is certainly doable to determine exactly what causes a large rankings jump for an individual listing, but Internet-wide, it is much more difficult.

All this being said, what actually is a bounce rate?

The bounce rate is the percentage of visitors to a particular site who navigate or “bounce” away after only viewing that individual webpage.
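That definition is simple enough to express directly. Here is a minimal Python sketch (purely illustrative, not based on any real analytics API):

```python
# Hypothetical sketch: computing bounce rate from raw session data.
# The session representation here is invented for illustration.

def bounce_rate(sessions):
    """Percentage of sessions that viewed exactly one page."""
    if not sessions:
        return 0.0
    bounces = sum(1 for pages_viewed in sessions if pages_viewed == 1)
    return 100.0 * bounces / len(sessions)

# Each number is the count of pages viewed in one visitor session.
sessions = [1, 3, 1, 5, 1, 2]
print(f"{bounce_rate(sessions):.1f}%")  # 3 of 6 sessions bounced -> 50.0%
```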

Usually, the term ‘bounce rate’ has a negative connotation associated with it. People think that if a visitor only visits one page and then leaves, it’s bad for business. Their logic isn’t that flawed, either. After all, a high bounce rate would indicate that a site does not have the high-quality, relevant content Google wants out of its top ranked sites.

A great Search Engine Journal article shows nine negative reasons why your website could potentially have a high bounce rate, including poor web design, incorrect keyword selection, improper links, and just bad content. It’s true that these high bounce rates can reflect poorly on a website… sometimes.

So, what gives?

Having a high bounce rate on something like a ‘contact us’ page can actually be a good thing. That’s more of a call-to-action page, where the goal is to have the user find the contact information and then actually contact the business. The visitor got what they came for and then left. Extra navigation around the website doesn’t really mean much in this case.

Of course, if your site is more content-driven or offers a product or service, then your goal should be to have a higher click-through rate (CTR) and more traffic to each page.

bouncy castles

But what about Google?

Does Google know your bounce rate and are they using it to affect rankings? This Search Engine Roundtable article provides the short answer (which is “no”).

Many organizations don’t use Google Analytics, so Google has no way of tracking their bounce rate information. And even with the analytics that they can trace, it’s difficult to determine what they actually mean because every situation is different.

There are many factors that go into determining how long a visitor stays on a particular webpage. If a visitor remains on a site for over 20 minutes, they could be so engaged with your site’s content that they can’t even imagine leaving your wonderful webpage… or… it could mean they fell asleep at the screen because your website was so boring. It’s too difficult to tell.

If you are operating one of those websites that should have a lower bounce rate, these tips on lowering that number should be able to help. Some highlights include making sure each of your pages loads quickly, offers user-friendly navigation, avoids cluttered advertisements, and features quality content!

If bounce rates don’t affect Google’s rankings as much as you thought, you might wonder how significant other ranking factors are. Well, Google recently revealed that magical information, narrowing it down to the three top ranking factors used to drive search results:

  • Links: strong links and link votes play a major role in search rankings.
  • Content: having quality content is more important than ever.
  • RankBrain: Google’s AI ranking system.

It’s no shock that links and content matter, but RankBrain is still relatively new. It’s Google’s machine-learning algorithm that helps determine search results (after factoring in links and content). RankBrain filters more complex searches and converts them into shorter ones, all while preserving the meaning of the search, thus refining the results.

Google’s newest AI technology – and whatever other secret technologies they are working on – may resolve the never-ending debate over bounce rates, but it’s certainly going to be a difficult process.

More research is to come. Andrey believes the challenge in making bounce rate and click data a strong, measurable metric is that such data is “gameable,” and Google still has a long way to go.

“If we solve it, good for us,” Andrey said, “but we’re not there yet.”

There is no one-size-fits-all answer when it comes to SEO and all its intricacies. The greatest answer to any SEO question is always “it depends.”

from Search Engine Watch https://searchenginewatch.com/2016/05/04/do-bounce-rates-affect-a-sites-search-engine-ranking/
via Auto Feed

How Google fights webspam and what you need to learn from this

Google has this week revealed its annual report on how it has policed the internet over the last 12 months. Or at least how it policed the vast chunk of the internet it allows on its results pages.

Although it’s self-congratulatory stuff, and as much as you can rightfully argue with some of Google’s recent penalties, you do need to understand what Google is punishing in terms of ‘bad quality’ internet experiences so you can avoid the same mistakes.

It’s important to remember that Google for some people IS the internet, or at least the ‘front door’ to it (sorry Reddit), but it’s equally important to remember that Google is still a product; one that needs to make money to survive and (theoretically) provide the best possible experience for its users, or else it is off to DuckDuckGo they… uh… go.

Google therefore has to ensure the results it serves on its SERPs (search engine results pages) are of the highest possible quality. Algorithms are built, and manual reviews by actual human beings are carried out, to ensure crappy websites with stolen/thin/manipulative/harmful content stay hidden.

Here’s how Google is currently kicking ass and taking names… and how you can avoid falling into its crosshairs.

google webspam

How Google fought webspam

According to Google, an algorithmic update helped reduce the amount of webspam in search results, impacting 5% of queries.

The remaining spam was tackled manually. Google sent more than 4.3 million messages to webmasters notifying them of manual actions it had imposed on sites affected by spam.

Following this, Google saw a 33% increase in the number of sites that went through a spam clean-up “towards a successful reconsideration process.” It’s unclear whether the remaining sites are still in the process of appealing, or have been booted off the face of the internet.

Who watches the watchmen?

More than 400,000 spam reports were manually submitted by Google users around the world. Google acted on 65% of them, and considered 80% of those acted upon to be spam.

Hacking

There was a huge 180% increase in websites being hacked in 2015, compared to the previous year. Hacking can take a number of guises, whether it’s website spam or malware, but the result will be the same: you’ll be placed ‘in quarantine’ and your site will be flagged or removed.

Google has a number of official guidelines on how to help avoid being hacked. These include:

  • Strengthen your account security with lengthy, difficult-to-guess passwords, and don’t reuse those passwords across platforms.
  • Keep your site’s software updated, including its CMS and various plug-ins.
  • Research how your hosting provider handles security issues and check its policy when it comes to cleaning up hacked sites. Will it offer live support if your site is compromised?
  • Use tools to stay informed of potential hacked content on your site. Signing up to Search Console is a must, as it’s Google’s way of communicating any site issues with you.

google spam fighting

Thin, low quality content

Google saw an increase in the number of sites with thin, low quality content, a substantial amount likely to be provided by scraper sites.

Unfortunately there is very little you can do if your site is being scraped, as Google has discontinued its reporting tool and believes this problem to be your own responsibility. You just have to be confident that your own site’s authority, architecture and remaining content are enough to ensure it ranks higher than a scraper site.

If you have been served a manual penalty for ‘thin content with little or no added value’ there are things you can do to rectify it, which can mostly be boiled down to ‘stop making crappy content, duh’.

1) Start by checking your site for the following:

  • Auto-generated content: content that reads like it was written by a piece of software, because it probably was.
  • Thin content pages with affiliate links: affiliate links in quality articles are fine, but pages where the affiliates contain descriptions or reviews copied directly from the original retailer without any added original content are bad. As a rule, affiliates should form only a small part of the content of your site.
  • Scraped content: if you’re a site that automatically scrapes and republishes entire articles from other websites without permission then you should just flick the off-switch right away.
  • Doorway pages: pages which can appear multiple times in a particular query’s search results but ultimately lead users to the same destination. The purpose of doorway pages is purely to manipulate rankings.

2) Chuck them all in the bin.

3) If after all that you’re 100% sure your site somehow offers value, then you can resubmit to Google for reconsideration.

For more information on Google’s fight against webspam, read its official blog post.

And finally, I’ll leave you with this terrifying vision of things to come…

robots and people

from Search Engine Watch https://searchenginewatch.com/2016/05/04/how-google-fights-webspam-and-what-you-need-to-learn-from-this/
via Auto Feed

Announcing Keyword Explorer: Moz’s New Keyword Research Tool

Posted by randfish

A year ago, in April of 2015, I pitched a project internally at Moz to design and launch a keyword research tool, one of the few areas of SEO we’ve never comprehensively tried to serve. The pitch took effort and cajoling (the actual, internal pitch deck is available here), but eventually received approval, with one big challenge… We had to do it with a team already dedicated to the maintenance and development of our rankings collections and research tools. This project wouldn’t get additional staffing — we had to find a way to build it with only the spare bandwidth of this crew.

Sure, we didn’t have the biggest team, or the ability to work on the project free from our other obligations, but we had grit. We had passion. We wanted to prove ourselves to our fellow Mozzers and to our customers. We had pride. And we desperately wanted to build something that wasn’t just “good enough,” but was truly great. Today, I think we’ve done that.

Meet our new keyword research tool, Keyword Explorer:

If you want to skip hearing about it and just try it out, head on over. You can run 2 free searches/day without even logging in, another 5 with a free community account, and if you’re a Pro subscriber, you’ve already got access. For those who want to learn more, read on!

The 5 big, unique features of Keyword Explorer

Keyword Explorer (which we’ve taken to calling “KWE” for short) has lots of unique features, metrics, and functionality, but the biggest ones are pretty obvious and, we believe, highly useful:

  1. KWE takes you all the way through the keyword research process — from discovering keyword ideas to getting metrics to building a list, filtering the keywords on it, and prioritizing which ones to target based on the numbers that matter.
  2. KWE features metrics essential to the SEO process — two you’re familiar with — Volume and Difficulty — and three that are less familiar: Opportunity, Importance, and Potential. Opportunity estimates the relative CTR of the organic web results on a SERP. Importance is a metric you can modify to indicate a keyword that’s more or less critical to your campaign/project. And Potential is a combination of all the metrics built to help you prioritize a keyword list.
  3. Our volume score is the first volume estimation metric we know of that goes beyond what AdWords reports. We do that using Russ Jones’ volume bucket methodology and adding in anonymized clickstream data from ~1 million real searchers in the US. From there, Russ has built a model that predicts the search volume range a keyword is likely to have with ~95% accuracy.
  4. Keyword suggestions inside KWE come from almost all the sources we saw SEOs accessing manually in their research processes — Keyword Planner data, Google Suggest, Related Searches, other keywords that the ranking pages also ranked for, topic-modeling ideas, and keywords found from our clickstream data. All of these are available in KWE’s suggestions.
  5. Import and export functionality are strongly supported. If you’ve already got a list of keywords and just want KWE’s metrics, you can easily upload that to us and we’ll fetch them for you. If you like the KWE process and metrics, but have more you want to do in Excel, we support easy, powerful, fast exports. KWE is built with power users in mind, so go ahead and take advantage of the tool’s functionality however works best with your processes.

These five are only some of the time-saving, value-adding features in the tool, but they are, I think, enough to make it worthwhile to give Keyword Explorer a serious look.

A visual walkthrough

As an experiment, I’ve created a visual, slide-by-slide walkthrough of the tool. If you’d rather *see* vs. read the details, this format might be for you:

The Power of Moz’s Keyword Explorer from Rand Fishkin

 

And, for those of you who prefer video, we made a short, 2 minute demo of the tool in that format, too:

 

Of course, there’s a ton of nuance and complexity in a product like this, and given Moz’s dedication to transparency, you can find all of that detail in the more thorough explanation below.

Keyword Explorer’s metrics

KWE’s metrics are among the biggest data-driven advances we’ve made here at Moz, and a ton of credit for that goes to Dr. Pete Meyers and Mr. Russ Jones. Together, these two have crafted something extraordinary — unique metrics that we’ve always needed for SEO-based keyword research, but never had before. Those include:

Keyword volume ranges

Nearly every keyword research tool available uses a single source for volume data: Google AdWords’ Keyword Planner. We all know from studying it that the number AdWords provides is considerably off from reality, and last year, Moz’s Russ Jones was able to quantify those discrepancies in his blog post: Keyword Planner’s Dirty Secrets.

Since we know that Google’s numbers don’t actually have precision, but do indicate a bucket, we realized we could create ranges for volume and be significantly more accurate, more of the time. But, that’s not all… We also have access to anonymized clickstream data here at Moz, purchased through a third-party (we do NOT collect or use any of our own user data via, for example, the MozBar), that we were able to employ in our new volume ranges.

Using sampling, trend data, and the number of searchers and searches for a given keyword from the clickstream, combined with AdWords’ volume data, we produced a volume range that, in our research, showed ~95% accuracy with the true impression counts Google AdWords would report for a keyword whose ad showed during a full month.

We’re pretty excited about this model and the data it produces, but we know it’s not perfect yet. As our clickstream data grows, and our algorithm for volume improves, you should see more and more accurate ranges in the tool for a growing number of keywords. Today, we have volume data on ~500mm (half a billion) English-language search queries. But, you’ll still see plenty of “no data” volume scores in the tool as we can access considerably more terms and phrases for keyword suggestions (more on suggestion sources below).

NOTE: KWE uses volume data modeled on the quantity of searches in the US for a given term/phrase (global English is usually 1.5-3X those numbers). Thus, while the tool can search any Google domain in any country, the volume numbers will always be for US-volume. In the future, we hope to add volume data for other geos as well.

An upgraded, more accurate Keyword Difficulty score

The old Keyword Difficulty tool was one of Moz’s most popular (it’s still around for another month or so, but will be retired soon in favor of Keyword Explorer). But, we knew it had a lot of flaws in its scoring system. For Keyword Explorer, we invested a lot of energy in upgrading the model. Dr. Pete, Dr. Matt Peters, myself, and Russ had 50+ reply email threads back and forth analyzing graphs, suggesting tweaks, and tuning the new score. Eventually, we came up with a Keyword Difficulty metric that:

  • Has far more variation than the old model — you’ll see way more scores in the 20s and 30s as well as the 80s and 90s than the prior model, which put almost every keyword between 50–80.
  • Accounts for pages that haven’t yet been assigned a PA score by using the DA of the domain.
  • Employs a smarter, CTR-curve model to show when weaker pages are ranking higher and a page/site may not need as much link equity to rank.
  • Adjusts for a few domains (like Blogspot and WordPress) where DA is extremely high, but PA is often low and the inherited domain authority shouldn’t pass on as much weight to difficulty.
  • Concentrates on however many results appear on page 1, rather than the top 20 results.

This new scoring model matches better with my own intuition, and I think you’ll find it vastly more useful than the old model.

As you can see from one of my lists above (for Haiku Deck, whose board I joined this year), the difficulty ranges are considerably higher than in the past, and more representative of how relatively hard it would be to rank in the organic results for each of the queries.

A true Click-Through Rate Opportunity score

When you look at Google’s results, it’s pretty clear that some keywords are worthy of pursuit in the organic web results, and some are not. To date, no keyword research tool we know of has attempted to accurately quantify that, but it’s a huge part of determining the right terms and phrases to target.

Once we had access to clickstream data, we realized we could accurately estimate the percent of clicks on a given search result based on the SERP features that appeared. For example, a classic, “ten-blue-links” style search result had 100% of click traffic going to organic results. Put a block of 4 AdWords ads above it, though, and that dropped by ~15%. Add a knowledge graph to the right-hand side and another ~10% of clicks are drawn away.

It would be crazy to treat the prioritization of keywords with loads of SERP features and little CTR on the organic results the same as a keyword with few SERP features and tons of organic CTR, so we created a metric that accurately estimates Click-Through-Rate (CTR), called “Opportunity.”
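The mechanics described above can be sketched as a simple subtraction model. Only the ads (~15%) and knowledge-graph (~10%) figures come from the article; every other value and feature name below is a placeholder assumption:

```python
# Sketch of the Opportunity idea: start at 100% organic CTR and
# subtract an estimated click share for each SERP feature present.

CTR_LOSS = {
    "ads_top_block": 15,    # per the article: ~15% of clicks
    "knowledge_graph": 10,  # per the article: ~10% of clicks
    "instant_answer": 20,   # assumed
    "news_results": 8,      # assumed
    "image_pack": 8,        # assumed
}

def opportunity_score(serp_features):
    """Estimated % of clicks left for the organic results (0-100)."""
    remaining = 100 - sum(CTR_LOSS.get(f, 0) for f in serp_features)
    return max(remaining, 0)

print(opportunity_score([]))                                    # ten blue links: 100
print(opportunity_score(["ads_top_block", "knowledge_graph"]))  # -> 75
```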

The search above for “Keanu” has an instant answer, knowledge graph, news results, and images (further down). Hence, its Opportunity Score is a measly 37/100, which means our model estimates ~37% of clicks go to the organic results.

But, this search, for “best free powerpoint software” is one of those rare times Google is showing nothing but the classic 10 blue links. Hence, its Opportunity Score is 100/100.

If you’re prioritizing keywords to target, you need this data. Choosing keywords without it is like throwing darts with a blindfold on — someone’s gonna get hurt.

Importance scores you can modify

We asked a lot of SEOs about their keyword research process early in the design phases of Keyword Explorer and discovered pretty fast that almost everyone does the same thing. We put keyword suggestions from various sources into Excel, get metrics for all of them, and then assign some type of numeric representation to each keyword based on our intuition about how important it is to this particular campaign, or how well it will convert, or how much we know our client/boss/team desperately wants to rank for it.

That self-created score was then used to help weight the final decision for prioritizing which terms and phrases to target first. It makes sense. You have knowledge about keywords both subjective and objective that should influence the process. But it needs to do so in a consistent, numeric fashion that flows with the weighting of prioritization.

Hence, we’ve created a toggle-able “Importance” score in Keyword Explorer:

After you add keywords to a list, you’ll see the Importance score is, by default, set to 3/10. We chose this number to make it easy to increase a keyword’s importance by 3X and easy to bring it down to 1/3rd. As you modify the importance value, overall Keyword Potential (below) will change, and you can re-sort your list based on the inputs you’ve given.

For example, in my list above, I set “free slideshow software” to 2/10, because I know it won’t convert particularly well (the word “free” often does not). But, I also know that churches and religious organizations love Haiku Deck and find it hugely valuable, so I’ve bumped up the importance of “worship presentation software” to 9/10.

Keyword Potential

In order to prioritize keywords, you need a metric that combines all the others — volume, difficulty, opportunity, and importance — with a consistent, sensible algorithm that lets the best keywords rise to the top. In Keyword Explorer, that metric is “Potential.”

Sorting by Potential shows me keywords that have lots of search volume, relatively low difficulty, relatively high CTR opportunity, and uses my custom importance score to push the best keywords to the top. When you build a list in Keyword Explorer, this metric is invaluable for sorting the wheat from the chaff and identifying the terms and phrases with the most promise.
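Moz hasn’t disclosed how Potential is actually computed, but a hypothetical blend of the four metrics, just to show how such a sortable score could be built, might look like this:

```python
# Purely illustrative: Moz's real Potential weighting is not public.
# This geometric-style blend is an invented stand-in.

def potential(volume_mid, difficulty, opportunity, importance=3):
    """Blend the four KWE metrics into one sortable score.

    volume_mid: midpoint of the estimated monthly volume range
    difficulty: 0-100 (higher = harder, so it is inverted)
    opportunity: 0-100 estimated organic CTR share
    importance: 1-10 user-set weight (default 3, as in KWE lists)
    """
    volume_score = min(volume_mid / 10_000, 1.0) * 100  # cap huge head terms
    ease = 100 - difficulty
    base = (volume_score * ease * opportunity) ** (1 / 3)  # geometric mean
    return round(base * importance / 3)  # importance scales the result

keywords = {
    "worship presentation software": potential(2_000, 40, 90, importance=9),
    "free slideshow software": potential(8_000, 55, 70, importance=2),
}
for kw, score in sorted(keywords.items(), key=lambda kv: -kv[1]):
    print(kw, score)
```

Note how the user-set importance can push a lower-volume keyword above a higher-volume one, which matches the Haiku Deck example above.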

Keyword research & the list building process

Keyword Explorer is built around the idea that, starting from a single keyword search, you can identify suggestions that match your campaign’s goals and include them in your list until you’ve got a robust, comprehensive set of queries to target.

List building is easy — just select the keywords you like from the suggestions page and use the list selector in the top right corner (it scrolls down as you do) to add your chosen keywords to a list, or create a new list:

Once you’ve added keywords to a list, you can go to the lists page to see and compare your sets of keywords:

Each individual list will show you the distribution of metrics and data about the keywords in it via these helpful graphs:

The graphs show distributions of each metric, as well as a chart of SERP features to help illustrate which types of results are most common in the SERPs for the keywords on your list:

For example, you can see in my Rock & Grunge band keywords, there’s a lot of news results, videos, tweets, and a few star reviews, but no maps/local results, shopping ads, or sitelinks, which makes sense. Keyword Explorer is using country-level, non-personalized, non-geo-biased results, and so some SERPs won’t match perfectly to what you see in your local/logged-in results. In the future, we hope to enable even more granular location-based searches in the tool.

The lists themselves have a huge amount of flexibility. You can sort by any column, add, move, or delete in bulk, filter based on any metric, and export to CSV.

If your list gets stale, and you need to update the metrics and SERP features, it’s just a single click to re-gather all the data for every keyword on your list. I was particularly impressed with that feature; to me it’s one of the biggest time-savers in the application.

Keyword Explorer’s unique database of search terms & phrases

No keyword research tool would be complete without a massive database of search terms and phrases, and Keyword Explorer has just that. We started with a raw index of over 2 billion English keywords, then whittled that down to the ~500 million highest-quality ones we felt relatively confident had real volume (we collapsed lots of odd suggestions found via iterative crawls of AdWords, autosuggest, related searches, Wikipedia titles, topic modeling extractions, SERPscape (via our acquisition last year), and more).

Keyword Explorer’s suggestions corpus features six unique filters to get back ideas. We wanted to include all the types of keyword sources that SEOs normally have to visit many different tools to get, all in one place, to save time and frustration. You can see those filters at the top of the suggestions page:

The six filters are:

  1. Include a Mix of Sources
    • This is the default filter and will mix together results from all the others, as well as ideas crawled from Google Suggest (autocomplete) and Google’s Related Searches.
  2. Only Include Keywords With All of the Keyword Terms
    • This filter will show only suggestions that include all of the terms you’ve entered in the query. For example, if you entered “mustache wax” this filter would only show suggestions that contain both the word “mustache” and the word “wax.”
  3. Exclude Your Query Terms to Get Broader Ideas
    • This filter will show only suggestions that do not include your query terms. For example, if you entered “mustache wax,” suggestions might include “facial grooming products” or “beard oil” but nothing with either “mustache” or “wax.”
  4. Based on Closely Related Topics
    • This filter uses Moz’s topic modeling algorithm to extract terms and phrases we found on many web pages that also contained the query terms. For example, keywords like “hair gel” and “pomade” were found on many of the pages that had the words “mustache wax” and thus will appear in these suggestions.
  5. Based on Broadly Related Topics and Synonyms
    • This filter expands upon the topic modeling system above to include synonyms and more broadly related keywords for a more iterative extraction process and a wider set of keyword suggestions. If “Closely Related Topics” suggestions are too far afield for what you’re seeking, this filter often provides better results.
  6. Related to Keywords with Similar Results Pages
    • This filter looks at the pages that ranked highly for the query entered and then finds other search terms/phrases that also contained those pages. For example, many pages that ranked well for “mustache wax” also ranked well for searches like “beard care products” and “beard conditioner” and thus, those keywords would appear in this filter. We’re big fans of SEMRush here at Moz, and this filter type shows suggestions very similar to what you’d find using their competitive dataset.
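To make the mechanics of the two simpler term-based filters concrete, here's a minimal Python sketch. This is invented purely for illustration (it is not Moz's implementation, and the function names and sample suggestions are my assumptions); the topic-modeling and SERP-similarity filters involve far more than word matching.

```python
# Hypothetical sketches of two suggestion filters (not Moz's real code):
# "Only Include Keywords With All of the Keyword Terms" and
# "Exclude Your Query Terms to Get Broader Ideas".

def include_all_terms(query, suggestions):
    """Keep only suggestions containing every word of the query."""
    terms = query.lower().split()
    return [s for s in suggestions if all(t in s.lower().split() for t in terms)]

def exclude_query_terms(query, suggestions):
    """Keep only suggestions containing none of the query's words."""
    terms = set(query.lower().split())
    return [s for s in suggestions if not terms & set(s.lower().split())]

suggestions = [
    "mustache wax tips",
    "best wax",
    "beard oil",
    "facial grooming products",
]
print(include_all_terms("mustache wax", suggestions))   # -> ['mustache wax tips']
print(exclude_query_terms("mustache wax", suggestions)) # -> ['beard oil', 'facial grooming products']
```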

Some of my favorite, unique suggestions come from the “closely related topics” filter, which uses that topic modeling algorithm and process. Until now, extracting topically related keywords required using something like Alchemy API or Stanford’s topic modeling software combined with a large content corpus, aka a royal pain in the butt. The KWE team, mostly thanks to Erin, built a suitably powerful English-language corpus, and you can see how well it works:

NOTE: Different filters will work better and worse on different types of keywords. For newly trending searches, topic modeling results are unlikely to be very good, and on longer tail searches, they’re not great either. But for head-of-demand-curve and single word concepts, topic modeling often shows really creative lexical relationships you wouldn’t find elsewhere.

SERPs Analysis

The final feature of Keyword Explorer I’ll cover here (there are lots of cool nooks and crannies I’ve left for you to find on your own) is the SERPs Analysis. We’ve broadened the ability of our SERP data to include all the features that often show up in Google’s results, so you’ll see a page much more representative of what’s actually in the keyword SERP:

Holy smack! There are only 3 — yes, THREE — organic results on page one for the query “Disneyland.” The rest is sitelinks, tweets, a knowledge graph, news listings, images — it’s madness. But, it’s also well-represented in our SERPs Analysis. And, as you can see, the Opportunity score of “7” effectively represents just how little room there is for organic CTR.

Over time, we’ll be adding and supporting even more features on this page, and trying to grab more of the metrics that matter, too (for example, after Twitter pulled their tweet counts, we had to remove those from the product and are working on a way to get them back).

Yes, you can buy KWE separately (or get it as part of Moz Pro)

Keyword Explorer is the first product in Moz Pro to be available sold separately. It’s part of the efforts we’ve been making with tools like Moz Local, Followerwonk, and Moz Content to offer our software independently rather than forcing you to bundle if you’re only using one piece.

If you’re already a Moz Pro subscriber, you have access to Keyword Explorer right now! If you’re not a subscriber and want to try it out, you can run a few free queries per day (without list building functionality though). And, if you want to use Keyword Explorer on its own, you can buy it for $600/year or $1,800/year depending on your use.

The best part of Keyword Explorer — we’re going to build what you want

There’s lots to like in the new Keyword Explorer, but we also know it’s not complete. This is the first version, and it will certainly need upgrades and additions to reach its full potential. That’s why, in my opinion, the best part of Keyword Explorer is that, for the next 3–6 months, the team that built this product is keeping a big part of their bandwidth open to do nothing but make feature additions and upgrades that YOU need.

It was pretty amazing to have the team’s schedule for Q2 and Q3 of 2016 make the top priority “Keyword Explorer Upgrades & Iterations.” And, in order to take advantage of that bandwidth, we’d love to hear from you. We have dozens (maybe hundreds) of ideas internally of what we want to add next, but your feedback will be a huge part of that. Let us know through the comments below, by tweeting at me, or by sending an email to Rand at Moz.com.

A final note: I want to say a massive thanks to the Keyword Explorer team, who volunteered to take on much more than they bargained for when they agreed to work with me :-) Our fearless, overtime-investing, never-complaining engineers — Evan, Kenny, David, Erin, Tony, Jason, and Jim. One of the best designers I’ve ever worked with — Christine. Our amazingly on-top-of-everything product manager — Kiki. Our superhero-of-an-engineering-manager — Shawn. Our bug-catching SDETs — Uma and Gary. Our product marketing liaison — Brittani. And Russ & Dr. Pete, who helped with so many aspects of the product, metrics, and flow. You folks all took time away from your other projects and responsibilities to make this product a reality. Thank you.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from The Moz Blog http://tracking.feedpress.it/link/9375/3231846
via Auto Feed

Why are SEOs slow to implement Accelerated Mobile Pages?

It’s been just over two months since Google launched Accelerated Mobile Pages (AMP), its super-fast brand of mobile webpages running on an amped-up version of HTML.

Accelerated Mobile Pages are designed to speed up the experience of browsing the mobile web, providing page load times which are anywhere from 15 to 85% faster than regular mobile pages. We know that site speed has been a signal in Google’s search ranking algorithms since 2010, and that Google has repeatedly given ranking preference to sites which are optimised for mobile.

Given these two facts, together with the fact that AMP is Google’s own initiative, it isn’t much of a stretch to conclude that AMP websites are likely to rank much better in search than sites with ordinary mobile webpages.

Yet the first survey conducted since the advent of AMP has found that uptake of AMP among professional SEOs is still relatively low. The survey, carried out by SEO PowerSuite, looked at awareness and uptake of AMP among a pool of 385 SEO professionals in North America and Europe.

Of the respondents surveyed, less than a quarter (23%) had taken concrete steps to implement AMP on their mobile sites since the feature’s launch.

Overall awareness of Accelerated Mobile Pages among respondents was high: 75% of the SEO professionals surveyed were aware of AMP. But of these, 21% said they were only aware of the existence of AMP “in passing”, meaning that altogether, nearly half of SEOs (46%) were either unaware of AMP or had only a passing awareness of the feature.

A column graph showing awareness of AMP among SEOs surveyed, with 21% of SEOs aware of AMP "in passing", 35% "have done SOME research" into AMP, 18% "have done A LOT of research" into AMP, while 25% are "not aware" of AMP.

Of those SEOs who hadn’t yet begun to implement AMP on their mobile sites, a high proportion (42%) intended to do more research before making definitive plans. 29% said they had plans to implement AMP in the next six months, and only 5% of respondents said they had no intention of supporting AMP on their mobile sites whatsoever.

The evidence suggests that AMP is fertile ground for getting ahead in search, and a full 80% of survey respondents believed that AMP would have a significant (49%) or moderate (31%) effect on search rankings. So why have SEOs mostly held back from implementing AMP on their mobile sites so far?

A column graph showing the implementation plans for AMP among SEOs. 23% of SEOs are "currently implementing" AMP, 29% are planning to implement AMP in the next 6 months, 42% are "researching" AMP and 5% "have no plans to support" AMP.

Creating an AMP version of your mobile site sounds like a concrete and straightforward way to get ahead in search results, but in practical terms it’s easier said than done. As Damon Kiesow, Head of Product at publishing company McClatchy, told Nieman Lab:

“Everything we know about building a webpage we have to relearn. But we’re relearning it from the premise of converting a current product over, not creating a product from scratch. It’s a fairly complex process!”

Because AMP strips out a lot of the dynamic elements that slow down page loading time, embracing AMP might also mean that SEOs and search marketers have to do away with features that they depend on for business, such as comment systems, lead capture forms and other types of pop-up.

There’s also the very obvious fact that Google’s roll-out of AMP isn’t all that widespread yet. Accelerated Mobile Pages still don’t show up in search results outside of Google.com, meaning that many of the non-US respondents to SEO PowerSuite’s survey may be deliberately holding fire until AMP makes a difference to search results in their country.

A screenshot of Google.com mobile results for "EU referendum", showing AMP-ified BBC News stories in the "top stories" carousel at the top of search results.
Accelerated Mobile Pages have yet to make an appearance in search results outside of Google.com

And of course, there is the occasionally-forgotten fact that the world of search consists of more than just Google. In a situation where implementing AMP would take a lot of time and resources, SEOs may be hesitant to go all-in on a feature that will only affect the standing of their mobile site on Google, especially if they market to a country which favours another major search engine, such as China or Russia.

Ultimately, SEOs have to weigh up the potential benefits of getting in on AMP ahead of their competitors and possibly securing a better spot on the Google SERP versus the drawbacks and costs of implementing the new protocol. SEO PowerSuite noted in their published results of the survey that, “the delay in quick adoption [of AMP] offers an opportunity for agile marketers to get ahead of their competition in mobile search by implementing AMP immediately.”

They pointed out that getting in early with AMP has the potential to be beneficial for a long time thereafter, because “As any SEO professional working to overtake competitors knows, Google’s institutional memory is long. It can be difficult to get the search behemoth to “forget” (i.e. to stop ranking) brands it has mentally defined as industry leaders and therefore deserving of higher ranking because of AMP support.”

Therefore, investing resources in AMP at this stage could allow SEOs and search marketers to reap the rewards further down the line. It’s still early days, and with relatively few SEOs apparently having staked their claim with AMP so far, the field is wide open for others to make a move if they judge it to be worthwhile.

For lots more valuable insight on the changing face of digital marketing, attend our two-day Shift London event in May.

from Search Engine Watch https://searchenginewatch.com/2016/05/03/why-are-seos-slow-to-implement-accelerated-mobile-pages/
via Auto Feed

Hacking Your Way to 5x Higher Organic Click-Through Rates (and Better Conversion Rates & Rankings, too)

Posted by larry.kim

[Estimated read time: 13 minutes]

Last month we discussed why organic CTR is kind of a big deal. I believe that click-through rate is tremendously valuable and that achieving above-average CTRs can lead to better rankings, particularly on long tail queries.

But even if you don’t believe click-through rate can impact rankings, optimizing for a higher CTR still means you’re optimizing toward the goal of attracting more clicks. More clicks means more traffic and higher conversion rates — because if you can make people more worked up about your product/solution, that carries through to conversions, leads, and sales.

All great, important things!

So what the heck — why isn’t every SEO obsessed with raising organic click-through rates the way I and many other PPC marketers are?

Image of a unicorn on a purple background. "Always be yourself. Unless you can be a unicorn. Then be a unicorn."

Why isn’t CTR optimization a bigger deal in organic search today?

For starters, it’s ridiculously hard to tell what your organic CTR is for a keyword. Thanks, Google.

In the Search Analytics section of the Search Console, Google only gives you a sampling of 1,000 queries. Because you only have access to a sample of keywords, you can’t look up the CTR for any arbitrary keyword you choose.

It’s much easier to find out your CTR in paid search with AdWords. You can type in any word and find out what your CTR is for that word.

Another challenge preventing CTR from being a bigger deal today is Google Analytics, which hasn’t provided keywords to us for years. You can figure out the number of impressions and clicks for your top 1,000 pages, but the limited query data (1 percent of total) is a killer. It might be easy to see your CTR data, but it’s hard to know whether what you can see is good or not.

Also, many people just don’t realize how much leverage there is in increasing CTR. Donkey headlines (bottom 10%) tend to do around 3x worse than average, whereas unicorn headlines (top 10%) tend to do around 2x better than average. Converting donkeys into unicorns, then, could increase clicks to your site by as much as 5x.

And one final important point (and yet another reason to kill your donkeys!): low CTRs typically also lead to low conversion rates — this is true for both organic and paid search. You can easily test this out yourself by analyzing your own website data.

Search Query Data for Organic SEO

Conversion Rate vs. CTR for one of my customers.

Introducing Larry’s High CTR Pyramid Scheme

Let’s look at a graph that shows the click-through rate by rank for my 1,000 keywords obtained through Google Search Console:

CTR vs. Ranking

The blue curve shows the CTRs on average for any given spot for all keywords. But that’s an average. An average includes all the top performers (unicorns) as well as the worst performers (donkeys).

There is considerable variance here.

  • The top 10 percent (the unicorns) have CTRs that are more than double the average (~55 percent vs. ~27 percent in first position).
  • The bottom 10 percent (the donkeys) have organic CTRs that are roughly three times lower than average (~8 percent vs. ~27 percent in first position).
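A quick back-of-the-envelope check shows how those two gaps compound. The CTRs below are the rough first-position figures quoted above (approximate, not exact data):

```python
# Rough first-position CTRs from the bullets above (approximate figures).
donkey_ctr = 0.08    # bottom 10% of headlines
average_ctr = 0.27   # all headlines
unicorn_ctr = 0.55   # top 10% of headlines

print(round(unicorn_ctr / average_ctr, 1))  # unicorn vs. average  -> 2.0
print(round(average_ctr / donkey_ctr, 1))   # average vs. donkey   -> 3.4
print(round(unicorn_ctr / donkey_ctr, 1))   # donkey-to-unicorn uplift -> 6.9
```

In other words, turning a bottom-decile headline into a top-decile one multiplies its clicks several times over.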

This is such a great opportunity. But it’s hard to realize just how great your CTR can be.

You can increase clicks by as much as 5x or even 6x by identifying your crappiest keyword donkeys and making them into high CTR headline unicorns, rather than stupid “optimized” title tag formulas — like:

Main Keyword, Long-Tail Keyword Variation 1, Long-Tail Keyword Variation 2.

This is a title tag optimization formula from ancient times — we’re talking B.H. (Before Hummingbird). This is no longer necessary because Google is now much better at inferring query intent.

Welcome to the new world. To help you adapt, I’ve developed a repeatable SEO workflow to turn your donkeys into unicorns.

Behold! It’s Larry’s High CTR Pyramid Scheme! Here’s how it works.

Detecting your donkeys

Donkeys versus Unicorns: Image of a donkey and a unicorn.

This whole process starts by finding your underperforming content donkeys using another of my hacks — Larry’s Donkey Detection Algorithm. Download all of your query data from the Search Console or Google Analytics. Next, graph CTR vs. Average Position for the queries you rank for organically and add a trend line, like this:

Organic Search Query Data - CTR vs. Ranking

The red line here is your average click-through rate.

You want to focus ONLY on the keywords at the very bottom of your curve. You don’t want to turn any of your unicorns into donkeys. You only want to turn your donkeys into unicorns!
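As a rough sketch of that detection step, you could fit a trend line to your exported query data and flag everything sitting far below it. This is a deliberate simplification (a straight least-squares line rather than the true, nonlinear CTR-by-position curve, and the field names and threshold are my assumptions, not Search Console's exact export format):

```python
# Hypothetical "donkey detection" sketch: fit a simple trend line of
# CTR vs. average position, then flag queries far below the prediction.

def fit_trend(points):
    """Least-squares line ctr = a * position + b over (position, ctr) pairs."""
    n = len(points)
    sx = sum(p for p, _ in points)
    sy = sum(c for _, c in points)
    sxx = sum(p * p for p, _ in points)
    sxy = sum(p * c for p, c in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def find_donkeys(rows, threshold=0.5):
    """Return queries whose CTR is below `threshold` x the trend prediction."""
    points = [(r["position"], r["ctr"]) for r in rows]
    a, b = fit_trend(points)
    return [r["query"] for r in rows if r["ctr"] < threshold * (a * r["position"] + b)]

rows = [
    {"query": "mustache wax",        "position": 1, "ctr": 0.30},
    {"query": "beard oil",           "position": 2, "ctr": 0.20},
    {"query": "beard care products", "position": 3, "ctr": 0.02},  # underperformer
    {"query": "pomade",              "position": 4, "ctr": 0.10},
]
print(find_donkeys(rows))  # -> ['beard care products']
```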

Now you can sort by secondary metrics, such as conversion rates, if that’s what you care most about. Which of those donkeys have the highest conversion rates? Focus on these first because when you’re able to turn that page into a traffic unicorn, it will also convert more!

If you care most about engagement, then you can filter by that metric. If you can improve the CTR of this page, then you can be reasonably confident that more people will engage with your content.

Your content is a diamond in the rough — or a great book with a terrible cover. Now is the time to polish your diamond and help it become exceptional.

Warning: Don’t go crazy reoptimizing your title

"I'm a unicorn": Screenshot of Ralph with an ice cream cone on his forehead from The Simpsons TV show.

Image courtesy of Fox

This is important. You shouldn’t change the title tag over and over every week because this will cause problems in your quest for a magical cure to your donkey blues.

For one, Google will think your title is being dynamically populated. For another, you’re just guessing, which is probably why you have this CTR issue.

Also, multiple changes will make it hard to get a good reading on why the CTR changed. Is it due to the title tag change or is it something else entirely (a SERP change, a competitor change, seasonality, etc.)? If you keep changing it, you won’t have enough statistically significant data to make a data-driven decision.

Additionally, your ranking position could change, which would also further screw up things.

Bottom line: Don’t just go and change titles willy-nilly.

We can make a unicorn — we have the technology!

"Be a unicorn in a sea of donkeys!" A pink unicorn among dozens of gray donkeys.

To improve your organic click-through rate, you’ll need to collect some data. You can do this by creating ads on Google AdWords for no more than $50.

You’re going to create an ad pointing to the page you’re reoptimizing using 10 different headlines. The reason you need 10 headlines is so you can discover your statistical unicorn, the headline with a CTR that stands above the rest in the top 10 percent.

Think of it like a lottery where the odds of winning are 1 in 10. Your odds of winning are much greater if you buy 10 lottery tickets instead of just one, right?

You can absolutely create more headlines; 10 is just the minimum. If you really want to do this well, writing 12, 13, or 14 headlines dramatically increases the odds that you’ll find a unicorn.
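The lottery math here is easy to check. If each genuinely different headline has an independent 1-in-10 chance of being a top-10% unicorn (an idealized assumption for illustration), the odds of finding at least one grow quickly with the number of headlines you test:

```python
# Idealized model: each headline is an independent 1-in-10 "lottery ticket."

def p_at_least_one_unicorn(n_headlines, p_unicorn=0.10):
    """Chance that at least one of n independent headlines is a unicorn."""
    return 1 - (1 - p_unicorn) ** n_headlines

for n in (1, 10, 14):
    print(n, round(p_at_least_one_unicorn(n), 2))
# 1 headline  -> 0.1
# 10 headlines -> 0.65
# 14 headlines -> 0.77
```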

Don’t half-ass your new headlines

"Old Man Yells At Cloud" newspaper headline; clip from The Simpsons TV show.

Image courtesy of Fox

I can’t stress this enough: You really have to try out different headlines. It can’t be the same headline, just with insignificant little changes (e.g., commas in different places, different punctuation, upper case vs. lower case).

Pop quiz: How many headlines do you count here?

  • 1. How to Write a Book Fast
  • 2. How to Write a Book FAST
  • 3. How to Write a Book…FAST
  • 4. How To Write A Book…Fast!
  • 5. How to write a book, fast.

Did you say 5?

WRONG!

No, the answer is 1.

These aren’t different headlines. They’re just different punctuations and capitalizations.

You have to REALLY change the headlines.

Write your headlines using different personas. Who is the person speaking to the reader? Is it the bearer of bad news? The hero? The villain? The comedian? The feel-good friend?

Also change the emotional trigger in your headlines. You can use emotional drivers like amusement, surprise, happiness, hope, or excitement:

Plutchik's Wheel of Emotions: The top 10 emotional drivers.

Source: Plutchik’s Wheel of Emotions

Other emotions include anger, disgust, affirmation, and fear. All four of these can become huge winners.

Vary your headlines. Get super creative!

What keywords should you choose?

Add the keywords that you were hoping to appear for when you created the content, along with keywords you’re currently ranking for using query data from Google Analytics. Set those keywords to the broad match keyword match type.

Broad match is the default keyword match type and reaches the widest audience. It makes your ad eligible to appear whenever a user’s search query includes any word in your key phrase, in any order, along with synonyms and other close variations.

For example, if you use broad match on “luxury car,” your ad might be displayed if a user types “luxury cars,” “fast cars,” “luxury apartments,” or even “expensive vehicles,” which doesn’t include any of the terms in your keyword. Broad match will, in a way, act like RankBrain does — testing your headlines against a diverse set of queries, including related terms.

It’s a perfect keyword sample set.

10 awesome tips to help you write outstanding headlines

Ultimately, you want to think about three things when writing your ads: your target customer; the persona you want to use to speak to them; and what emotionally-charged words you can use to incite action.

Steve Rayson of BuzzSumo recently shared some great research on the five elements of viral headlines. Here’s what your headlines need to have:

  • Emotional Hook: This could be a certain emotional word or superlative — words like: amazing, unbelievable, shocking, disgusting, or inspiring.
  • Content Type: This tells the reader exactly what your content is — is your content images, quotes, pictures, or facts?
  • Topic: Think of this as your keyword — it could be something evergreen like “content marketing” or more news-oriented like a Google algorithm update or SERP test.
  • Format: This sets the expectation of the format your content will be in, whether it’s a listicle, quiz, ebook, or something else.
  • Promise Element: The reader benefit — tell the reader why your content will solve a problem, make them smarter or better at something, or that it provides vital information they need to know.

Here are five additional tips:

  • Choose your words wisely: Go either extremely positive (e.g., best, greatest, biggest) or negative (e.g., stop, avoid, don’t) with your headline word choices.
  • Be specific: Make it clear to the reader what your content is about.
  • Be unique: Show some personality. Create content that nobody else is doing (or improve on what others have already done). Dare to be different from your competitors.
  • Create a sense of urgency: What will the reader learn, lose, fail at, or miss out on if they don’t click right now?
  • Be useful: How does clicking on your content benefit the reader?

So let’s go back to our earlier headline example, How to Write a Book Fast. Based on this advice, what are some new headlines you could test? How about:

  • Write Your Book Fast: X Trusted Time-Saving Tips
  • X Surprising Tricks Nobody Told You About Writing Books Fast
  • How to Finish Writing Your Book 5x Faster
  • Write Fast Right Now: What Published Authors Don’t Want You to Know
  • X Ridiculously Easy Steps to Write Your Book Faster
  • What’s the Secret of Writing Great Books Fast?
  • X Inspiring Tips That Will Help You Write Your Book Faster
  • This Unusual Book Writing Technique Will Make You Write Faster
  • Your Book is Doomed: How I Write Way Faster Than You

Which one of these do you think would win our ad test? The answer may just surprise all of us.

How would you reoptimize this headline based on this advice? I’d love to see your ideas in the comments.

Where to run your ad

By now you may be saying, “Larry this is great, but I’m a little worried about how much this all will cost. Any suggestions to keep costs down?”

YES!

We’re just targeting English speakers. So you can save money by taking advantage of countries with lower CPCs.

Heat map of average cost per click around the world.

Rather than running ads in New York City, where CPCs would likely be very expensive, maybe you could set up your ads to appear only in Canada (which has 29 percent lower CPCs on average than the U.S.) or in Ireland (which has 40 percent lower CPCs on average).

Prepare your Unicorn Detector

Make sure to set your ads to rotate evenly. You want to ensure that all 10–14 of your ads have a chance to run.

Before analyzing your results, you’ll want at least 200 impressions per ad. This is actually the number of impressions Google AdWords uses before ascertaining a quality score, but more is better.

Also, you should bid to a specific position (e.g., bid to position 3, 4, or 5) using flexible bid strategies. That way you don’t have to compare CTRs where one ad had a CTR of 20% in position 1 but a 2% CTR in position 8.

Now you can analyze your results and see which headline had the best CTR. Pretty easy, huh?

Usually one of your 10 ads will be a unicorn. However, if all the CTRs turn out the same (e.g., 2% vs. 2.1%) throw them all out and try out more headlines.
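Here's a hypothetical sketch of that analysis step (the thresholds and field names are my assumptions, not AdWords output): require a minimum impression count per ad, then look for one headline whose CTR clearly stands apart from the rest.

```python
# Hypothetical "unicorn detector": keep ads with enough impressions, then
# flag a headline whose CTR is a clear outlier vs. the other ads' median.

def pick_unicorn(ads, min_impressions=200, outlier_factor=2.0):
    """Return the headline whose CTR is at least `outlier_factor` times the
    median CTR of the other eligible ads, or None if there's no clear winner."""
    eligible = [a for a in ads if a["impressions"] >= min_impressions]
    if len(eligible) < 2:
        return None
    for ad in eligible:
        others = sorted(o["clicks"] / o["impressions"] for o in eligible if o is not ad)
        median = others[len(others) // 2]
        if ad["clicks"] / ad["impressions"] >= outlier_factor * median:
            return ad["headline"]
    return None  # CTRs all look alike: throw them out and write new headlines

ads = [
    {"headline": "How to Write a Book Fast",                        "impressions": 250, "clicks": 5},
    {"headline": "Write Your Book Fast: 7 Trusted Time-Saving Tips", "impressions": 300, "clicks": 6},
    {"headline": "How to Finish Writing Your Book 5x Faster",        "impressions": 220, "clicks": 11},
    {"headline": "Write fast right now",                             "impressions": 150, "clicks": 9},  # too few impressions
]
print(pick_unicorn(ads))  # -> 'How to Finish Writing Your Book 5x Faster'
```

The 5% CTR ad stands out against the two 2% ads, while the under-sampled ad is ignored entirely.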

"Looks like our unicorn is just a donkey with a plunger stuck to its face." Quote from Dr. Gregory House, House MD.

Your goal is to find an outlier, a headline that generates 2x, 3x, or 4x higher CTR than the rest.

Did it work?

Now we’ve reached the end. We’ve identified the donkeys. We have a workflow for auditioning new possible headlines. And we’ve identified the winning headlines. Now what?

You just swap them out. Replace your donkey title with the winning unicorn headline from your PPC ad test, and put it live.

To determine whether you’ve succeeded, track the number of clicks to the page to ensure that your CTR has indeed increased.

This is a ridiculously easy, low-risk, high-return strategy with a high probability of success, because the new headline is battle-tested and should do just as well organically.

Conclusion: Say no to low CTR

Abraham Lincoln riding a unicorn through outer space.

Guys, this is crazy. First of all, think about all the SEO tasks you have to do. None of that is easy. It’s all manual work.

Just take link building as one example. You’re hoping for other people to link to you to help you rank better. In the end it’s very much a hit-or-miss approach to SEO because you have no control over whether you actually get the link (or if it will even help).

Also, link building is more of an art, and one that some people just don’t have the skills to do properly. Plus, when done poorly, bad link building can kill your rankings.

Here, the workflow — my High CTR Pyramid Scheme — is all within your own control. This is more like on-page SEO, changing titles and text, but in a more methodical, data-driven way.

Optimizing for CTR is very leveraged. You can 5x your CTR if you’re successful in turning a donkey into a unicorn. There are even more bonus points because it should result in even better rankings, which should result in even more clicks. And your conversion rates will improve.

I personally believe that CTR is calculated both at a query/page and at the domain level (like domain and page authority in link building). Since we can’t have CTR data for every possible page/query, it makes sense to have something to fall back on. So by killing off your CTR donkeys, you’re improving your domain CTR score, which should help rankings of all the other pages on your site.

There’s a famous Abraham Lincoln quote: “Give me six hours to chop down a tree and I will spend the first four sharpening the axe.”

Well, if I had one hour to spend on SEO, I would spend that one hour finding and fixing my donkey headlines, turning them into unicorn headlines. Hour for hour, I’m convinced you have a really great return here.

Your odds of winning the organic CTR lottery are 1 in 10. So go buy 10 lottery tickets!

Are you optimizing for CTR? If not, why?


from The Moz Blog http://tracking.feedpress.it/link/9375/3228603
via Auto Feed

The Start-to-Finish Guide to Optimizing Your WordPress Blog Posts [Plus a Checklist]

Posted by sergeystefoglo

WordPress is the most popular content management system (CMS) in the world. There’s a good chance you’ll need to optimize or work on a website that uses WordPress, if you haven’t already! Whether you’re a business owner, designer, developer, PPC expert, SEO consultant, or writer — getting familiar with WordPress is a smart move.

When I started out in SEO, I worked with local businesses that hired smaller firms to design or develop their sites. Naturally, most people gravitated towards WordPress as their CMS of choice: it was easy to customize, even easier to maintain, simple to use, and did the job well.

It wasn’t until I started working with websites that were using Joomla or Drupal that I began to appreciate the simplicity and flexibility that WordPress offers. Don’t get me wrong, Joomla and Drupal are both great, but they require a lot more setup and learning beforehand (especially if your goal is to optimize the site for organic search).

What this post is about

This post is going to walk through the process of uploading and optimizing a blog post using WordPress and Yoast SEO. I’ll go into detail on both of these topics and provide you with a downloadable checklist that you can give to your team or use yourself.

Before we get started

Yoast SEO

While it’s true that there are a variety of SEO plugins available for WordPress, I prefer Yoast SEO and will be referencing it as an essential plugin for this post. If you don’t currently have Yoast installed, you can visit their website to download it or simply search for “Yoast SEO” in WordPress and install it directly.

Pages and posts

WordPress has two basic sections for uploading content. There are pages (which are defined as landing pages on your website), and there are posts (which are essentially blog posts). One could argue that this article could be used as a guide to uploading and optimizing landing pages on WordPress, but I believe there’s a different approach for that and therefore will keep the focus of this article around posts.

Uploading your blog post

Before you get to optimizing your blog posts for organic search, you need to get them live on your site. If you’re familiar with how posting a blog works on WordPress, feel free to skip ahead to the optimization section of this article.

1. After logging into your site, hover over “Posts” and then click on “Add New.”

2. Copy and paste the title of your post where it says “Enter title here,” then paste the body text of your post in the section below (don’t copy over images yet).

Pro Tip: I personally write all of my blog posts in a separate program (like Word or Ulysses) and then copy over the text into WordPress when I’m ready to post it. You can definitely write your blog within WordPress and save it as a draft if you aren’t ready to publish it, but if you like having a local copy of your writing I’d recommend simply writing it in a different program.

Pro Tip: You can alternate between the “visual” and “text” editor here. If you’re familiar with HTML, I’d recommend “text,” as you can spot any potential errors in the code and have more control. If not, the “visual” editor works perfectly fine.

Pro Tip: If you have links in your post (which you should), double-check that they were added correctly. If not, you can add a link using the WYSIWYG editor. In general, try to have at least three relevant internal links in each of your posts. Don’t be afraid of adding external links, either! The important thing to remember is that if the reader will find it useful, it’s okay to add it.

3. If you have images, place your cursor where you want the image. Click on “Add Media” and select “Upload Files.” After choosing your preferred settings, click “insert into post” to add your image in your article.

Note: There are various settings and options for sizing and aligning images. Please see this write-up for a more detailed explanation of how images and featured images work in WordPress.

Pro Tip: It’s always a good idea to compress your images before uploading them so they don’t cause long load times. Here’s a great guide to compressing your images.

4. Scroll down a bit and you should see the “Categories” section on the right side of your screen. You don’t have to categorize your post (unless your site is organized by categories), but you can add one if you wish. If you do, WordPress will create category pages that pull in posts within that category. Here’s a great write-up on how WordPress utilizes category pages and what you should consider from an SEO perspective.

5. Under the “Categories” section, you’ll see the tags section. Similar to categories, you don’t have to use tags. In fact, I would argue that you should always noindex the tag pages that WordPress auto-generates, as they often cause duplication issues. Nonetheless, you can add tags to your post here.
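For illustration (the tag slug here is hypothetical), noindexing a tag archive, whether through Yoast’s settings or at the template level, comes down to a robots meta tag being rendered in the archive page’s head:

```html
<!-- Rendered in the <head> of an auto-generated archive such as /tag/vacuums/ -->
<!-- "noindex, follow" keeps the archive out of search results while still -->
<!-- letting crawlers follow its links to the underlying posts -->
<meta name="robots" content="noindex, follow">
```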

6. If you scroll down further you’ll see an “Author” section, where you can choose the author of your blog post.

7. Scroll back up and find the section that’s called “Publish.” Here you can choose “Preview” to make sure everything looks right in your post before optimizing/uploading it. If something doesn’t look the way you want it to, just edit that section.

8. If you want a snippet of your post to appear on your blog homepage instead of the entire thing, simply place your cursor where you want the break to be and click on the “Insert Read More tag” button. Read this post that explains the “Read More” tag and its function in WordPress.
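If you’re working in the text editor, the Read More break is simply an HTML comment that WordPress recognizes; a post body with a teaser break looks like this:

```html
<p>This teaser paragraph appears on the blog homepage and in archives.</p>
<!--more-->
<p>Everything below the tag appears only on the full single-post page.</p>
```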

This should get you to a point where you’re ready to optimize your blog — let’s focus on this next.

Optimizing your blog post

Getting down the foundational elements of uploading a blog post on WordPress is crucial, but we are marketers, aren’t we? This section breaks down what you (or your team) should be doing to optimize a post on WordPress as effectively as possible. My goal with the checklist at the bottom of this article is to give you and your team something to reference when uploading posts. Pretty soon it’ll become second nature!

1. Assuming you’re still on the “Edit Post” page, scroll down until you see a section titled “Yoast SEO.”

Pro Tip: If you don’t see this section, make sure you have the correct plugin installed. If you do and still don’t see this section, scroll up to the very top right of the screen and click on “Screen Options.” From here, make sure that “WordPress SEO by Yoast” is checked.

2. Click on “Edit Snippet” in the Yoast SEO section. The “SEO title” box will be where you input your title tag.

Pro Tip: In general, you want to include your main keyword first followed by your brand name or website name. Also, make sure that you stay within 40–65 characters here.
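As a hypothetical example (the keyword and brand name are made up), the title tag for a post targeting “central vacuum systems” might render like this:

```html
<!-- Main keyword first, brand name last; the whole string stays within 40–65 characters -->
<title>Central Vacuum Systems: A Buyer's Guide | BrandName</title>
```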

3. You guessed it — the “Meta description” box is where you’ll input your meta description.

Pro Tip: Although not necessary, including your main keyword in the meta description can be a great idea if it flows well with your content. Google has explicitly stated that meta descriptions aren’t a ranking factor, but that doesn’t mean a keyword won’t help users decide to click on your post. Because of this, make your meta description as enticing as possible to a potential user. Why should they click on your blog post instead of the other options available in the SERP? Also, as a general rule, stay within 70–156 characters here.
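Continuing the hypothetical example (the copy is invented; the point is the keyword plus a reason to click, kept within the character range), the rendered meta description might look like this:

```html
<meta name="description" content="Shopping for a central vacuum system? Compare top models on price, power, and noise so you can buy with confidence.">
```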

4. A new addition to Yoast SEO (although not WordPress), the “Slug” section allows you to edit the URL of your post. By default, WordPress will add the title of your post to the URL (which isn’t a bad way to go), but if you want to alter it this is where you can.

Pro Tip: There are “standard practice” tips for URL optimization that don’t necessarily affect your rankings, but do make it obvious to users and search engines what your post is about. These include keeping your URL short, including a keyword if possible, and making the URL descriptive of the post’s topic. Here is a great write-up from Rand on URL optimization.

5. If you click on the gear icon tab within the Yoast SEO section, you’ll notice options for things like meta robots and the canonical URL. In most cases, these settings will already be set on a global scale; however, you can override your global settings for specific posts here.
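As a sketch (the URL is hypothetical), overriding those settings for a single post produces tags like these in the post’s head:

```html
<!-- A self-referencing canonical URL set explicitly for this post -->
<link rel="canonical" href="https://www.example.com/blog/central-vacuum-guide/">
<!-- A per-post robots override, e.g. for a thin or gated post -->
<meta name="robots" content="noindex, follow">
```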

6. If you click on the “Share” icon, you can override the default metadata (titles, images, etc.) that Facebook and Twitter will pull for your post. In general, you can leave these blank. However, if you have a good reason to override them (testing different images, optimizing for various target audiences, etc.) this is where you can.
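Behind the scenes, those overrides end up as Open Graph and Twitter Card meta tags. A sketch with invented values:

```html
<!-- Facebook (Open Graph) overrides -->
<meta property="og:title" content="A Title Tuned for Facebook Sharing">
<meta property="og:image" content="https://www.example.com/images/social-card.jpg">
<!-- Twitter Card overrides -->
<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:title" content="A Shorter Title Tuned for Twitter">
```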

7. We’ve covered a lot of important on-page elements so far, but one we haven’t covered is the <h1> tag. This tag is crucial for telling search engines what your page is about. In most cases, your title will automatically be an <h1> tag.

Pro Tip: I see a lot of sites that have multiple <h1> tags on a page, as well as many sites that have duplicate <h1> tags across the site. Oftentimes, the logo or phone number ends up wrapped in an <h1> tag. Double-check that you have exactly one <h1> tag on every page, and make sure these tags are all unique.
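A quick sketch of the difference (the markup is illustrative):

```html
<!-- Good: exactly one <h1>, unique to this page and describing its content -->
<h1>How to Choose a Central Vacuum System</h1>

<!-- Avoid: wrapping the logo in a second <h1> -->
<a href="/"><img src="/logo.png" alt="BrandName logo"></a>
```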

8. Adding alt tags to images is fairly simple with WordPress. There are various ways to do this, but it all comes down to whether you’re using the visual editor or the text editor.

Visual: Click on the image you want to add alt text to, and click on the “Edit” icon. Add your alt text in the “Alternative Text” field. Make sure to click on “Update” after.

Text: Simply add the alt="" attribute inside the image tag. It should look something like this:

<img src="http://www.domain.com/images/1" alt="keyword goes here">

In general, alt tags should describe the photo. So, if I was writing a blog post about central vacuum systems and I had an image of a man using a central vacuum system, the ideal alt tag would be “Man Using Central Vacuum System” or “Man Cleaning With Central Vacuum System.”

9. It’s important to take a look at your internal links within your post. Are they topically relevant? Try to include at least 3–4 links that point to your internal pages and don’t be scared to throw in good external links as well.

10. Does your post have a clear CTA? Oftentimes this can be a “Read more posts like this” callout or a “Sign up for our newsletter” button; however, it could also look like a “buy now” CTA for sites that write about products.

11. After following the above steps, take a second glance at everything before hitting “publish.” If you publish your post and realize that something doesn’t look right later on, just head back to the editor, make your changes, and click “update.”

Extras

Optimization checklist

As promised, please download and distribute this checklist as you please. My hope is that after going through it multiple times, posting and optimizing your blog posts on WordPress will come as second nature to you (or your team).

I want the checklist!

3 more essential WordPress plugins for marketers

  1. Broken Link Checker – An essential plugin that monitors all of the links on your site and reports on any that are broken. Easily one of the simplest yet most helpful plugins out there.
  2. W3 Total Cache – This plugin helps increase the speed of your site through caching and code minification. Highly recommended!
  3. Gravity Forms – While there are some decent options for contact form plugins on WordPress, Gravity Forms beats them all because of its customization options, continued plugin support, and add-ons.

If you’re interested, I wrote an all-around guide to using Yoast SEO on the Distilled blog earlier this year. Also, please visit the good people at Yoast, as their blog is full of great advice and tutorials.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from The Moz Blog http://tracking.feedpress.it/link/9375/3219311

8 Old School SEO Practices That Are No Longer Effective – Whiteboard Friday

Posted by randfish

[Estimated read time: 14 minutes]

Are you guilty of living in the past? Using methods that were once tried-and-true can be alluring, but it can also prove dangerous to your search strategy. In today’s Whiteboard Friday, Rand spells out eight old school SEO practices that you should ditch in favor of more effective and modern alternatives.

8 Old School SEO Practices That Are No Longer Effective Whiteboard

Click on the whiteboard image above to open a high resolution version in a new tab!

Video Transcription

Howdy, Moz fans, and welcome to another edition of Whiteboard Friday. This week we’re going to chat about some old school SEO practices that just don’t work anymore and things with which we should replace them.

Let’s start with the first one — keywords before clicks.

Look, I get the appeal here. The idea is that we’ve done a bunch of keyword research, now we’re doing keyword targeting, and we can see that it might be important to target multiple keywords on the same page. So FYI, “pipe smoking,” “tobacco smoking,” “very dangerous for your health,” not recommended by me or by Moz, but I thought it was a funny throwback keyword and so there you go. I do enjoy little implements even if I never use them.

So pipes, tobacco pipes, pipe smoking, wooden pipes, this is not going to draw anyone’s click. You might think, “But it’s good SEO, Rand. It’s good to have all my keywords in my title element. I know that’s an important part of SEO.” Not anymore. It really is not anymore an important . . . well, let’s put it this way. It’s an important part of SEO, which is subsumed by wanting to draw the clicks. The user is searching, they’re looking at the page, and what are they going to think when they see pipes, tobacco pipes, pipe smoking, wooden pipes? They have associations with that — spammy, sketchy, I don’t want to click it — and we know, as SEOs, that Google is using click signals to help documents rank over time and to help websites rank over time.

So if they’re judging this, you’re going to fall in the rankings, versus a title like “Art of Piping: Studying Wooden Pipes for Every Price Range.” Now, you’re not just playing off the, “Yes, I am including some keywords in there. I have ‘wooden’ and ‘pipes.’ I have ‘art of piping,’ which is maybe my brand name.” But I’m worried more about drawing the click, which is why I’m making this part of my message of “for every price range.” I’m using the word “stunning” to draw people in. I’m saying, “Our collection is not the largest but the hand-selected best. You’ll find unique pipes available nowhere else and always free, fast shipping.”

I’m essentially trying to create a message, like I would for an AdWords ad, that is less focused on just having the raw keywords in there and more focused on drawing the click. This is a far more effective approach that we’ve seen over the last few years. It’s probably been a good six or seven years that this has been vastly superior to this other approach.

Second one, heavy use of anchor text on internal links.

This used to be a practice that could have positive impacts on rankings. But what we’ve seen lately, especially the last few years, is that Google has discounted this and has actually even punished it where they feel like it’s inappropriate or spammy, manipulative, overdone. We talked about this a little in our internal and external linking Whiteboard Friday a couple of weeks back.

In this case, my suggestion would be if the internal link is in the navigation, if it’s in the footer, if it’s in a sidebar, if it’s inside content, and it is relevant and well-written and it flows well, has high usability, you’re pretty safe. However, if it has low usability, if it looks sketchy or funny, if you’re making the font small so as to hide it because it’s really for search engines and not for searchers and users, now you’re in a sketchy place. You might count on being discounted, penalized, or hurt at some point by Google.

Number three, pages for every keyword variant.

This is an SEO tactic that many folks are still pursuing today and that had been effective for a very long time. So the idea was basically if I have any variation of a keyword, I want a single page to target that because keyword targeting is such a precise art and technical science that I want to have the maximum capacity to target each keyword individually, even if it’s only slightly different from another one. This still worked even up to four or five years ago, and in some cases, people were sacrificing usability because they saw it still worked.

Nowadays, Google has gotten so smart with upgrades like Hummingbird, obviously with RankBrain last year, that they’ve taken to a much more intent- and topic-matching model. So we don’t want to do something like have four different pages, like unique hand-carved pipes, hand-carved pipes, hand-carved tobacco pipes, and hand-carved tobacco smoking pipes. By the way, these are all real searches that you’ll find in Google Suggest or AdWords. But rather than taking all of these and having a separate page for each, I want one page targeting all of them. I might try and fit these keywords intelligently into the content, the headline, maybe the title, the meta description, those kinds of things. I’m sure I can find a good combination of these. But the intent for each of these searchers is the same, so I only want one page targeting them.

Number four — directories, paid links, etc.

Every single one of these link building, link acquisition techniques that I’m about to mention has either been directly penalized by Google or penalized as part of an update, or we’ve seen sites get hit hard for doing it. This is dangerous stuff, and you want to stay away from all of these at this point.

Directories, well, generic directories and SEO directories for sure. Article links, especially article blasts where you can push an article in and there’s no editorial review. Guest content, depending on the editorial practices, the bar might be a little different. Press releases, you saw Google penalize some press release websites. Well, it didn’t penalize the press release websites themselves. Google said, “You know what? Your links don’t count anymore, or we’re going to discount them. We’re not going to treat them the same.”

Comment links, for obvious reasons. Reciprocal link pages, those got penalized many years ago. Article spinners. Private link networks. You see “private” and “network,” or you see “network,” you should just generally run away. Private blog networks. Paid link networks. Fiverr or forum link buys.

You see advertised on all sorts of SEO forums, especially the more aggressive, sketchy ones, that a lot of folks are like, “Hey, for $99, we have this amazing package, and I’ll show you all the people whose rankings it’s increased, and they come from PageRank six,” never mind that PageRank is totally defunct. Or worse, they use Moz. They’ll say like, “Domain authority 60-plus websites.” You know what, Moz is not perfect. Domain authority is not a perfect representation of the value you’re going to get from these things. Anyone who’s selling you links on a forum, you should be super skeptical. That’s somewhat like someone going up to your house and being like, “Hey, I got this Ferrari in the yard here. You want to buy this?” That’s my Jersey coming out.

Social link buys, anything like this, just say no people.

Number five, multiple microsites, separate domains, or separate domains with the same audience or topic target.

So this again used to be a very common SEO practice, where folks would say, “Hey, I’m going to split these up because I can get very micro targeted with my individual websites.” They were often keyword-rich domain names like woodenpipes.com, and I’ve got handmadepipes.net, and I’ve got pipesofmexico.co versus I just have artofpiping.com, not that “piping” is necessarily the right word. Then it includes all of the content from all of these. The benefit here is that this is going to gain domain authority much faster and much better, and in a far greater fashion than any of these will.

Let’s say that it was possible that there is no bias against the exact match domain names folks. We’re happy to link to them, and you had just as much success branding each of these and earning links to each of these, and doing content marketing on each of these as you did on this one. But you split up your efforts a third, a third, a third. Guess what would happen? These would rank about a third as well as all the content would on here, which means the content on handmadepipes.net is not benefitting from the links and content on woodenpipes.com, and that sucks. You want to combine your efforts into one domain if you possibly can. This is one of the reasons we also recommend against subdomains and microsites, because putting all of your efforts into one place has the best shot at earning you the most rankings for all of the content you create.

Number six, exact and partial keyword match domain names in general.

It’s the case like if I’m a consumer and I’m looking at domain names like woodenpipes.com, handmadepipes.net, uniquepipes.shop, hand-carved-pipes.co, the problem is that over time, over the last 15, 20 years of the Web, those types of domain names that don’t sound like real brands, that are not in our memories and don’t have positive associations with them, they’re going to draw clicks away from you and towards your competitors who sound more credible, more competent, and more branded. For that reason alone, you should avoid them.

It’s also that case that we’ve seen that these types of domains do much more poorly with link earning, with content marketing, with being able to have guest content accepted. People don’t trust it. The same is true for public relations and getting press mentions. The press doesn’t trust sites like these.

For those reasons, it’s just a barrier. Even if you thought, “Hey, there’s still keyword benefits to these,” which there is a little bit because the anchor text that comes with them, that points to the site always includes the words and phrases you’re going after. So there’s a little bit of benefit, but it’s far overwhelmed by the really frustrating speed bumps and roadblocks that you face when you have a domain like this.

Number seven — Using CPC or AdWords “Competition” to determine the difficulty of ranking in organic or non-paid results

A lot of folks, when they’re doing keyword research, for some reason still have this idea that cost per click or AdWords competition scores can help determine the difficulty of ranking in organic, non-paid results. This is totally wrong.

So see right here, I’ve got “hand-carved pipes” and “unique wooden pipes,” and they have an AdWords CPC respectively of $3.80 and $5.50, and they have AdWords competition of medium and medium. That is in no way correlated necessarily with how difficult they’ll be to rank for in the organic results. I could find, for example, that “unique wooden pipes” is actually easier or harder than “hand-carved pipes” to rank for in the organic SEO results. This really depends on: Who’s in the competition set? What types of links do they have and social mentions do they have? How robust is their content? How much are they exciting visitors and drawing them in and serving them well? That sort of stuff is really hard to calculate here.

I like the keyword difficulty score that Moz uses. Some other tools have their own versions. Doctor Pete, I think, did a wonderful job of putting together a keyword difficulty score that’s relatively comprehensive and well-thought through, uses a lot of the metrics about the domain and the page authority scores, and it compensates for a lot of other things, to look at a set of search results and say, “This is probably about how hard it’s going to be,” and whether it’s harder or easier than some other keyword.

Number eight — Unfocused, non-strategic “linkbait”

Last one, some folks are still engaging in this, I think because content strategy, content marketing, and content as a whole has become a very hot topic and a point of investment. Many SEOs still invest in what I call “nonstrategic and unfocused link bait.” The idea being if I can draw links to my website, it doesn’t really matter if the content doesn’t make people very happy or if it doesn’t match and gel well with what’s on my site. So you see a lot of these types of practices on sites that have nothing to do with it. Like, “Here are seven actors who one time wore too little clothing.” That’s an extreme example, but you get the idea if you ever look at the bottom ads for a lot of content stuff. It feels like pretty much all of them say that.

Versus on topic link bait or what I’d call high quality content that is likely to draw in links and attention, and create a positive branding association like, “Here’s the popularity of pipes, cigarettes, electronic cigarettes, and cigars in the U.S. from 1950 to today.” We’ve got the data over time and we’ve mapped that out. This is likely to earn a lot of links, press attention. People would check it out. They’d go, “Oh, when was it that electronic cigarettes started getting popular? Have pipes really fallen off? It feels like no one uses them anymore. I don’t see them in public. When was that? Why was that? Can I go over time and see that dataset?” It’s fundamentally interesting, and data journalism is, obviously, very hot right now.

So with these eight, hopefully you’ll be able to switch from some old school SEO techniques that don’t work so well to some new ways of thinking that will take your SEO results to a great place. And with that, we’ll see you again next week for another edition of Whiteboard Friday. Take care.

Video transcription by Speechpad.com


from The Moz Blog http://tracking.feedpress.it/link/9375/3195778

How to achieve face-melting content marketing ROI

Jason Miller knows a thing or two about content. He’s the Senior Content Marketing Manager at LinkedIn, and he presented a session titled “How to Achieve Face-Melting Content Marketing ROI” at ClickZ Live NY last week.

With that title, the session immediately piqued my interest and Jason did not disappoint.

In case you were wondering, face melting is:

“The condition in which, due to an extreme exposure to an event of epic awesomeness, horror or any other emotion on the more extreme end of the spectrum of emotions, one loses all perception of space and time including (but not limited to) a brief lapse in physical awareness. Such an emotional rush can even override Pain, which in some cases may be the cause of the rush.”

Source: The Urban Dictionary

To put this in context, you don’t need more content; you need more EPIC and AWESOME content – aka more relevant content.

According to Jason, in a recent survey, 44% of overall respondents said they would consider ending a brand relationship because of irrelevant promotions. An additional 22% said they would definitely defect from the brand.

Rage against irrelevance

Developing relevant content doesn’t need to be a difficult exercise. It doesn’t require any special tools or secret sauce. It all begins with having empathy with your prospects and customers. The formula looks like this:

Useful x Enjoyable x Inspired = Innovative Content ~ Ann Handley

The process begins with the creation of a piece of “Big Rock Content.” Not a 2,000-word evergreen piece, but something closer to a 50- or 100-page ebook. Something Awesome. Something Epic. Something like The Sophisticated Marketer’s Guide to LinkedIn.

Big, thick, juicy content is the gift that keeps on giving. A single piece of Big Rock content can be repurposed to attract links, generate traffic and build brand awareness for a year or more. Jason suggests thinking of it as something akin to Leftover Turkey.

leftover-turkey

If distracted by the turkey, this may be a better visual for you:

repurpose

Once your content is published, blast the news EVERYWHERE: Company pages, email, blog, sponsored updates, Display ads, SlideShare, PPC, Twitter, etc. Use turkey slices to fuel your content hubs.

Execution

It’s easy to develop a set of goals, but a plan is specific, time-phased, and measurable. After determining what constitutes your Big Rock and turkey slices, Jason gave an example of a five-week rollout:

  • Week 1: Publish Big Rock Content, Influencer Outreach, Sponsored Updates
  • Week 2: Big Rock Webinar, Influencer Outreach, Sponsored Updates
  • Week 3: Big Rock Webinar, Influencer Outreach, Sponsored Updates, Turkey Slice 1, Turkey Slice 2
  • Weeks 4 & 5: Big Rock Webinar, Influencer Outreach, Sponsored Updates, Turkey Slice 1, Turkey Slice 2, Turkey Slice 3, Turkey Slice 4

Your blog ties it all together

Sticking with his food analogy, Jason developed some Blogging Food Groups:

Blogging Food Groups

To be served up on the following schedule (with the associated time commitment):

  • Monday: Vegetables (35% time spent in development)
  • Tuesday: Meats (20% time spent in development)
  • Wednesday: Whole wheat & grains (25% time spent in development)
  • Thursday: Condiments (5% time spent in development)
  • Friday: Desserts (15% time spent in development)

The marketing team of the future

Jason may like his food, but he really loves Kiss.

He used the band as an analogy of how digital marketing symmetry works:

  • SEO – Lays the groundwork
  • Social – Fuels the content
  • Content – Fuels the demand

In the case of the band:

  • They consistently deliver content that their fans want to consume and share
  • Their PR efforts reinforce their image as one of the hottest bands in the world
  • They deliver amazing experiences on tour (Event Marketing)
  • They built a thriving community

The Takeaway

Big Rock content isn’t something that would be nice to have. It’s something that you need. As Hummingbird, RankBrain, and other algorithms get better, you need to become a smarter marketer. Following this approach to content marketing is sure to give you an edge over most competitors.

To learn more about the changing face of digital marketing, come to our two-day Shift London event in May.

Chuck Price is the founder of Measurable SEO and contributor to Search Engine Watch.

from Search Engine Watch https://searchenginewatch.com/2016/04/28/how-to-achieve-face-melting-content-marketing-roi/

The Local SEO Agency’s Complete Guide to Client Discovery and Onboarding

Posted by MiriamEllis

Why proper onboarding matters

Imagine getting three months in on a Local SEO contract before realizing that your client’s storefront is really his cousin’s garage. From which he runs two other “legit” businesses he never mentioned. Or that he neglected to mention the reviews he bought last year. Worse yet, he doesn’t even know that buying reviews is a bad thing.

The story is equally bad if you’re diligently working to build quality unique content around a Chicago client’s business in Wicker Park but then realize their address (and customer base) is actually in neighboring Avondale.

What you don’t know will hurt you. And your clients.

A hallmark of the professional Local SEO department or agency is its dedication to getting off on the right foot with a new client by getting their data beautifully documented for the whole team from the start. At various times throughout the life of the contract, your teammates and staff from complementary departments will be needing to access different aspects of a client’s core NAP, known challenges, company history, and goals.

Having this information clearly recorded in shareable media is the key to both organization and collaboration, as well as being the best preventative measure against costly data-oriented mistakes. Clear and consistent data play vital roles in Local SEO. Information must not only be gathered, but carefully verified with the client.

This article will offer you a working Client Discovery Questionnaire, an Initial Discovery Phone Call Script, and a useful Location Data Spreadsheet that will be easy for any customer to fill out and for you to then use to get those listings up to date. You’re about to take your client discovery process to awesome new heights!

Why agencies don’t always get onboarding right

Lack of a clearly delineated, step-by-step onboarding process increases the potential for human error. Your agency’s Local SEO manager may be having allergies on Monday and simply forget to ask your new client if they have more than one website, if they’ve ever purchased reviews, or if they have direct access to their Google My Business listings. Or they could have that information and forget to share it when they jump to a new agency.

The outcomes of disorganized onboarding can range from minor hassles to disastrous mistakes.

Minor hassles would include having to make a number of follow-up phone calls to fill in holes in a spreadsheet that could have been taken care of in a single outreach. It’s inconvenient for all teammates when they have to scramble for missing data that should have been available at the outset of the project.

Disastrous mistakes can stem from a failure to fully gauge the details and scope of a client’s holdings. Suddenly, a medium-sized project can take on gigantic proportions when the agency learns that the client actually has 10 mini-sites with duplicate content on them, or 10 duplicate GMB listings, or a series of call tracking numbers around the web.

It’s extremely disheartening to discover a mountain of work you didn’t realize would need to be undertaken, and the agency can end up having to put in extra uncompensated time or return to the client to renegotiate the contract. It also leads to client dissatisfaction.

Setting correct client expectations is completely dependent on being able to properly gauge the scope of a project, so that you can provide an appropriate timeline, quote, and projected benchmarks. In Local, that comes down to documenting core business information, identifying past and present problems, and understanding which client goals are achievable. With the right tools and effective communication, your agency will be making a very successful start to what you want to be a very successful project.

Professional client discovery made simple

There’s a lot you want to learn about a new client up front, but asking (and answering) all those questions right away can be grueling. Not to mention information fatigue, which can make your client give shorter and shorter answers when they feel like they’ve spent enough time already. Meanwhile your brain reaches max capacity and you can’t use all that valuable information because you can’t remember it.

To prevent such a disaster, we recommend dividing your Local SEO discovery process into a questionnaire to nail down the basics, a follow-up phone call to help you feel out some trickier issues, and a CSV to gather the location data. And we’ve created templates to get you started…

Client Discovery Questionnaire

Use our Local SEO Client Discovery Questionnaire to understand your client’s history, current organization, and what other consultants they might also be working with. We’ve annotated each question in the Google Doc template to help you understand what you can learn and potential pitfalls to look out for.

If you want to make collecting and preserving your clients’ answers extra easy, use Google Forms to turn that questionnaire into a form like this:

You can even personalize the graphic, questions, and workflow to suit your brand.

Client Discovery Phone Script

Once you’ve received your client’s completed questionnaire and have had time to process the responses and do any necessary due diligence (like using our Check Listings tool to check how aggregators currently display their information), it’s time to follow up on the phone. Use our annotated Local SEO Client Discovery Phone Script to get you started.

local seo client discovery phone script

No form necessary this time, because you’ll be asking the client verbally. Be sure to pay attention to the client’s tone of voice as they answer and refer to the notes under each question to see what you might be in for.

Location Data CSV

Sometimes the hardest part of Local SEO is getting all the location info letter-perfect. Make that easier by having the client input all those details into your copy of the Location Data Spreadsheet.

local seo location data csv

Then use the File menu to download that document as a CSV.

You’ll want to proof this before uploading it to any data aggregators. If you’re working with Moz Local, the next step is an easy upload of your CSV. If you’re working with other services, you can always customize your data collection spreadsheet to meet their standards.
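To make that proofing step concrete, here is a minimal sketch of an automated sanity check you could run over the exported CSV before handing it to an aggregator. The column names, the US-style phone pattern, and the `proof_location_csv` helper are all assumptions for illustration; adjust them to match your own spreadsheet template and the aggregator’s requirements.

```python
import csv
import re

# Hypothetical required columns -- adjust to match your spreadsheet
# and the data aggregator's own template.
REQUIRED_COLUMNS = ["Business Name", "Address", "City", "State", "Zip", "Phone"]

# Loose US-style phone pattern, e.g. (217) 555-0134 or 217-555-0134.
PHONE_RE = re.compile(r"^\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}$")

def proof_location_csv(path):
    """Return a list of (row_number, problem) tuples found in the CSV."""
    problems = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = [c for c in REQUIRED_COLUMNS if c not in (reader.fieldnames or [])]
        if missing:
            # The header itself is wrong; no point checking individual rows.
            return [(1, "missing columns: " + ", ".join(missing))]
        for i, row in enumerate(reader, start=2):  # row 1 is the header
            for col in REQUIRED_COLUMNS:
                if not row[col].strip():
                    problems.append((i, "empty value in '%s'" % col))
            phone = row["Phone"].strip()
            if phone and not PHONE_RE.match(phone):
                problems.append((i, "suspicious phone format: %s" % phone))
    return problems
```

A check like this won’t catch a wrong suite number, but it does catch the blank fields and malformed phone numbers that tend to slip in when a client fills out the sheet by hand, so a human only has to proof what the script can’t.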

Keep up to date on any business moves or changes in hours by designing a data update form like this one from SEER and periodically reminding your client contact to use it.

Why mutual signals of commitment really matter

There are two sides to every successful client project: one half belongs to the agency and the other to the company it serves. The attention to detail your agency displays via clean, user-friendly forms and good phone sessions will signal your professionalism and commitment to doing quality work. At the same time, the willingness of the client to take the necessary time to fill out these documents and have these conversations signals their commitment to receiving value from their investment.

It’s not unusual for a new client to express some initial surprise when they realize how many questions you’re asking them to answer. Past experience may even have led them to expect half-hearted, sloppy work from other SEO agencies. But what you want to see is a willingness on their part to share everything they can about their company with you so that you can do your best work.

Anecdotally, I’ve fully refunded the down payments of a few incoming clients who claimed they couldn’t take the time to fill out my forms, because I detected in their unwillingness a lack of genuine commitment to success. These companies have, fortunately, been the exception rather than the rule for me, and likely will be for your agency, too.

It’s my hope that, with the right forms and a commitment to having important conversations with incoming clients at the outset, the work you undertake will make your Local team heroes to both your agency and your clients!

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from The Moz Blog http://tracking.feedpress.it/link/9375/3185995