Search marketing in China: the rise of so.com

Baidu, the leading Chinese search engine, is the third most popular search engine in the world, despite being mostly concentrated in and around China. That speaks clearly to the immense size and power of the Chinese market.

An estimated 507 million Chinese use search engines. This is an enormous marketplace for companies who want to grow overseas and engage with new prospective customers.

Although Google dominates much of the search engine traffic in North America and Europe, in China it is one of the least popular search engines.

Instead, Baidu, and its rising competitor Qihoo 360, control the landscape. Those interested in doing business in China will need to make sure they understand these search engines if they want to compete.

How is the Chinese market changing? – So.com

The market in China is quickly changing and evolving. Baidu has long dominated the search engine sphere, and it still controls an estimated 54% of the search engine market share. Over the past few years, however, a fast-rising competitor has been seizing an increasing percentage of the search volume.

baidu serp

Qihoo 360, a security software company, developed the search engine so.com. It only launched in 2012, but by 2015 it controlled an estimated 30% of the Chinese search market.

Its popularity has likely been influenced by the growth of mobile. By Q3 2014, mobile devices were the leading source of searches and revenue for Chinese search engine marketing, and Qihoo 360 built the most popular app store in China.

How is search engine marketing different in the APAC region than in the US?

Brands that want to expand into the APAC region need to be familiar with the local ranking factors and how to conduct SEO for the popular search engines, particularly Baidu and so.com, since optimizing for one will improve your rankings on both.

Tips for SEO in China:

Do not try to get a website ranked by using automatic translators or just students of the language. Using a native speaker will provide you with an infinitely superior site, as you will be able to avoid major grammatical errors, have the content flow more naturally, select more relevant keywords and use vocabulary that resonates better with the local audience. Your site will fit better overall into the framework of the Chinese digital ecosystem. Translation issues can hurt your reputation and cause you to rank lower on the SERPs.

When setting up a website, try to get a .cn domain. If that is not possible, then seek a .com or .net. Your website should also be hosted in China, and you should secure an ICP license from the Chinese Ministry of Industry and Information Technology. Avoid having multiple domains or subdomains.

It is imperative that you know the list of blacklisted words that cannot be posted online. Inclusion of these words can cause your site to be de-indexed or even taken down. Remember that your website cannot criticize the government in any way.

As you build the website, keep your title tags under 35 Simplified Chinese characters and your meta descriptions under 78.
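As a rough illustration (the Chinese text below is placeholder copy, not from a real site), the head of a page might look like this:

    <head>
      <meta charset="utf-8">
      <!-- Title tag: keep under roughly 35 Simplified Chinese characters -->
      <title>示例产品 - 示例公司</title>
      <!-- Meta description: keep under roughly 78 Simplified Chinese characters -->
      <meta name="description" content="示例公司提供优质示例产品，欢迎访问我们的网站了解更多信息。">
    </head>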

Website speed is highly valued. Regularly test your site to make sure it loads quickly. Inbound links are also viewed as valuable for search rankings, so finding opportunities to build a strong backlink profile can be very helpful.

All of the links you create should be in plain HTML. In general, avoiding JavaScript is preferred, because content in that format is sometimes not indexed.
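For instance, here is a crawlable plain HTML link next to a JavaScript-driven one; the URL and function name are purely illustrative:

    <!-- Plain HTML link: straightforward for Baidu and so.com to crawl -->
    <a href="http://www.example.cn/products/">产品列表</a>

    <!-- JavaScript-driven link: content loaded this way may not be indexed -->
    <a href="#" onclick="loadProducts(); return false;">产品列表</a>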

The Chinese search engines value fresh content. So regularly publishing on your page will help boost your reputation and success. You should submit your blog posts to the Baidu News Feed, which will help you attract new readers to your material.

For businesses interested in expanding into Asia, understanding how the local search engine market is evolving and changing can be critical to creating sites that rank well on the local search engines.

For businesses expanding globally beyond the US, make sure you also optimize for the leading search engines in other key regions, such as Naver (South Korea) and Yandex (Russia).

from Search Engine Watch https://searchenginewatch.com/2016/05/09/search-marketing-in-china-the-rise-of-so-com/

Google Search Console: a complete overview

The Search Console (or Google Webmaster Tools as it used to be known) is a completely free and indispensably useful service offered by Google to all webmasters.

Although you certainly don’t have to be signed up to Search Console in order to be crawled and indexed by Google, it can definitely help with optimising your site and its content for search.

Search Console Dashboard

Search Console is where you can monitor your site’s performance, identify issues, submit content for crawling, remove content you don’t want indexed, view the search queries that brought visitors to your site, monitor backlinks… there’s lots of good stuff here.

Perhaps most importantly though, Search Console is where Google will communicate with you should anything go wrong (crawling errors, manual penalties, increase in 404 pages, malware detected, etc.)

If you don’t have a Search Console account, then you should get one now. You may find that you won’t actually need some of the other fancier, more expensive tools that essentially do the same thing.

To get started, all you need is a Google sign-in, which you probably already have if you regularly use Google or Gmail. Then visit Search Console.

Then follow this complete guide which will take you through every tool and feature, as clearly and concisely as possible.

Please note: we published a guide to the old Webmaster Tools service, written by Simon Heseltine, back in 2014. This is an updated, rewritten version that reflects the changes and updates to Search Console since, but much of the credit should go to Simon for laying the original groundwork.

Add a property

If you haven’t already, you will have to add your website to Search Console.

Just click on the big red Add a Property button, then add your URL to the pop-up box.

add property in search console

Verification

Before Search Console can access your site, you have to prove to Google that you're an authorized webmaster. You don't have to be in charge, but you do need permission from whoever is.

There are five methods of verification for Search Console. There's no real preference as to which method you use, although Google does give prominence to its 'recommended method'…

1) The HTML file upload: Google provides you with an HTML verification file that you need to upload to the root directory of your site. Once you've done that, you just click on the provided URL, hit the verify button and you'll have full access to Search Console data for the site.

verify your site in search console

There are also four alternative methods if the above doesn’t suit…

alternate methods of uploading to Search Console

2) HTML tag: this provides you with a meta tag that needs to be inserted in the <head> section of your homepage, before the first <body> section.

If you make any further updates to the HTML of your homepage, make sure the tag is still in place, otherwise your verification will be revoked. If this does happen, you’ll just have to go through the process again.
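The tag Google gives you follows this general pattern (the content value below is a placeholder; use the exact tag Search Console generates for you):

    <head>
      <title>My Site</title>
      <!-- Verification tag supplied by Search Console; the token is unique to your site -->
      <meta name="google-site-verification" content="your-unique-verification-token">
    </head>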

3) Domain Name Provider: here you're presented with a drop-down list of domain registrars or name providers, then Google will give you a step-by-step guide for adding a TXT record to your DNS configuration.
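The record you add typically looks something like this (the token is a placeholder, and the exact layout of the zone entry depends on your DNS provider):

    example.com.   3600   IN   TXT   "google-site-verification=your-unique-verification-token"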

4) Google Analytics: assuming you’re using Google Analytics and your Google account is the same one you’re using for Search Console, then you can verify the site this way, as long as the GA code is in the <head> section of your home page (and remains there), and you have ‘edit’ permission.

5) Google Tag Manager: this option allows you to use your own Google Tag Manager account to verify your site, providing you’re using the ‘container snippet’ and you have ‘manage’ permission.

Now that you're verified, you'll be able to see your site on the Home screen, as well as any other sites you're a webmaster for. Here you can access the site, add another property and see how many unread messages you've received from Google.

Search Console Home

If you click on your site, you will be taken to its own unique Dashboard.

For the purposes of the following walk-throughs, I’ll be using my own website Methods Unsound, which means you can see all the things I need to fix and optimise in my own project.

Dashboard

Here’s where you can access all of your site’s data, adjust your settings and see how many unread messages you have.

Search Console Dashboard

The left-hand Dashboard Menu is where you can navigate to all the reports and tools at your disposal.

The three visualisations presented on the Dashboard itself (Crawl Errors, Search Analytics, and Sitemaps) are quick glimpses at your general site health and crawlability. These act as shortcuts to reports found in the left-hand menu, so we'll cover them as we walk through the tools.

Also note that Google may communicate a message directly on the dashboard, if it’s deemed important enough to be pulled out of your Messages. As you can see I have errors on my AMP pages that need fixing, but we’ll look at this when we get to the Dashboard Menu section further down.

First let’s take a look at settings…

Settings

Clicking on the gear icon in the top right corner will give you access to a variety of simple tools, preferences and admin features.

search console preferences

Search Console Preferences

This is simply where you can set your email preferences. Google promises not to spam you with incessant emails, so it's best to opt in.

Search Console Preferences - email

Site Settings

Here’s where you can set your preferred domain and crawl rate.

site settings

  • Preferred domain lets you set which version of your site you'd like indexed and whether your site shows up in search results with the www prefix or without it. Links may point to your site using http://www.example.com or http://example.com, but choosing a preference here sets how the URL is displayed in search. Google states that: "If you don't specify a preferred domain, we may treat the www and non-www versions of the domain as separate references to separate pages", thus cannibalising your search visibility. (A common companion step is a permanent server-side redirect to your preferred version; see the sketch after this list.)
  • Crawl rate lets you slow down the rate at which Googlebot crawls your site. You only need to do this if you're having server issues and crawling is definitely responsible for slowing down the speed of your server. Google has pretty sophisticated algorithms to make sure your site isn't hit by Googlebot too often, so this is a rare occurrence.
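Alongside the preferred domain setting, many webmasters also redirect one version of the domain to the other at the server level, so that visitors and links consolidate onto a single version. A minimal sketch for an Apache server with mod_rewrite (an assumption on my part; the domain is a placeholder), preferring the www version:

    RewriteEngine On
    # Send non-www requests to the preferred www version with a permanent redirect
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]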

Change of Address

This is where you tell Google if you’ve migrated your entire site to a new domain.

Search Console Change of Address

Once your new site is live and you’ve permanently 301 redirected the content from your old site to the new one, you can add the new site to Search Console (following the Add a Property instructions from earlier). You can then check the 301 redirects work properly, check all your verification methods are still intact on both old and new sites, then submit your change of address.

This will help Google index your new site more quickly than if you just left Googlebot to detect all your 301 redirects of its own accord.

Google Analytics Property

If you want to see Search Console data in Google Analytics, you can use this tool to associate a site with your GA account and link it directly with your reports.

Search Console Google Analytics Property

If you don’t have Google Analytics, there’s a link at the bottom of the page to set up a new account.

Users & Property Owners

Here you can see all the authorized users of the Search Console account, and their level of access.

Search Console Users and Property Owners

You can add new users here and set their permission level.

  • Anyone listed as an Owner will have permission to access every report and tool in Search Console.
  • Full permission users can do everything except add users, link a GA account, and inform Google of a change of address.
  • Those with Restricted permission have the same restrictions as Full permission users plus they only have limited viewing capabilities on data such as crawl errors and malware infections. Also they cannot submit sitemaps, URLs, reconsideration requests or request URL removals.

Verification Details

This lets you see all the users of your Search Console account, their personal email addresses and how they were verified (including all unsuccessful attempts).

Search Console verification details

You can unverify individuals here (providing you’re the owner).

Associates

Another Google platform, such as a Google+ or AdWords account, can be associated (or connected) with your website through Search Console. If you allow this association request, it will grant them capabilities specific to the platform they are associating with you.

Here’s an example direct from Google: “Associating a mobile app with a website tells Google Search to show search result links that point to the app rather than the website when appropriate.”

If you add an associate, they won’t be able to see any data in Search Console, but they can do things like publish apps or extensions to the Chrome Web Store on behalf of your site.

Search Console associates

Dashboard Menu

Here's where you'll find all the reports and tools available in Search Console.

Search Console Dashboard menu

Let’s look at each option one-by-one.

Messages

Here’s where Google communicates with webmasters.

Search Console All Messages

Again, you won’t get spammed here as Google promises not to bombard you with more than a couple of messages a month. You do need to pay attention when you do receive one though as this is where you’ll be informed if your site’s health is compromised.

This can be anything from a rise in 404 pages, to issues with crawling your site, or even more serious problems like your site being infected with malware.

Search Appearance

If you click on the ? icon to the right of 'Search Appearance', a handy pop-up will appear: the Search Appearance Overview, which breaks down and explains each element of the search engine results page (SERP).

Search appearance Dashboard

By clicking on each individual element, an extra box of information will appear telling you how to optimise that element to influence click-through, and where to find extra optimisation guidance within Search Console.

Search Console Dashboard explainer

Structured Data

Structured data is a way for a webmaster to add information to their site that informs Google about the context of any given webpage and how it should appear in search results.

For example, you can add star ratings, calorie counts, images or customer ratings to your webpage’s structured data and these may appear in the snippets of search results.

captain america civil war review rich snippet
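For example, review markup added as JSON-LD (one of the formats Google reads, alongside microdata and RDFa) might look roughly like this; the names and values are placeholders:

    <script type="application/ld+json">
    {
      "@context": "http://schema.org",
      "@type": "Review",
      "itemReviewed": { "@type": "Movie", "name": "Captain America: Civil War" },
      "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" },
      "author": { "@type": "Person", "name": "Example Reviewer" }
    }
    </script>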

The Structured Data section in Search Console contains information about all the structured data elements Google has located on your site, whether from Schema markup or other microformats.

structured data in search console

It will also show you any errors it has found while crawling your structured data. If you click on the individual ‘Data Types’ it will show you exactly which URLs contain that particular markup and when it was detected.

If you click one of the URLs listed, you can see a further breakdown of the data, as well as a tool to show exactly how it looks in live search results. Just click on ‘Test Live Data’ and it will fetch and validate the URL using Google’s Structured Data Testing Tool.

Search Console Structured Data test

Data Highlighter

Data Highlighter is an alternative to adding structured data to your HTML. As the explainer video below says, it’s a point and click tool where you can upload any webpage then highlight various elements to tell Google how you want that page to appear in search results.

There’s no need to implement any code on the website itself and you can set the Data Highlighter so it tags similar pages for you automatically.

To begin, click on the big red ‘Start Highlighting’ button…

Search Console Data Highlighter

Then enter the URL you wish to markup…

Search Console Data Highlighter upload

Then start highlighting and tagging…

structured data highlighter

After you hit publish, Google will take your added structured data into account once it has recrawled your site. You can also remove any structured data by clicking ‘Unpublish’ on the same page if you change your mind.

HTML Improvements

This is where Search Console will recommend any improvements to your meta descriptions and title tags, as well as informing you of any non-indexable content.

Search Console HTML Improvements

This is a very handy, easy-to-use feature that gives you optimisation recommendations that you can action right away.

For instance, if I click on the ‘Short meta descriptions’ link, I’ll be able to see the 14 URLs and their respective meta descriptions. I can then go into each one of these pages in my own CMS and add lengthier, more pertinent text.

Search Console HTML Improvements meta descriptions

Title tags and meta descriptions should be unique for each page and fall within certain character lengths, so for the purposes of both user experience and keeping Google informed about your site, this is a worthwhile report.

Sitelinks

Sitelinks are the subcategories that appear under the main URL when you search for a brand or a publisher.

sitelinks example

Sadly you can’t specify to Google which categories you want highlighted here, but if you’re popular enough and your site’s architecture is solid enough then these will occur organically.

However in the Sitelinks section of Search Console, you can tell Google to remove a webpage that you DON’T wish to be included as a sitelink in your search results.

Search Console Sitelinks

Accelerated Mobile Pages

This is a brand new tool, as Google’s AMP programme has only been available since earlier this year. AMP is a way for webmasters to serve fast-loading, stripped down webpages specifically to mobile users. Site speed and mobile friendliness are considered ranking signals so this is an important feature, although some SEOs are slow to adopt it.

As you can see from the report below, we’ve just started introducing AMP to our webpages and making a bit of a hash of it…

Search Console Accelerated Mobile Pages report

Accelerated Mobile Pages lets you see all the pages on your site with AMP implemented and which ones have errors. If you click on an error, you can see a list of the URLs affected by it; click on a URL and Google will recommend a fix.

Search Console Accelerated Mobile Pages fix

Clearly we have some custom JavaScript issues on our site that need addressing. If you click on the ‘Open Page’ button, you can see exactly how your AMP content appears on mobile.
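For reference, a valid AMP page follows a strict skeleton along these lines. This is a rough sketch: the mandatory amp-boilerplate style block is omitted for brevity and the URL is a placeholder. Author-written JavaScript, like the custom scripts flagged above, is not allowed on AMP pages.

    <!doctype html>
    <html ⚡>
    <head>
      <meta charset="utf-8">
      <script async src="https://cdn.ampproject.org/v0.js"></script>
      <link rel="canonical" href="http://www.example.com/original-article/">
      <meta name="viewport" content="width=device-width,minimum-scale=1">
      <!-- The required amp-boilerplate <style> block goes here (omitted for brevity) -->
    </head>
    <body>
      <h1>Article headline</h1>
    </body>
    </html>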

Search Traffic

Search Analytics

Search Analytics tells you how much traffic you get from search, revealing clicks and impressions delivered on SERPs. It will also work out your click-through rate (CTR) and reveal your average organic position for each page.

And here’s the *really* good stuff… you can also see the queries that searchers are using in order to be served your site’s content.

Search Console Search Analytics

The data for this is collected differently from Google Analytics, so don't expect the two to tally. What this feature is really useful for, however, is seeing which keywords and phrases are driving traffic to your site, as well as individual traffic-generating pages.

You can toggle between a variety of options, filters and date-ranges. I highly recommend looking at Impressions and CTR, to see which pages are generating high visibility but low click-through rate. Perhaps all these pages need is a tweak of a meta-description or some structured data?

Links to Your Site

Here’s where you can see the domains that link to your site and its content the most, as well as your most linked webpages.

Search Console Links to Your Site

This isn’t an exhaustive list, but a good indicator of where your content is appreciated enough to be linked. Clicking on the URLs on the right hand-side will show where they’re being linked to individually.

Internal Links

Here is where you can see how often each page on your site has been internally linked. Clicking on each ‘Target page’ will show a list of URLs where the internal link occurs.

Search Console Internal Links

There is a limit to how many ‘Target pages’ Search Console will show you, but if you have a small number of pages you can reverse the sort order and see which target pages have zero internal links. You can then go into your site and give these pages an internal link, or redirect them to somewhere else if they’re old legacy pages.

Manual Actions

This is where Google will inform you if it has administered a manual action to your site or specific webpage.

GWT Manual Actions

Google will offer any recommendations for you to act upon here, and will give you the chance to resubmit your site for reconsideration after you’ve fixed any problems.

Here’s a guide to what Google will most likely give you a manual penalty for and how you can avoid it.

International Targeting

Here you can target an audience based on language and country.

Search Console International Targeting

  • Country: If you have a neutral top-level domain (.com or .org), geotargeting helps Google determine how your site appears in search results, particularly for geographic queries. Just pick your chosen country from the drop-down menu. If you don’t want your site associated with any country, select ‘Unlisted’.
  • Language: If you manage a website for users speaking a different language, you need to make sure that search results display the correct version of your pages. To do this, insert hreflang tags in your site's HTML, as this is what Google uses to match a user's language preference to the right version of your pages (see the example below). Alternatively, you can use sitemaps to submit language and regional alternatives for your pages.
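A minimal hreflang sketch for a site with British English and French versions (the URLs are placeholders), placed in the <head> of each page:

    <link rel="alternate" hreflang="en-gb" href="http://www.example.com/en-gb/" />
    <link rel="alternate" hreflang="fr-fr" href="http://www.example.com/fr-fr/" />
    <link rel="alternate" hreflang="x-default" href="http://www.example.com/" />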

Mobile usability

As mobile has overtaken desktop for searches this year, obviously your site has to be mobile-friendly, otherwise you’re providing a poor user experience to potentially half your visitors.

This report tells you about any issues your site has with mobile usability, and you'll really want to be seeing the following message, as Google explicitly states you'll otherwise be demoted.

Search Console Mobile Usability

Possible errors that will be highlighted by Search Console here include:

  • Flash usage: mobile browsers do not render Flash-based content, so don’t use it.
  • Viewport not configured: visitors to your site use a variety of devices with differing screen sizes, so your pages should specify a viewport using the meta viewport tag (see the example after this list).
  • Fixed-width viewport: viewports fixed to a pixel-size width will flag up errors. Responsive design should help solve this.
  • Content not sized to viewport: if a user has to scroll horizontally to see words and images, this will come up as an error.
  • Small font size: if your font size is too small to be legible and requires mobile users to ‘pinch to zoom’ this will need to be changed.
  • Touch elements too close: tappable buttons that are too close together can be a nightmare for mobile visitors trying to navigate your site.
  • Interstitial usage: Google will penalise you if you’re using a full-screen interstitial pop-up to advertise an app when a user visits your mobile site.
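Most of the viewport errors above come down to one missing tag. The standard responsive declaration, placed in the <head> of every page, looks like this:

    <meta name="viewport" content="width=device-width, initial-scale=1">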

Google Index

Index Status

This lets you know how many pages of your website are currently included in Google’s index.

Search Console Index Status

You can quickly see any worrying trends from the last year (for instance that little dip in May 2015), as well as any pages that have been blocked by robots or removed.

Content Keywords

Here you can see the most common keywords found by the Googlebots as they last crawled your site.

Search Console Content Keywords

If you click on each keyword, you’ll be able to see the other synonyms found for that keyword, as well as the number of occurrences.

As Simon Heseltine suggests, look out for unexpected, unrelated keywords showing up as it’s an indication your site may have been hacked and hidden keywords have been injected into your pages.

Blocked resources

This section lets you know of any images, CSS, JavaScript or other resources on your site that are blocked to Googlebot.

Search Console Blocked Resources

These are listed by hostname and then by specific pages, and you can follow the suggested steps to diagnose and resolve each issue.

Remove URLs

This is essentially where you can make your content disappear from Google.

remove urls search console

This only acts as a temporary fix, but by the time you’ve done this and either deleted your offending webpage or 301 redirected it elsewhere, there theoretically should no longer be a record of it.

Just enter the URL then select whether you want it removed from the search results and the cache, just from the cache or if you want an entire directory removed.

Be warned: this request can take between two and 12 hours to be processed.

Crawl

Crawl Errors

This report shows all the errors that Google has found when crawling your site over the last 90 days.

Search Console Crawl Errors

Site errors: the top half of the screen shows three tabs, where if you click on each you can see any past problems with your DNS, your server connectivity or whether a crawl had to be postponed. (Google will postpone a crawl rather than risk crawling URLs you don’t want indexed).

URL errors: the bottom half of the screen shows URL errors for desktop, smartphone and feature phone (a phone that can access the internet, but doesn’t have the advanced features of a smartphone).

You’ll likely see reports for the following on all three device types:

  • Server error: Google can’t access your site because the server is too slow to respond, or because your site is blocking Google.
  • Soft 404: this occurs when your server returns a real page for a URL that doesn't actually exist on your site. You should make these pages return a 404 (Not Found) or 410 (Gone) status code instead (see the example after this list).
  • Not found: these are all your 404 pages that occur when a Googlebot attempts to visit a page that doesn’t exist (because you deleted it or renamed it without redirecting the old URL, etc.) Generally 404 pages are fine and won’t harm your rankings, so only pay attention to the ones related to high-ranking content.
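As an illustration of the soft 404 fix, on an Apache server mod_alias can return a proper 410 for a page you have removed for good (the path is a placeholder, and the equivalent configuration differs on other servers):

    # Return 410 (Gone) for a permanently removed page instead of serving a "not found" page with a 200 status
    Redirect gone /old-page/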

Crawl Stats

This section shows the progress of Googlebots crawling your site in the last 90 days.

Search Console Crawl Stats

You can see how fast your pages are being crawled, kilobytes downloaded per day and average time spent downloading pages on your site.

Spikes are perfectly normal, and there’s not very much you can do about them. But if you see a sustained drop in any of these charts then it might be worth investigating to see what’s dragging it down.

Fetch as Google

Here you can check how any page on your website is seen by Google once it's been crawled.

You can also submit these webpages for indexing. You may find this is a quicker way to be crawled and indexed than if you were to let Google find the page automatically.

Search Console Fetch as Google

  • When you ‘Fetch’ a page, Google will simulate a crawl and you can quickly check any network connectivity problems or security issues with your site.
  • ‘Fetch and Render’ does the same as the above, but it also lets you check how the page itself looks on mobile or desktop, including all resources on the page (such as images and scripts) and will let you know if any of these are blocked to Googlebots.

Remember the crawler is meant to see the same page as the visitor would, so this is a good way to get a direct on-page comparison.

If the page is successfully fetched and rendered, you can submit it to the index. You are allowed 500 webpage fetches per week, but you can only submit a webpage and have Google crawl ALL the pages linked within it 10 times per month.

robots.txt Editor

A robots.txt file, placed in the root of your site, is where you can specify pages you don't want crawled by search engines. Typically this is used because you don't want your server overwhelmed by Googlebot, particularly if you want it to ignore script or style files, or if you want certain images not to appear in Google Image Search.
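A simple robots.txt might look like this (the disallowed paths are placeholders for whatever you don't want crawled):

    User-agent: *
    Disallow: /scripts/
    Disallow: /private-images/

    Sitemap: http://www.example.com/sitemap.xml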

Here is where you can edit your robots.txt and check for errors. The bottom of the page reveals your errors and warnings.

robots.txt editor search console

Sitemaps

Sitemaps are hosted on the server of your website and they basically inform search engines of every page of your site, including any new ones added. It’s a good way to let Google better crawl and understand your website.
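A bare-bones XML sitemap, following the standard sitemaps.org format, looks something like this (the URLs and dates are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2016-05-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/blog/latest-post/</loc>
        <lastmod>2016-05-08</lastmod>
      </url>
    </urlset>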

Here’s where you can access all of the information about any sitemaps either submitted manually or found by Search Console. The blue bar represents pages or images submitted, the red bar represents actual pages and images indexed.

sitemaps search console

You can test a sitemap by clicking the ‘Add/Test sitemap’ button, and if it’s valid you can then add it to Search Console.

URL Parameters

As Simon Heseltine has previously commented, this section isn’t used much anymore since the introduction of canonical tags.

However you should use URL Parameters if, for instance, you need to tell Google to distinguish between pages targeted to different countries. These preferences can encourage Google to crawl a preferred version of your URL or prevent Google from crawling duplicate content on your site.

URL parameters

Security Issues

Although any security issues will be communicated to you in the Messages section and on the Dashboard screen, here's where you can check on problems in more detail.

Search Console Security Issues

There’s also plenty of accessible information here about how to fix your site if it’s been hacked or been infected with malware.

Other Resources

Here's where you can access all the tools provided by Google outside of Search Console, including the Structured Data Testing Tool and Markup Helper, which relate to the structured data features covered in earlier sections.

Search Console Other Resources

Other helpful resources here include Google My Business, which you can use to improve your business's local search visibility, and the PageSpeed Insights tool, which will tell you exactly how well your site performs on mobile and desktop in terms of loading time, and how to fix any issues.

from Search Engine Watch https://searchenginewatch.com/2016/05/09/google-search-console-a-complete-overview/

Local SEO: Key challenges and tips from #ClickZChat

In previous ClickZChat sessions we’ve largely covered content and platforms, but seeing as it’s a Twitter event held by both ClickZ AND Search Engine Watch, it seemed only right that we spend some time looking at search in more depth.

This week we took to Twitter for an hour to ask our followers for their local SEO challenges and solutions. Here’s everything we learned:

Question 1: What are the biggest challenges you face when optimizing for local search?

  • SPAM

Several users (including me) mentioned spam being a much bigger issue for local ranking, with maps being particularly open to abuse, and search engines slower to act on this than in other cases.

  • Citations

Many people also felt that citations were a hassle for a variety of reasons.

  • Resources

Indeed, the issue of keeping up to date was seen as a major challenge. Data is often fragmented and many organisations with several locations do not have the time or resources to roll out best practice – or even standard practice – to all location listings, with local stores and outlets being left to fend for themselves:

This issue is compounded when you consider the lack of SEO expertise on site. In many cases it simply isn’t considered an issue.

With that said, it was also felt that this state of affairs meant there were big opportunities for those businesses that are getting it right, with small changes making a big difference.

Q2: What are the absolute essentials for a decent local SEO presence?

This is where those quick wins we mentioned really come into their own. As our own Graham Charlton mentioned, not enough businesses are taking time to claim their Google Business listings:

Of course, once you do start claiming listings, you need to have a consistent data structure in mind. Google will focus on listings that are formatted correctly in multiple locations:

Once you have your listings in order, there’s also a big case to be made for (you guessed it) content. While there’s no doubt that technical optimisation plays a huge part, it is worth remembering that with so many local searches taking place on mobile devices, user intent is the primary motivator.

  • Reviews

This of course brings us into the realm of reviews, a hugely important component for local business. Even if you lack the resources to optimise your listings properly, this can still make you stand out to a certain extent:

Finally our very own Andrew Warren-Payne mentioned this useful list from Moz, very helpful if you want to get organised:

Q3: What one local SEO tip has proven the most successful for you?

We had a rush of great suggestions to this question, so I’ve pulled them into a quick reference list of ‘Golden rules for Local SEO’ for you:

1: Build on your past success
Leverage existing product content. Marketing reaches across the aisle to customer success. #SEO improves via @colincrook

2: Be as focused as you can on the needs of the local customer
We created separate web pages for each of the locations, with unique content & optimised them with local keywords – @anshikamails

3: Good local SEO takes time. Make time to maintain it
Build citations. An oldie but a goldie. https://www.brightlocal.com/2013/09/11/top-50-local-citation-sites/ – @Lexx2099

4: But just doing what you can will help

In some areas, just the basics of listings and data are enough if your competitors aren't up to speed. #ClickZChat – @gcharlton

5: Remember that Google services are linked together. Focus on the bigger picture
Google business page creation and posting in G+ page. – @shaileshk

Be sure to publish FROM Google+ TO OTHER platforms.. #ClickZChat – @steveplunkett

6: Get your data in order

Site 1st with NAP for all locations, Category, Description, Social, reviews, schema & repeat in citations #ClickZChat – @rajnijjer

7: And make sure you never stop learning

SEO changes so fast that it’s hard for anything to be easy! :p Best advice: stay aware & current on industry trends! – @hilph

8: Remember why people are searching in the first place

And of course, we can always rely on Search Engine Watch’s editor to chime in with some useful advice…

That’s it for this week. A huge thank you as always to everyone who took part. We’ll be holding another chat this Wednesday at 12 noon Eastern Time.

For more on Local SEO, check out Graham Charlton’s handy list of 30 quick and easy SEO tips for small businesses.

from Search Engine Watch https://searchenginewatch.com/2016/05/09/local-seo-key-challenges-and-tips-from-clickzchat/

Should you republish old content?

Lots of content you write is timeless. One year from now this article about republishing old content will still be as valid as it is today. Still, if you don’t share it or talk about it, very few people will notice it. A way to make sure your content won’t be forgotten is to republish it. But what’s a smart strategy for that? You don’t want to annoy your audience with old news. In this post, I’ll talk you through different ways to republish your old content.

Republish old content

Why would you republish old content?

A lot of content is valid for a longer period of time. And your audience changes and grows. Things you wrote a year ago probably won't be read by your new audience. So it's a waste of quality copy if you publish it only once.

Moreover, sometimes an article or blog post isn't picked up properly the first time. Maybe your timing was off. An article that was published in summer perhaps got little attention because of a very hot day. If you share posts on Facebook, you'll most definitely notice that some posts are shared and liked much more than others. The reason why some posts are picked up by a large audience, while others aren't, isn't necessarily related to the quality of your post. Republishing can be a way to give your content a second chance to reach your audience.

Make sure your content is up to date!

The most important advice on republishing old content is that you should never republish anything that isn't up to date. Nobody wants to read something that is outdated or no longer applicable. So before republishing, you should do some reviewing and possibly some rewriting!

Republishing content with minor changes

If you want to republish an article in which you’ve made minor changes, we would advise you to change the last modified date. That way, people are able to see when the article or post was altered last. It’s instantly clear that the information is still up to date.

We would advise you to hide the comments on a post you republish. It just looks weird if you push out an article with comments that were made a year earlier.

Once updated, push out your article using social media like Facebook and Twitter, or write about the article in your newsletter. You can mention that you wrote this post some time ago and that the information is still very useful. You can also choose to treat the republished post as a normal post and do the things you normally do to draw attention to new content.

Republishing old content with major changes

Sometimes you’ll need to make major adjustments to articles. Things can change entirely, making your old article rather useless. Or, your opinion or advice on how to handle certain things might change.

If you make big changes to an article, rewriting the entire text, we would advise you to publish it as if it were new content. You'll then change the date of the article. Changing the date will enable you to keep all of the links from other websites to your original post (which is great for SEO, of course).

Conclusion

Republishing can be a great way to get extra attention to those great articles you wrote some time ago. Make sure to keep those articles up to date, though. And, don’t go overboard! You shouldn’t republish your articles every other week. If people notice you’re publishing the same blogpost again and again, they’ll definitely get annoyed!

Read more: ‘10 tips for an awesome and SEO-friendly blog post’ »

from Yoast • The Art & Science of Website Optimization https://yoast.com/republish-old-content/

Seven most interesting search marketing news stories of the week

Welcome to our weekly round-up of all the latest news and research from around the world of search marketing and beyond.

This week we have a bountiful collection of news, a heaving trove of stats and a swollen haul of insight from the last seven days.

These adjectives will make more sense in about one headline’s time.

Google AI is improving its conversational skills with… romance novels

Yep.

my fair viking

According to Buzzfeed – YES that's where we get our intel from – for the past few months, Google has been feeding text from romance novels into its AI engine because they've determined that "parsing the text of romance novels could be a great way of enhancing the company's technology with some of the personality and conversational skills it lacks."

Buzzfeed also reports that the plan seems to be working: "Google's research team recently got the AI to write sentences that resemble those in the books."

So expect your next Google search for a well reviewed local restaurant to include at least 12 synonyms for ‘throbbing’.

Google will launch the redesigned AdWords on May 24th

And you can watch the launch live, if that’s the sort of thing you like doing with your time.

AdWords is being redesigned for the mobile-first market and aesthetically will fall into line with Google's recently launched 360 suite.

Here’s a sneak peek:

Redesigned AdWords

You can get an early demo during the Google Performance Summit livestream on May 24th at 9:00am PT/12:00pm ET, which you can sign-up for here.

Seats are being booked up fast though, so hurry.

Jk, it’s on the internet. You just need to nudge the cat off the sofa.

Half of SEOs are either unaware of AMP or only have a “passing awareness”

As Rebecca Sentance reported this week, SEOs have been slow to implement Google’s accelerated mobile pages (AMP) in the two months since its official launch, despite the promise that AMP is an important ranking signal.

A survey, carried out by SEO PowerSuite, looked at awareness and uptake of AMP among 385 SEO professionals in North America and Europe. Of the respondents surveyed, less than a quarter (23%) had implemented AMP for their mobile sites.

Although general awareness of Accelerated Mobile Pages was high – 75% of the SEO professionals surveyed were aware of AMP – 21% said they were only aware of it “in passing.”

A column graph showing awareness of AMP among SEOs surveyed, with 21% of SEOs aware of AMP "in passing", 35% "have done SOME research" into AMP, 18% "have done A LOT of research" into AMP, while 25% are "not aware" of AMP.

Of those SEOs who hadn’t yet begun to implement AMP on their mobile sites, only 29% said they would do so in the next six months, and 5% of respondents said they had no intention of supporting AMP on their mobile sites whatsoever.

180% increase in websites being hacked in 2015

This week we reported on Google’s fight against webspam in 2015, revealing the following info:

  • An algorithmic update helped reduce the amount of webspam in search results, impacting 5% of queries.
  • Google sent more than 4.3 million messages to webmasters notifying them of manual actions it had imposed on sites affected by spam.
  • Google saw a 33% increase in the number of sites that went through a spam clean-up “towards a successful reconsideration process.”

Most worrying of all was the massive 180% increase in hacking from 2014. If you haven’t already, it’s time to seriously think about the security of your website.

Google is moving all blogspot domain blogs to HTTPS

This week, Google has introduced an HTTPS version for every blogspot domain blog, meaning that visitors can access any blogspot domain blog over an encrypted channel.

https

Google has also removed the HTTPS Availability setting and all blogs will automatically have a HTTPS version enabled. So you don’t have to do a thing, and you may even get a little traffic boost as secure servers are seen as a ranking signal.

Google has taken action against sneaky mobile redirects

To tackle the trend of websites redirecting mobile users to spammy, unrelated domains, Google has issued manual penalties against sites that sneakily redirect users in this way.

sneaky mobile redirects

If your site has been affected, Google offers help on getting rid of these redirects to clean up your site and hopefully avoid further action.

Moz has introduced a new, free to use, keyword research tool

The new Keyword Explorer, launched by Moz this week, can help take you all the way through the keyword research process. It has a variety of useful metrics, including estimating the relative CTR of organic results, and it surfaces results from almost all popular sources used by SEOs.

And best of all, you can run 2 free searches per day without logging in, another five with a free community account, and if you’re a Pro subscriber you already have full access.

keyword explorer

from Search Engine Watch https://searchenginewatch.com/2016/05/06/seven-most-interesting-search-marketing-news-stories-of-the-week/

Positioning your shop in the online market

Successful positioning adds value to your business and gives you a head start on the competition. Positioning is the art of distinguishing your business from others in the minds of your customers. You can make your webshop stand out through high product quality, great service, low prices or dedicated care for the environment. But it's equally important to communicate this distinctive factor to your target group. Your position is in their minds. In this post I'll help you construct your desired position for your webshop.

positioning your shop online

The fifth P

Every marketing expert in the world knows the name of Philip Kotler. And even if you don't know that name, you must have heard of the four P's: product, price, place and promotion. These were the core of every marketing strategy when I studied marketing decades ago. Since then, many have added their own extra P's, like people and purpose. Philip Kotler himself mentions another P as well: positioning.

Definition of positioning

Kotler defines positioning as:

“the act of designing the company’s offering and image to occupy a distinctive place in the mind of the target market. The end result of positioning is the successful creation of a customer-focused value proposition, a cogent reason why the target market should buy the product.”
(Philip Kotler: Marketing Management, 2003)

This is closely related to finding your niche market. In my post about finding your shop’s niche, I explained how a product and target audience can be considered shop shapers. You can build an entire shop just based on the right product and the right market. Since positioning is about finding your spot in the mind of the target market, it’s clear that emotions play a part as well.

Questions to ask yourself

If you want to position your shop, it might help to ask yourself some questions:

  • Who is my ideal customer? Not in terms of budget, but in terms of values.
  • What are my personal values and how do these relate to my products or company?
  • What do I consider the core competences of my company and how can I make these visible?
  • What brands do I like and how would people associate our company with these brands?
  • What are current trends in my market and what can our products contribute to that?

It’s not that simple to answer these questions. It’s quite heavy stuff, come to think of it. Especially since it’s almost all emotions. But thinking about these topics can help you find your shop’s position.

Construct your shop’s position

There is a simple way to construct your position. First define the following variables:

  1. Company name
  2. Product
  3. Target market
  4. Needs of your target market
  5. Distinctiveness of your company

That might require some research, and perhaps you haven’t thought about a number of these variables. But when you have defined them, your brand position will be something like this:

[Company] supplies [product] to [target market], looking for [needs]. [Company] distinguishes itself from competitors by [distinctiveness].

Some examples

This is quite a strict format; you should of course craft it to fit you as a person or your company. Let's look at some possible examples for known companies.

Coca-Cola

Cola is popular worldwide and is liked by people of all age groups while the diet coke targets the niche segment for people who are more health conscious. Coca Cola uses competitive positioning strategy to be way ahead of its competitors in the non-alcoholic beverages market.
(Source: Marketing91.com)

Patagonia

Build the best product, cause no unnecessary harm, use business to inspire and implement solutions to the environmental crisis for males, females, and children, at any age that love the outdoors. Patagonia calls out other companies with “environmental initiatives” to beat theirs.
(Source: Adventures in Branding)

Body Shop

The Body Shop expects its customers to view its products as beauty products with great quality, from a trustworthy brand. The fact that its products have a compelling natural, ethical and environmental story is an added advantage, and how it differentiates its brand from other big mainstream brands and retailers, instead of ethical or charity purchases to customers.
(Source: Natural Cosmetics Lovers)

Note that these aren’t the brand positions these companies set up in their mission statement or marketing plans. These are the positions that others imagine these companies have or had. These examples are simply here to illustrate to you what your position could be.

So what to do?

Find the elements that your desired clients would look for in a product or company. And find the areas where you want to, and are able to, distinguish yourself from your competition. Kotler refers to these as 'points-of-parity' and 'points-of-differentiation'. That sums it up quite nicely, I think.

Positioning is the first thing to do, and creating buzz should be the second. I strongly agree with that. Tell the world about your brand position! Use your blog, use social media, even use your site design to express your values and position your (company and) products in an online market with competition from all over the world.

Make sure your buzz is related to your products. Animal testing and the environment could be topics for your blog, if you want to position your company as conscious. Write about promotions and other sales if your desired position is to be the cheapest online perfume outlet ever. Positioning is about distinctiveness and relevance.

Over to you

What about your shop? Do you have a hard time constructing your shop's position? Or do you manage to occupy a "distinctive place in the mind of the target market"? Share your experience in the comments below!

Read more: ‘Find your shop’s niche’ »

from Yoast • The Art & Science of Website Optimization https://yoast.com/positioning-your-shop-online/

Do bounce rates affect a site’s search engine ranking?

The bounce rate debate continues…

Bounce rates and how they affect a website’s ranking on Google has been discussed, dissected, and dismembered over and over again.

As fully transcribed on this site, a conversation between Rand Fishkin, CEO of Moz, and Andrey Lipattsev, Google’s search quality senior strategist, led to a surprising discussion on click and bounce rates affecting search rankings.

Rand stated that he has recently been running a few experimental tests with various crowds of 500 to a couple thousand people.

Everyone participating was prompted to take out their cellphones, laptops, and digital what-have-yous and perform a specific search. Once the search listing appeared, he had everyone in the crowd click one of the listings at the bottom of the results page and then click away from that site. He then monitored the results over the next few days.

Rand found a whole bunch of inconsistencies. In a little more than half of the experiments, the ranking did change on the search engine results page (SERP), and in a little less than half of the experiments, the rankings did not change.

This begs the question:

Do bounce rates affect a site’s search engine ranking? If so, how much?

Lipattsev believes that for each individual search query in the experiment, the generated interest regarding those specific searches impacts the rankings change rather than just the clicks and bounces.

He said that if a certain topic is gaining a substantial amount of searches and an increase in social media mentions, Google would pay more attention to that rather than a site getting more clicks.

Lipattsev says that it is certainly doable to determine exactly what causes a large rankings jump for an individual listing, but Internet-wide, it is much more difficult.

All this being said, what actually is a bounce rate?

The bounce rate is the percentage of visitors to a particular site who navigate or “bounce” away after only viewing that individual webpage.
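As a simple illustration with made-up numbers: if a page receives 1,000 entrances and 600 of those visitors leave without viewing anything else, that page's bounce rate is 600 / 1,000 = 60%.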

Usually, the term ‘bounce rate’ has a negative connotation associated with it. People think that if a visitor only visits one page and then leaves, it’s bad for business. Their logic isn’t that flawed, either. After all, a high bounce rate would indicate that a site does not have the high-quality, relevant content Google wants out of its top ranked sites.

A great Search Engine Journal article shows nine negative reasons why your website could potentially have a high bounce rate, including poor web design, incorrect keyword selection, improper links, and just bad content. It’s true that these high bounce rates can reflect poorly on a website… sometimes.

So, what gives?

Having a high bounce rate on something like a 'contact us' page can actually be a good thing. That's more of a call-to-action page, where the goal is to have the user find the contact information and then actually contact the business. The visitor got what they came for and then left. Extra navigation around the website doesn't really mean anything in this case.

Of course, if your site is more content-driven or offers a product or service, then your goal should be to have a higher click-through rate (CTR) and more traffic to each page.

bouncy castles

But what about Google?

Does Google know your bounce rate and are they using it to affect rankings? This Search Engine Roundtable article provides the short answer (which is “no”).

Many organizations don’t use Google Analytics, so Google has no way of tracking their bounce rate information. And even with the analytics that they can trace, it’s difficult to determine what they actually mean because every situation is different.

There are many factors that go into determining how long a visitor stays on a particular webpage. If a visitor remains on a site for over 20 minutes, they could be so engaged with your site’s content that they can’t even imagine leaving your wonderful webpage… or… it could mean they fell asleep at the screen because your website was so boring. It’s too difficult to tell.

If you are operating one of those websites that should have a lower bounce rate, these tips on lowering that number should be able to help. Some highlights include making sure each of your pages loads quickly, offers user-friendly navigation, avoids cluttered advertisements, and features quality content!

If bounce rates don't affect Google's rankings as much as you thought, you wonder how significant other ranking factors are. Well, Google recently revealed that magical information, narrowing it down to the three top ranking factors used by Google to drive search results:

  • Links: strong links and link votes play a major role in search rankings.
  • Content: having quality content is more important than ever.
  • RankBrain: Google’s AI ranking system.

It's no shock that links and content matter, but RankBrain is still relatively new. It's Google's new algorithm to help determine search results (after factoring in links and content). RankBrain filters more complex searches and converts them into shorter ones, all the while maintaining the complexity of the search, thus refining the results.

Google’s newest AI technology – and whatever other secret technologies they are working on – may resolve the never-ending debate over bounce rates, but it’s certainly going to be a difficult process.

More research is to come. Andrey believes bounce rate and click data is "gameable", which is the challenge in making it a strong and measurable ranking metric, and Google still has a long way to go.

“If we solve it, good for us,” Andrey said, “but we’re not there yet.”

There is no one-size-fits-all answer when it comes to SEO and all its intricacies. The greatest answer to any SEO question is always “it depends.”

from Search Engine Watch https://searchenginewatch.com/2016/05/04/do-bounce-rates-affect-a-sites-search-engine-ranking/

How Google fights webspam and what you need to learn from this

Google has this week revealed its annual report on how it has policed the internet over the last 12 months. Or at least how it policed the vast chunk of the internet it allows on its results pages.

Although it’s self-congratulatory stuff, and as much as you can rightfully argue with some of Google’s recent penalties, you do need to understand what Google is punishing in terms of ‘bad quality’ internet experiences so you can avoid the same mistakes.

It’s important to remember that Google for some people IS the internet, or at least the ‘front door’ to it (sorry Reddit), but it’s equally important to remember that Google is still a product; one that needs to make money to survive and (theoretically) provide the best possible experience for its users, or else it is off to DuckDuckGo they… uh… go.

Google therefore has to ensure the results it serves on its SERPs (search engine results pages) are of the highest quality possible. Algorithms are built and manual reviews by actual human beings are carried out to ensure crappy websites with stolen/thin/manipulative/harmful content stay hidden.

Here’s how Google is currently kicking ass and taking names… and how you can avoid landing in its crosshairs.


How Google fought webspam

According to Google, an algorithmic update helped reduce the amount of webspam in search results, impacting 5% of queries.

The remaining spam was tackled manually. Google sent more than 4.3 million messages to webmasters notifying them of manual actions it had imposed on sites affected by spam.

Following this, Google saw a 33% increase in the number of sites that went through a spam clean-up “towards a successful reconsideration process.” It’s unclear whether the remaining sites are still in the process of appealing, or have been booted off the face of the internet.

Who watches the watchmen?

More than 400,000 spam reports were manually submitted by Google users around the world. Google acted on 65% of them, and considered 80% of those acted upon to be spam.

Hacking

There was a huge 180% increase in websites being hacked in 2015, compared to the previous year. Hacking can take on a number of guises, whether it’s website spam or malware, but the result will be the same. You’ll be placed ‘in quarantine’ and your site will be flagged or removed.

Google has a number of official guidelines on how to help avoid being hacked. These include:

  • Strengthen your account security with long passwords that are difficult to guess or crack, and don’t reuse those passwords across platforms (a minimal sketch of password generation follows this list).
  • Keep your site’s software updated, including its CMS and various plug-ins.
  • Research how your hosting provider handles security issues and check its policy when it comes to cleaning up hacked sites. Will it offer live support if your site is compromised?
  • Use tools to stay informed of potential hacked content on your site. Signing up to Search Console is a must, as it’s Google’s way of communicating any site issues with you.
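
On the first point, here is a minimal sketch of what “lengthy, difficult to guess” passwords can look like in practice, generated with Python’s standard secrets module. The length and character set are illustrative choices, not part of Google’s guidelines.

    import secrets
    import string

    def generate_password(length: int = 24) -> str:
        """Return a long, random password drawn from letters, digits and punctuation."""
        alphabet = string.ascii_letters + string.digits + string.punctuation
        return "".join(secrets.choice(alphabet) for _ in range(length))

    # One unique password per platform, so a breach of one account cannot be replayed elsewhere.
    for account in ("cms-admin", "hosting-panel", "search-console"):
        print(account, generate_password())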


Thin, low quality content

Google saw an increase in the number of sites with thin, low quality content, a substantial amount likely to be provided by scraper sites.

Unfortunately, there is very little you can do if your site is being scraped, as Google has discontinued its reporting tool and believes this problem to be your own fault. You just have to be confident that your own site’s authority, architecture and remaining content are enough to ensure it ranks higher than a scraper site.

If you have been served a manual penalty for ‘thin content with little or no added value’ there are things you can do to rectify it, which can mostly be boiled down to ‘stop making crappy content, duh’.

1) Start by checking your site for the following:

  • Auto-generated content: automatically generated content that reads like it was written by a piece of software because it probably was.
  • Thin content pages with affiliate links: affiliate links in quality articles are fine, but pages whose affiliate content consists of descriptions or reviews copied directly from the original retailer, with no original content added, are bad. As a rule, affiliate content should form only a small part of your site.
  • Scraped content: if you’re a site that automatically scrapes and republishes entire articles from other websites without permission then you should just flick the off-switch right away.
  • Doorway pages: these are pages which can appear multiple times for a particular query’s search results but ultimately lead users to the same destination. The purpose of doorway pages is purely to manipulate rankings.

2) Chuck them all in the bin.

3) If after all that you’re 100% sure your site somehow offers value, then you can resubmit to Google for reconsideration.

For more information on Google’s fight against webspam, read its official blog post.

And finally, I’ll leave you with this terrifying vision of things to come…


from Search Engine Watch https://searchenginewatch.com/2016/05/04/how-google-fights-webspam-and-what-you-need-to-learn-from-this/
via Auto Feed

The reviews are on sale! And more news

Do you need to improve the SEO of your website? Do you want our Yoast SEO experts to thoroughly analyze the SEO of your website? You should definitely order a website review now! Our Gold SEO reviews are on sale until May 18 and will cost only $599 instead of $799.

Sale on our SEO reviews

Change in types of SEO reviews

We’ve decided to simplify our assortment of SEO reviews a bit. Up until now you could choose between four types of reviews: Silver, Gold, Diamond and Platinum. As of today, we’ll offer two types of reviews. You can choose our Gold SEO review in which we give lots of practical advice. Or, you can choose our Platinum SEO review, which is a full audit of your website. The Gold SEO review is on sale and costs only $599 (instead of $799). Our Platinum SEO review costs $2999.

Upcoming: Yoast Consulting project

As of next month, we’ll offer a new type of review. At Yoast, we regularly get questions from people who need more guidance in SEO than our reviews can give them. Also, our SEO team likes to carry out more in-depth SEO projects. They love to really dive into a website and give high quality and personal advice. That’s why, as of next month, we’ll start offering Yoast Consulting projects.

In a Yoast Consulting project, we’ll look at every aspect of your website with our complete SEO team! This team consists of Joost, Michiel, Annelieke, Judith, Jaro, Michelle, Patrick and Meike. You’ll receive a complete analysis and many practical tips. We’ll start with an intake meeting by Skype (or you can come by our office in the Netherlands). Later we’ll also have a Skype follow-up meeting, to make sure you’re completely satisfied. A Yoast Consulting project will cost $10,000. We’ll only do one Yoast Consulting project a month, as it will take much of our time. If you’re interested in purchasing a Yoast Consulting project, make sure to contact us.

Read more: ‘What our website reviews can do for you’ »

from Yoast • The Art & Science of Website Optimization https://yoast.com/reviews-sale-news/
via KCG Auto Feed

Announcing Keyword Explorer: Moz’s New Keyword Research Tool

Posted by randfish

A year ago, in April of 2015, I pitched a project internally at Moz to design and launch a keyword research tool, one of the few areas of SEO we’ve never comprehensively tried to serve. The pitch took effort and cajoling (the actual, internal pitch deck is available here), but eventually received approval, with one big challenge… We had to do it with a team already dedicated to the maintenance and development of our rankings collections and research tools. This project wouldn’t get additional staffing — we had to find a way to build it with only the spare bandwidth of this crew.

Sure, we didn’t have the biggest team, or the ability to work on the project free from our other obligations, but we had grit. We had passion. We wanted to prove ourselves to our fellow Mozzers and to our customers. We had pride. And we desperately wanted to build something that wasn’t just “good enough,” but was truly great. Today, I think we’ve done that.

Meet our new keyword research tool, Keyword Explorer:

If you want to skip hearing about it and just try it out, head on over. You can run 2 free searches/day without even logging in, another 5 with a free community account, and if you’re a Pro subscriber, you’ve already got access. For those who want to learn more, read on!

The 5 big, unique features of Keyword Explorer

Keyword Explorer (which we’ve taken to calling “KWE” for short) has lots of unique features, metrics, and functionality, but the biggest ones are pretty obvious and, we believe, highly useful:

  1. KWE takes you all the way through the keyword research process — from discovering keyword ideas to getting metrics to building a list, filtering the keywords on it, and prioritizing which ones to target based on the numbers that matter.
  2. KWE features metrics essential to the SEO process — two you’re familiar with — Volume and Difficulty — and three that are less familiar: Opportunity, Importance, and Potential. Opportunity estimates the relative CTR of the organic web results on a SERP. Importance is a metric you can modify to indicate a keyword that’s more or less critical to your campaign/project. And Potential is a combination of all the metrics built to help you prioritize a keyword list.
  3. Our volume score is the first volume estimation metric we know of that goes beyond what AdWords reports. We do that using Russ Jones’ volume bucket methodology and adding in anonymized clickstream data from ~1 million real searchers in the US. From there, Russ has built a model that predicts the search volume range a keyword is likely to have with ~95% accuracy.
  4. Keyword suggestions inside KWE come from almost all the sources we saw SEOs accessing manually in their research processes — Keyword Planner data, Google Suggest, Related Searches, other keywords that the ranking pages also ranked for, topic-modeling ideas, and keywords found from our clickstream data. All of these are available in KWE’s suggestions.
  5. Import and export functionality are strongly supported. If you’ve already got a list of keywords and just want KWE’s metrics, you can easily upload that to us and we’ll fetch them for you. If you like the KWE process and metrics, but have more you want to do in Excel, we support easy, powerful, fast exports. KWE is built with power users in mind, so go ahead and take advantage of the tool’s functionality however works best with your processes.

These five are only some of the time-saving, value-adding features in the tool, but they are, I think, enough to make it worthwhile to give Keyword Explorer a serious look.

A visual walkthrough

As an experiment, I’ve created a visual, slide-by-slide walkthrough of the tool. If you’d rather *see* vs. read the details, this format might be for you:

The Power of Moz’s Keyword Explorer from Rand Fishkin

 

And, for those of you who prefer video, we made a short, 2 minute demo of the tool in that format, too:

 

Of course, there’s a ton of nuance and complexity in a product like this, and given Moz’s dedication to transparency, you can find all of that detail in the more thorough explanation below.

Keyword Explorer’s metrics

KWE’s metrics are among the biggest data-driven advances we’ve made here at Moz, and a ton of credit for that goes to Dr. Pete Meyers and Mr. Russ Jones. Together, these two have crafted something extraordinary — unique metrics that we’ve always needed for SEO-based keyword research, but never had before. Those include:

Keyword volume ranges

Nearly every keyword research tool available uses a single source for volume data: Google AdWords’ Keyword Planner. We all know from studying it that the number AdWords provides is considerably off from reality, and last year, Moz’s Russ Jones was able to quantify those discrepancies in his blog post: Keyword Planner’s Dirty Secrets.

Since we know that Google’s numbers don’t actually have precision, but do indicate a bucket, we realized we could create ranges for volume and be significantly more accurate, more of the time. But, that’s not all… We also have access to anonymized clickstream data here at Moz, purchased through a third-party (we do NOT collect or use any of our own user data via, for example, the MozBar), that we were able to employ in our new volume ranges.

Using sampling, trend data, and the number of searchers and searches for a given keyword from the clickstream, combined with AdWords’ volume data, we produced a volume range that, in our research, showed ~95% accuracy with the true impression counts Google AdWords would report for a keyword whose ad showed during a full month.
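
To make the idea concrete, here is a rough, hypothetical sketch of how a bucketed AdWords number and a clickstream-derived estimate might be blended into a range. The bucket edges, panel size and blending rule below are assumptions for illustration only; they are not the model we actually use.

    # Hypothetical illustration only: the bucket edges, panel size and blending
    # rule are assumptions, not the production volume model.
    ADWORDS_BUCKETS = [0, 10, 100, 1_000, 10_000, 100_000, 1_000_000]  # assumed bucket edges

    def adwords_bucket(reported_volume: int) -> tuple[int, int]:
        """Map AdWords' rounded volume onto the bucket it implies."""
        for low, high in zip(ADWORDS_BUCKETS, ADWORDS_BUCKETS[1:]):
            if low <= reported_volume < high:
                return low, high
        return ADWORDS_BUCKETS[-1], ADWORDS_BUCKETS[-1] * 10

    def clickstream_estimate(panel_searches: int, panel_size: int, searchers: int = 250_000_000) -> int:
        """Extrapolate searches seen in the panel to the full searching population."""
        return round(panel_searches / panel_size * searchers)

    def volume_range(reported_volume: int, panel_searches: int, panel_size: int) -> tuple[int, int]:
        low, high = adwords_bucket(reported_volume)
        estimate = clickstream_estimate(panel_searches, panel_size)
        # Narrow the AdWords bucket around the clickstream estimate, without leaving the bucket.
        return max(low, round(estimate * 0.7)), min(high, round(estimate * 1.3))

    print(volume_range(reported_volume=2_400, panel_searches=12, panel_size=1_000_000))  # -> (2100, 3900)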

We’re pretty excited about this model and the data it produces, but we know it’s not perfect yet. As our clickstream data grows, and our algorithm for volume improves, you should see more and more accurate ranges in the tool for a growing number of keywords. Today, we have volume data on ~500mm (half a billion) English-language search queries. But, you’ll still see plenty of “no data” volume scores in the tool as we can access considerably more terms and phrases for keyword suggestions (more on suggestion sources below).

NOTE: KWE uses volume data modeled on the quantity of searches in the US for a given term/phrase (global English is usually 1.5-3X those numbers). Thus, while the tool can search any Google domain in any country, the volume numbers will always be for US-volume. In the future, we hope to add volume data for other geos as well.

An upgraded, more accurate Keyword Difficulty score

The old Keyword Difficulty tool was one of Moz’s most popular (it’s still around for another month or so, but will be retired soon in favor of Keyword Explorer). But, we knew it had a lot of flaws in its scoring system. For Keyword Explorer, we invested a lot of energy in upgrading the model. Dr. Pete, Dr. Matt Peters, myself, and Russ had 50+ reply email threads back and forth analyzing graphs, suggesting tweaks, and tuning the new score. Eventually, we came up with a Keyword Difficulty metric that:

  • Has far more variation than the old model — you’ll see way more scores in the 20s and 30s as well as the 80s and 90s than the prior model, which put almost every keyword between 50–80.
  • Accounts for pages that haven’t yet been assigned a PA score by using the DA of the domain.
  • Employs a smarter, CTR-curve model to show when weaker pages are ranking higher and a page/site may not need as much link equity to rank.
  • Adjusts for a few domains (like Blogspot and WordPress) where DA is extremely high, but PA is often low and the inherited domain authority shouldn’t pass on as much weight to difficulty.
  • Concentrates on however many results appear on page 1, rather than the top 20 results.

This new scoring model matches better with my own intuition, and I think you’ll find it vastly more useful than the old model.
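
As a thought experiment, a page-1 difficulty score in the spirit of the bullets above might look like the sketch below: use each result’s Page Authority, fall back to a discounted Domain Authority where no PA exists yet, and weight stronger positions more heavily with a CTR-style curve. The curve values, fallback discount and scaling are placeholder assumptions for illustration, not the production formula.

    # Illustrative only: the CTR-style weights and DA fallback discount are
    # placeholder assumptions, not the production Keyword Difficulty formula.
    CTR_CURVE = [0.30, 0.15, 0.10, 0.07, 0.05, 0.04, 0.03, 0.03, 0.02, 0.02]  # assumed position weights

    def keyword_difficulty(page_one_results: list[dict]) -> float:
        """page_one_results: items like {"pa": 45 or None, "da": 60}, in ranking order."""
        weighted, total_weight = 0.0, 0.0
        for position, result in enumerate(page_one_results[:len(CTR_CURVE)]):
            authority = result["pa"] if result.get("pa") else result["da"] * 0.9  # DA fallback for new pages
            weighted += authority * CTR_CURVE[position]
            total_weight += CTR_CURVE[position]
        return round(weighted / total_weight, 1) if total_weight else 0.0

    serp = [
        {"pa": 72, "da": 90},
        {"pa": None, "da": 85},  # page with no PA yet -> falls back to discounted DA
        {"pa": 38, "da": 40},
    ]
    print(keyword_difficulty(serp))  # -> 67.0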

As you can see from one of my lists above (for Haiku Deck, whose board I joined this year), the difficulty ranges are considerably higher than in the past, and more representative of how relatively hard it would be to rank in the organic results for each of the queries.

A true Click-Through Rate Opportunity score

When you look at Google’s results, it’s pretty clear that some keywords are worthy of pursuit in the organic web results, and some are not. To date, no keyword research tool we know of has attempted to accurately quantify that, but it’s a huge part of determining the right terms and phrases to target.

Once we had access to clickstream data, we realized we could accurately estimate the percent of clicks on a given search result based on the SERP features that appeared. For example, a classic, “ten-blue-links” style search result had 100% of click traffic going to organic results. Put a block of 4 AdWords ads above it, though, and that dropped by ~15%. Add a knowledge graph to the right-hand side and another ~10% of clicks are drawn away.

It would be crazy to treat the prioritization of keywords with loads of SERP features and little CTR on the organic results the same as a keyword with few SERP features and tons of organic CTR, so we created a metric that accurately estimates Click-Through-Rate (CTR), called “Opportunity.”
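
As a rough sketch of the idea (reusing the ~15% and ~10% figures above; the other deductions are placeholder assumptions rather than measured values), an Opportunity-style estimate can start at 100% organic CTR and subtract a share for each SERP feature present:

    # The ads and knowledge graph deductions echo the figures mentioned above;
    # the rest are placeholder assumptions, not measured values.
    CTR_DEDUCTIONS = {
        "adwords_top_block": 15,  # block of 4 ads above the organic results
        "knowledge_graph": 10,    # panel on the right-hand side
        "instant_answer": 12,     # assumed
        "news_results": 5,        # assumed
        "image_pack": 5,          # assumed
    }

    def opportunity_score(serp_features: list[str]) -> int:
        """Estimated share of clicks left for the organic results, 0-100."""
        score = 100
        for feature in serp_features:
            score -= CTR_DEDUCTIONS.get(feature, 0)
        return max(score, 0)

    print(opportunity_score([]))                                        # ten blue links -> 100
    print(opportunity_score(["adwords_top_block", "knowledge_graph"]))  # -> 75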

The search above for “Keanu” has an instant answer, knowledge graph, news results, and images (further down). Hence, its Opportunity Score is a measly 37/100, which means our model estimates ~37% of clicks go to the organic results.

But, this search, for “best free powerpoint software” is one of those rare times Google is showing nothing but the classic 10 blue links. Hence, its Opportunity Score is 100/100.

If you’re prioritizing keywords to target, you need this data. Choosing keywords without it is like throwing darts with a blindfold on — someone’s gonna get hurt.

Importance scores you can modify

We asked a lot of SEOs about their keyword research process early in the design phases of Keyword Explorer and discovered pretty fast that almost everyone does the same thing. We put keyword suggestions from various sources into Excel, get metrics for all of them, and then assign some type of numeric representation to each keyword based on our intuition about how important it is to this particular campaign, or how well it will convert, or how much we know our client/boss/team desperately wants to rank for it.

That self-created score was then used to help weight the final decision for prioritizing which terms and phrases to target first. It makes sense. You have knowledge about keywords both subjective and objective that should influence the process. But it needs to do so in a consistent, numeric fashion that flows with the weighting of prioritization.

Hence, we’ve created a toggle-able “Importance” score in Keyword Explorer:

After you add keywords to a list, you’ll see the Importance score is, by default, set to 3/10. We chose this number to make it easy to increase a keyword’s importance by 3X and easy to bring it down to 1/3rd. As you modify the importance value, overall Keyword Potential (below) will change, and you can re-sort your list based on the inputs you’ve given.

For example, in my list above, I set “free slideshow software” to 2/10, because I know it won’t convert particularly well (the word “free” often does not). But, I also know that churches and religious organizations love Haiku Deck and find it hugely valuable, so I’ve bumped up the importance of “worship presentation software” to 9/10.

Keyword Potential

In order to prioritize keywords, you need a metric that combines all the others — volume, difficulty, opportunity, and importance — with a consistent, sensible algorithm that lets the best keywords rise to the top. In Keyword Explorer, that metric is “Potential.”

Sorting by Potential shows me keywords that have lots of search volume, relatively low difficulty, relatively high CTR opportunity, and uses my custom importance score to push the best keywords to the top. When you build a list in Keyword Explorer, this metric is invaluable for sorting the wheat from the chaff and identifying the terms and phrases with the most promise.
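
The exact blending isn’t spelled out here, so treat the following only as a hypothetical sketch of how volume, difficulty, opportunity and a user-set importance could be rolled into one sortable number; the normalisation, weights and metric values below are made up for illustration.

    # Hypothetical blend only; the real Potential formula is not spelled out here.
    def potential(volume: int, difficulty: int, opportunity: int, importance: int = 3) -> float:
        """difficulty and opportunity are 0-100; importance is 1-10, defaulting to 3."""
        volume_factor = min(volume, 100_000) / 100_000  # crude normalisation, assumed
        score = volume_factor * (100 - difficulty) * opportunity * (importance / 3)
        return round(score / 100, 1)

    # Metric values below are made up for illustration.
    keywords = [
        {"kw": "worship presentation software", "volume": 1_700, "difficulty": 38, "opportunity": 90, "importance": 9},
        {"kw": "free slideshow software", "volume": 9_500, "difficulty": 55, "opportunity": 70, "importance": 2},
    ]
    for row in sorted(keywords, key=lambda r: potential(r["volume"], r["difficulty"], r["opportunity"], r["importance"]), reverse=True):
        print(row["kw"], potential(row["volume"], row["difficulty"], row["opportunity"], row["importance"]))

Under this toy weighting, the importance bump is enough to push “worship presentation software” above the higher-volume “free slideshow software,” which is exactly the kind of re-ordering the Importance score is meant to enable.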

Keyword research & the list building process

Keyword Explorer is built around the idea that, starting from a single keyword search, you can identify suggestions that match your campaign’s goals and include them in your list until you’ve got a robust, comprehensive set of queries to target.

List building is easy — just select the keywords you like from the suggestions page and use the list selector in the top right corner (it scrolls down as you do) to add your chosen keywords to a list, or create a new list:

Once you’ve added keywords to a list, you can go to the lists page to see and compare your sets of keywords:

Each individual list will show you the distribution of metrics and data about the keywords in it via these helpful graphs:

The graphs show distributions of each metric, as well as a chart of SERP features to help illustrate which types of results are most common in the SERPs for the keywords on your list:

For example, you can see in my Rock & Grunge band keywords, there’s a lot of news results, videos, tweets, and a few star reviews, but no maps/local results, shopping ads, or sitelinks, which makes sense. Keyword Explorer is using country-level, non-personalized, non-geo-biased results, and so some SERPs won’t match perfectly to what you see in your local/logged-in results. In the future, we hope to enable even more granular location-based searches in the tool.

The lists themselves have a huge amount of flexibility. You can sort by any column, add, move, or delete in bulk, filter based on any metric, and export to CSV.
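
If you would rather do that last mile in your own scripts after an export, the same kind of filtering and sorting is only a few lines of Python; the column names here are assumptions for illustration, not the exact export headers.

    import csv

    # Assumed structure; the real CSV export headers may differ.
    keywords = [
        {"keyword": "presentation software", "difficulty": 62, "potential": 71},
        {"keyword": "worship presentation software", "difficulty": 38, "potential": 84},
    ]

    # Filter on one metric, sort by another, then write a shortlist back out.
    shortlist = sorted((k for k in keywords if k["difficulty"] < 60), key=lambda k: k["potential"], reverse=True)

    with open("keyword_shortlist.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["keyword", "difficulty", "potential"])
        writer.writeheader()
        writer.writerows(shortlist)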

If your list gets stale, and you need to update the metrics and SERP features, it’s just a single click to re-gather all the data for every keyword on your list. I was particularly impressed with that feature; to me it’s one of the biggest time-savers in the application.

Keyword Explorer’s unique database of search terms & phrases

No keyword research tool would be complete without a massive database of search terms and phrases, and Keyword Explorer has just that. We started with a raw index of over 2 billion English keywords, then whittled that down to the ~500 million highest-quality ones, collapsing lots of odd suggestions we found via iterative crawls of AdWords, autosuggest, related searches, Wikipedia titles, topic modeling extractions, SERPscape (via our acquisition last year) and more into those we felt relatively confident had real volume.

Keyword Explorer’s suggestions corpus features six unique filters to get back ideas. We wanted to include all the types of keyword sources that SEOs normally have to visit many different tools to get, all in one place, to save time and frustration. You can see those filters at the top of the suggestions page:

The six filters are:

  1. Include a Mix of Sources
    • This is the default filter and will mix together results from all the others, as well as ideas crawled from Google Suggest (autocomplete) and Google’s Related Searches.
  2. Only Include Keywords With All of the Keyword Terms
    • This filter will show only suggestions that include all of the terms you’ve entered in the query. For example, if you entered “mustache wax,” this filter would only show suggestions that contain both the word “mustache” and the word “wax” (a small sketch of this and the following filter appears after the list).
  3. Exclude Your Query Terms to Get Broader Ideas
    • This filter will show only suggestions that do not include your query terms. For example, if you entered “mustache wax,” suggestions might include “facial grooming products” or “beard oil” but nothing with either “mustache” or “wax.”
  4. Based on Closely Related Topics
    • This filter uses Moz’s topic modeling algorithm to extract terms and phrases we found on many web pages that also contained the query terms. For example, keywords like “hair gel” and “pomade” were found on many of the pages that had the words “mustache wax” and thus will appear in these suggestions.
  5. Based on Broadly Related Topics and Synonyms
    • This filter expands upon the topic modeling system above to include synonyms and more broadly related keywords for a more iterative extraction process and a wider set of keyword suggestions. If “Closely Related Topics” suggestions are too far afield for what you’re seeking, this filter often provides better results.
  6. Related to Keywords with Similar Results Pages
    • This filter looks at the pages that ranked highly for the query entered and then finds other search terms/phrases that also contained those pages. For example, many pages that ranked well for “mustache wax” also ranked well for searches like “beard care products” and “beard conditioner” and thus, those keywords would appear in this filter. We’re big fans of SEMRush here at Moz, and this filter type shows suggestions very similar to what you’d find using their competitive dataset.
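
As a minimal sketch of the logic behind filters 2 and 3, using the “mustache wax” example from the list (the candidate suggestions are made up):

    def contains_all_terms(query: str, suggestion: str) -> bool:
        """Filter 2: keep suggestions that contain every query term."""
        terms, words = query.lower().split(), suggestion.lower().split()
        return all(term in words for term in terms)

    def excludes_query_terms(query: str, suggestion: str) -> bool:
        """Filter 3: keep suggestions that contain none of the query terms."""
        terms, words = query.lower().split(), suggestion.lower().split()
        return not any(term in words for term in terms)

    candidates = ["mustache wax recipe", "beard oil", "wax for mustache", "facial grooming products"]
    print([s for s in candidates if contains_all_terms("mustache wax", s)])    # ['mustache wax recipe', 'wax for mustache']
    print([s for s in candidates if excludes_query_terms("mustache wax", s)])  # ['beard oil', 'facial grooming products']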

Some of my favorite, unique suggestions come from the “closely related topics” filter, which uses that topic modeling algorithm and process. Until now, extracting topically related keywords required using something like Alchemy API or Stanford’s topic modeling software combined with a large content corpus, aka a royal pain in the butt. The KWE team, mostly thanks to Erin, built a suitably powerful English-language corpus, and you can see how well it works:

NOTE: Different filters will work better and worse on different types of keywords. For newly trending searches, topic modeling results are unlikely to be very good, and on longer tail searches, they’re not great either. But for head-of-demand-curve and single word concepts, topic modeling often shows really creative lexical relationships you wouldn’t find elsewhere.

SERPs Analysis

The final feature of Keyword Explorer I’ll cover here (there are lots of cool nooks and crannies I’ve left for you to find on your own) is the SERPs Analysis. We’ve broadened the ability of our SERP data to include all the features that often show up in Google’s results, so you’ll see a page much more representative of what’s actually in the keyword SERP:

Holy smack! There’s only 3 — yes, THREE — organic results on page one for the query “Disneyland.” The rest is sitelinks, tweets, a knowledge graph, news listings, images — it’s madness. But, it’s also well-represented in our SERPs Analysis. And, as you can see, the Opportunity score of “7” effectively represents just how little room there is for organic CTR.

Over time, we’ll be adding and supporting even more features on this page, and trying to grab more of the metrics that matter, too (for example, after Twitter pulled their tweet counts, we had to remove those from the product and are working on a way to get them back).

Yes, you can buy KWE separately (or get it as part of Moz Pro)

Keyword Explorer is the first product in Moz Pro to be available for purchase separately. It’s part of the efforts we’ve been making with tools like Moz Local, Followerwonk, and Moz Content to offer our software independently rather than forcing you to bundle if you’re only using one piece.

If you’re already a Moz Pro subscriber, you have access to Keyword Explorer right now! If you’re not a subscriber and want to try it out, you can run a few free queries per day (without list building functionality though). And, if you want to use Keyword Explorer on its own, you can buy it for $600/year or $1,800/year depending on your use.

The best part of Keyword Explorer — we’re going to build what you want

There’s lots to like in the new Keyword Explorer, but we also know it’s not complete. This is the first version, and it will certainly need upgrades and additions to reach its full potential. That’s why, in my opinion, the best part of Keyword Explorer is that, for the next 3–6 months, the team that built this product is keeping a big part of their bandwidth open to do nothing but make feature additions and upgrades that YOU need.

It was pretty amazing to have the team’s schedule for Q2 and Q3 of 2016 make the top priority “Keyword Explorer Upgrades & Iterations.” And, in order to take advantage of that bandwidth, we’d love to hear from you. We have dozens (maybe hundreds) of ideas internally of what we want to add next, but your feedback will be a huge part of that. Let us know through the comments below, by tweeting at me, or by sending an email to Rand at Moz.com.

A final note: I want to say a massive thanks to the Keyword Explorer team, who volunteered to take on much more than they bargained for when they agreed to work with me :-) Our fearless, overtime-investing, never-complaining engineers — Evan, Kenny, David, Erin, Tony, Jason, and Jim. One of the best designers I’ve ever worked with — Christine. Our amazingly on-top-of-everything product manager — Kiki. Our superhero-of-an-engineering-manager — Shawn. Our bug-catching SDETs — Uma and Gary. Our product marketing liaison — Brittani. And Russ & Dr. Pete, who helped with so many aspects of the product, metrics, and flow. You folks all took time away from your other projects and responsibilities to make this product a reality. Thank you.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from The Moz Blog http://tracking.feedpress.it/link/9375/3231846
via Auto Feed