Do 50% of adults really not recognise ads in search results?

Around half of adults are unable to recognise ads in Google’s search results, according to a survey. 

This surprising statistic comes from Ofcom’s Adults’ media use and attitudes report, released this month.

While I’ve seen studies suggesting that many people don’t know the difference between paid and organic results, it seems bizarre that 50% could look at a set of results like those below and still not spot the ads.

paid and organic results

The stats

For Ofcom’s study, ‘adults who use search engines’ were shown a picture of the SERPs for ‘walking boots’.

This is what the SERP looks like now, but the study was carried out in 2015, so the shopping results were not there at that time. As the study says:

“Their attention was drawn to the first three results at the top of the list, which were distinguished by an orange box with the word ‘Ad’ written in it. They were then prompted with three options and asked whether any of these applied to these first three results.”

walking boots

The 1,328 survey respondents were allowed to select more than one answer so, for example, some may have said that the ads were both paid links and the best results.

Understanding of paid-for results returned by Google searches, among adults who use search engine websites or apps:

ofcom 1

To clarify the results: 60% identified them as paid links, while 49% selected only that (correct) answer and nothing else.

Ofcom also split the results between newer and more established internet users. Newer users in this case are defined as those who first went online less than five years ago. There were 160 newer users surveyed, and 1,113 established users.

These are the responses to the same question as before, split by newer and established users:

ofcom 2

In a nutshell: newer users were less likely to identify that the results with the yellow ad label were indeed paid results. 34% of newer and 51% of established users gave only the correct answer.

I asked Andrew Girdwood, Head of Media Technology at Cello Signal, about the findings. He was pretty surprised:

“I’ve closely followed the evolution of disclosure in search engine ads over the years. At one point the lines were blurred – Yahoo’s paid inclusion, for example, traded your money for some sort of organic search position. Those days, in Europe and America, are long gone. Regulators on both sides of the Atlantic watch closely.

The ad badge updates to Google’s paid search should have made it crystal clear the listing has been paid for. We’re talking about a bright yellow “Ad” label beside the result. How can you miss it? Searching for a competitive keyword? Google returns a whole column of Ad, Ad, Ad and Ad mentions. The badge leaps off the page to me.

It is just short of mind-boggling that 50% of searchers in the UK can’t see the Ad disclosure. When Steve Krug published “Don’t Make Me Think” in 2000 to offer advice on web usability, I wonder if he had imagined an audience as digitally savvy, yet as web-blind, as this.”

Other studies into PPC ads

I’ve looked at this issue before. In 2014, I reported on stats from UX firm Bunnyfoot, which found that 36% didn’t know that PPC ads were indeed ads (a previous study from the same firm produced a figure of 41%).

This was a relatively small sample – 103 people took UX tests with eye-tracking technology and were asked afterwards if they saw any ads.

With the help of Dan Barker, I carried out further tests on this using two separate polls of more than 2,000 UK internet users in total. We asked:

  1. Are people aware of the existence of ads on Google Search?
  2. Do they believe they click Google ads? And, if so, how frequently?

The results were very different from Ofcom’s, with only around 10% not seeing ads in Google results.

However, the very presence of the word ‘ad’ in the question perhaps implied to respondents that there are ads on Google, and gave them a clue about the answer.

There was another study by Varn earlier this year which produced a similar answer to that from Ofcom.

This time, 1,010 UK internet users were asked a similar question, and 50.6% couldn’t identify the ads:


It is tricky to devise the perfect test for this issue. If you ask users questions, there is the obvious temptation for them to second-guess the answer and say what they think is the right answer, rather than just answering honestly.

The Ofcom test, which shows users the results and asks the question directly, seems sound enough to me. And since several different studies have found a reasonably high percentage of people not recognising ads, I can only conclude that there’s something in this.

Why can’t people see the ads?

This is the big question. As someone who has worked in digital for more than 10 years, it’s hard to imagine.

After all, there’s a pretty clear yellow ad label next to the results. You can hardly accuse Google of not disclosing the nature of the link.

However, Google has taken steps which some would interpret as reducing the visibility of ads. Remember, Google has an interest in increasing the number of clicks on its ads.

For example, PPC ads used to be shaded until a couple of years ago, though there were no ad labels.

PPC ads shaded

Recently, Google has experimented with green ad labels. The reason is unclear, but it could be a way to help the ad label blend in with the URL text. Or it could simply be one of a series of experiments to find the best performing format.

green ad labels

I suspect this is similar to banner blindness, in which people have become immune to, or have learned to ignore, the elements on the page that don’t interest them.

Indeed, plenty of eye-tracking studies have shown that users will simply not look at certain elements on a page. Could it be that users are looking at the top results and simply not seeing (or processing) the ‘ad’ label?

Whatever the reason, and whatever the exact proportion of search users who don’t recognise ads in Google, it seems clear that there is an issue here.

from Search Engine Watch
via Auto Feed

Measuring Content: You’re Doing it Wrong

Posted by MatthewBarby

The traditional ways of measuring the success or failure of content are broken. We can’t just rely on metrics like the number of pageviews/visits or bounce rate to determine whether what we’re creating has performed well.

“The primary thing we look for with news is impact, not traffic,” says Jonah Peretti, Founder of BuzzFeed. One of the ways that BuzzFeed have mastered this is with the development of their proprietary analytics platform, POUND.

POUND enables BuzzFeed to predict the potential reach of a story based on its content, understand how effective specific promotions are based on the downstream sharing and traffic, and power A/B tests — and that’s just a few examples.

Just because you’ve managed to get more eyeballs onto your content doesn’t mean it’s actually achieved anything. If that were the case then I’d just take a few hundred dollars and buy some paid StumbleUpon traffic every time.

Yeah, I’d generate traffic, but it’s highly unlikely to result in me achieving some of my actual business goals. Not only that, but I’d have no real indication of whether my content was satisfying the needs of my visitors.

The scary thing is that the majority of content marketing campaigns are measured this way. I hear statements like “it’s too difficult to measure the performance of individual pieces of content” far too often. The reality is that it’s pretty easy to measure content marketing campaigns on a micro level — a lot of the time people don’t want to do it.

Engagement over entrances

Within any commercial content marketing campaign that you’re running, measurement should be business goal-centric. By that I mean that you should be determining the overall success of your campaign based on the achievement of core business goals.

If your primary business goal is to generate 300 leads each month from the content that you’re publishing, you’ll need to have a reporting mechanism in place to track this information.

On a more micro-level, you’ll want to be tracking and using engagement metrics to enable you to influence the achievement of your business goals. In my opinion, all content campaigns should have robust, engagement-driven reporting behind them.

Total Time Reading (TTR)

One metric that Medium uses, which I think adds a lot more value than pageviews, is “Total Time Reading (TTR).” This is a cumulative metric that quantifies the total number of minutes spent reading a piece of content. For example, if 10 visitors each spent 1 minute reading one of my blog articles, its total reading time would be 10 minutes.

“We measure every user interaction with every post. Most of this is done by periodically recording scroll positions. We pipe this data into our data warehouse, where offline processing aggregates the time spent reading (or our best guess of it): we infer when a reader started reading, when they paused, and when they stopped altogether. The methodology allows us to correct for periods of inactivity (such as having a post open in a different tab, walking the dog, or checking your phone).” (source)

The reason this is more powerful than raw pageviews is that it takes into account how engaged your readers are, giving a more accurate picture of how much the content is actually being consumed. You could have an article with 1,000 pageviews that has a greater TTR than one with 10,000 pageviews.
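As a rough illustration, TTR can be approximated by summing per-visit read-time events. This is a hypothetical sketch, not Medium’s actual pipeline; the event format and numbers below are invented:

```python
# Minimal sketch of a Total Time Reading (TTR) aggregation.
# Assumes you already collect (post_id, seconds_read) events, e.g. from
# periodic scroll-position pings; the event list below is hypothetical.
from collections import defaultdict

def total_time_reading(events):
    """Sum seconds read per post and return TTR in minutes."""
    ttr = defaultdict(float)
    for post_id, seconds in events:
        ttr[post_id] += seconds
    return {post: secs / 60 for post, secs in ttr.items()}

# 10 visitors reading the same article for 60 seconds each -> 10 minutes TTR
events = [("my-article", 60)] * 10
print(total_time_reading(events))  # {'my-article': 10.0}
```

In a real pipeline you would also need the inactivity corrections Medium describes (pauses, background tabs), which this sketch ignores.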

Scroll depth & time on page

A related and simpler metric to acquire is the average time on page (available within Google Analytics). The average time spent on your webpage will give a general indication of how long your visitors are staying on the page. Combining this with ‘scroll depth’ (i.e. how far down the page has a visitor scrolled) will help paint a better picture of how ‘engaged’ your visitors are. You’ll be able to get the answer to the following:

“How much of this article are my visitors actually reading?”

“Is the length of my content putting visitors off?”

“Are my readers remaining on the page for a long time?”

Having the answers to these questions is really important when it comes to determining which types of content are resonating more with your visitors.
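One hedged way to answer the first question is to compare average time on page with how long a full read of the article would take. The reading speed and the figures below are assumptions for illustration, not values from any analytics tool:

```python
# Rough sketch: "How much of this article are my visitors actually reading?"
# Compares average time on page (from your analytics) with the time a full
# read would take at an assumed reading speed. All figures are illustrative.

AVG_READING_WPM = 238  # assumed silent-reading speed; tune to your audience

def estimated_read_fraction(word_count, avg_time_on_page_secs):
    """Fraction of the article an average visit could have read (capped at 1)."""
    full_read_secs = word_count / AVG_READING_WPM * 60
    return min(avg_time_on_page_secs / full_read_secs, 1.0)

# A 5,000-word post where visitors average 90 seconds on the page:
frac = estimated_read_fraction(5000, 90)
print(f"{frac:.0%} of the article read, on average")  # ~7%
```

Pairing this estimate with actual scroll-depth data tells you whether visitors are skimming the whole page quickly or reading a small part of it carefully.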

Social Lift

BuzzFeed’s “Social Lift” metric is a particularly good way of understanding the ‘virality’ of your content (you can see this when you publish a post to BuzzFeed). BuzzFeed calculates “Social Lift” as follows:

((Social Views)/(Seed Views)+1)

Social Views: Traffic that’s come from outside BuzzFeed; for example, referral traffic, email, social media, etc.

Seed Views: Owned traffic that’s come from within the BuzzFeed platform; e.g. from appearing in BuzzFeed’s newsfeed.

BuzzFeed Social Lift

This is a great metric to use when you’re a platform publisher as it helps separate out traffic that’s coming from outside of the properties that you own, thus determining its “viral potential.”

There are ways to use this kind of approach within your own content marketing campaigns (without being a huge publisher platform) to help get a better idea of its “viral potential.”

One simple calculation can just involve the following:

((social shares)/(pageviews)+1)

This simple stat can be used to determine which content is likely to perform better on social media, and as a result it will enable you to prioritize certain content over others for paid social promotion. The higher the score, the higher its “viral potential.” This is exactly what BuzzFeed does to understand which pieces of content they should put more weight behind from a very early stage.

You can even take this to the next level by replacing pageviews with TTR to get a more representative view of engagement to sharing behavior.
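The two lift formulas above can be sketched in a few lines. The text’s formula is read here as (social views ÷ seed views) + 1, matching BuzzFeed’s published description; the traffic numbers are made up:

```python
# Sketch of BuzzFeed-style lift scores. The formulas mirror the ones in the
# text; the example traffic figures are invented.

def social_lift(social_views, seed_views):
    """((Social Views) / (Seed Views)) + 1 -- for platform publishers."""
    return social_views / seed_views + 1

def viral_potential(social_shares, pageviews):
    """((social shares) / (pageviews)) + 1 -- the simpler site-level proxy."""
    return social_shares / pageviews + 1

print(social_lift(30000, 10000))    # 4.0
print(viral_potential(250, 10000))  # ~1.025
```

Ranking your posts by `viral_potential` (or the TTR-based variant) gives a quick shortlist of which content to back with paid social promotion.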

The bottom line

Alongside engagement metrics like “viral potential” and “TTR,” you’ll want to know how your content is performing against your bottom line. For most businesses, that’s the main reason why they’re creating content.

This isn’t always easy and a lot of people get this wrong by looking for a silver bullet that doesn’t exist. Every sales process is different, but let’s look at the typical process that we have at HubSpot for our free CRM product:

  1. Visitor comes through to our blog content from organic search.
  2. Visitor clicks on a CTA within the blog post.
  3. Visitor downloads a gated offer in exchange for their email address and other data.
  4. Prospect goes into a nurturing workflow.
  5. Prospect goes through to a BOFU landing page and signs up to the CRM.
  6. Registered user activates and invites in members of their team.

This is a simple process, but it can still be tricky sometimes to get a dollar value on each piece of content we produce. To do this, you’ve got to understand what the value of a visitor is, and this is done by working backwards through the process.

The first question to answer is, “what’s the lifetime value (LTV) of an activated user?” In other words, “how much will this customer spend in their lifetime with us?”

For e-commerce businesses, you should be able to get this information by analyzing historical sales data to understand the average order value that someone makes and multiply that by the average number of orders an individual will make with you in their lifetime.

For the purposes of this example, let’s say each of our activated CRM users has an LTV of $100. It’s now time to work backwards from that figure (all the below figures are theoretical)…

Question 1: “What’s the conversion rate of new CRM activations from our email workflow(s)?”

Answer 1: “5%”

Question 2: “What percentage of people download our gated offers after coming through to the blog content?”

Answer 2: “3%”

Knowing this would help me to start putting a monetary value against each visitor to the blog content, as well as each lead (someone that downloads a gated offer).

Let’s say we generate 500,000 visitors to our blog content each month. Using the average conversion rates from above, we’d convert 15,000 of those into email leads. From there we’d nurture 750 of them into activated CRM users. Multiply that by the LTV of a CRM user ($100) and we’ve got $75,000 (again, these figures are all just made up).
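The funnel arithmetic in this example can be checked with a short script (all figures are the theoretical ones above):

```python
# Working backwards through the funnel. All rates and values are the
# made-up figures from the example, not real HubSpot data.

monthly_visitors = 500_000
visitor_to_lead_pct = 3   # gated-offer download rate (%)
lead_to_user_pct = 5      # email workflow -> CRM activation rate (%)
user_ltv = 100            # lifetime value of an activated user, in dollars

leads = monthly_visitors * visitor_to_lead_pct // 100  # 15,000 email leads
activated_users = leads * lead_to_user_pct // 100      # 750 CRM users
monthly_value = activated_users * user_ltv             # $75,000

single_visitor_value = monthly_value / monthly_visitors  # $0.15
individual_lead_value = monthly_value / leads            # $5.00
print(single_visitor_value, individual_lead_value)       # 0.15 5.0
```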

Using this final figure of $75,000, we could work backwards to understand the value of a single visitor to our blog content:

$75,000 / 500,000 visitors = $0.15

Single Visitor Value: $0.15

We can do the same for email leads using the following calculation:

$75,000 / 15,000 leads = $5.00

Individual Lead Value: $5.00

Knowing these figures will help you be able to determine the bottom-line value of each of your pieces of content, as well as calculating a rough return on investment (ROI) figure.

Let’s say one of the blog posts we’re creating to encourage CRM signups generated 500 new email leads; we’d see a $2,500 return. We could then go and evaluate the cost of producing that blog post (let’s say it takes 6 hours at $100 per hour – $600) to calculate an ROI figure of roughly 317%.
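A quick check of that figure, using the same hypothetical numbers:

```python
# ROI for the hypothetical blog post: 500 leads at $5.00 each, against
# 6 hours of production time at $100/hour. All numbers are illustrative.

return_generated = 500 * 5.00   # $2,500 of lead value
cost = 6 * 100                  # $600 production cost
roi = (return_generated - cost) / cost * 100
print(round(roi, 1))            # 316.7
```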

ROI in its simplest form is calculated as:

ROI = ((Return − Investment) / Investment) × 100
You don’t necessarily need to follow these figures religiously when it comes to content performance on a broader level, especially when you consider that some content just doesn’t have the primary goal of lead generation. That said, for the content that does have this goal, it makes sense to pay attention to this.

The link between engagement and ROI

So far I’ve talked about two very different forms of measurement:

  1. Engagement
  2. Return on investment

What you’ll want to avoid is actually thinking about these as isolated variables. Return on investment metrics (for example, lead conversion rate) are heavily influenced by engagement metrics, such as TTR.

The key is to understand exactly which engagement metrics have the greatest impact on your ROI. This way you can use engagement metrics to form the basis of your optimization tests in order to make the biggest impact on your bottom line.

Let’s take the following scenario that I faced within my own blog as an example…

The average length of the content across my website is around 5,000 words. Some of my content way surpasses 10,000 words in length, taking an estimated hour to read (my recent SEO tips guide is a perfect example of this). As a result, the bounce rate on my content is quite high, especially from mobile visitors.

Keeping people engaged within a 10,000-word article when they haven’t got a lot of time on their hands is a challenge. Needless to say, it makes it even more difficult to ensure my CTAs (aimed at newsletter subscriptions) stand out.

From some testing, I found that adding my CTAs closer to the top of my content was helping to improve conversion rates. The main issue I needed to tackle was how to keep people on the page for longer, even when they’re in a hurry.

To do this, I worked on the following solution: give visitors a concise summary of the blog post that takes under 30 seconds to read. Once they’ve read this, show them a CTA that will give them something to read in more detail in their own time.

All this involved was the addition of a “Summary” button at the top of my blog post that, when clicked, hides the content and displays a short summary with a custom CTA.

Showing Custom Summaries

This has not only helped to reduce the number of people bouncing from my long-form content, but it also increased the number of subscribers generated from my content whilst improving user experience at the same time (which is pretty rare).

I thought that many of you might find this a useful feature on your own websites, so I’ve packaged it up as a free WordPress plugin that you can download here.

Final thoughts

The above is just one example of how to impact the ROI of your content by improving engagement. My advice is to get a robust measurement process in place so that you’re able first to identify opportunities, then run experiments to take advantage of them.

More than anything, I’d recommend that you take a step back and re-evaluate the way that you’re measuring your content campaigns to see if what you’re doing really aligns with the fundamental goals of your business. You can invest in endless tools that help you measure things better, but if the core metrics you’re looking at are wrong, then it’s all for nothing.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from The Moz Blog

Beyond local SEO: Greg Gifford on how to win the visibility race

Last Friday at a packed-out Brighton SEO conference, expert local search consultant Greg Gifford delivered a fast and furious presentation on the secrets of local marketing visibility: it’s not just about local SEO.

In a slide-show brimming with references to classic car movies, Greg Gifford raced through a host of tips and tricks that can massively improve your business’s local visibility and let you storm ahead of the competition.

The days of “just having a website” and trying to make it rank near the top are over; it takes more than just SEO to market to local customers. That’s not to say that local SEO isn’t important, of course, but it shouldn’t be your only consideration.

A still from the 1965 film 'Thunderball' showing two black cars on the road, one of which is exploding. The text in white reads, "But it takes more than just Local SEO to successfully market to local customers."

By thinking about local SEO as just one part of a wider visibility strategy, you can ensure that your business has a presence across multiple channels, not just in search. That’s better for your customers as well as for you.

In his talk, Gifford gave a run-down of other key areas to pay attention to, and how to optimise each one to target exactly the local audience you want to attract.

But first, because search is still hugely important for a local business, here are some handy tricks that Greg Gifford shared which will help you get ahead in the SEO game.

Quick tips for local SEO

Future-proof your SEO tactics

You only need to look at pizza delivery for an example of how much weight local visibility now carries in search, more than ever before. If you Google “pizza delivery”, even without specifying a location, Google will serve you local results – whether you asked for them or not.

Any algorithm changes that Google makes to its local search results have the potential to shake things up hugely, and the businesses who adapt fastest are often those who end up on top.

Google’s ‘Pigeon’ update in 2014 was a massive game-changer for local SEO, and mobile-oriented ranking changes like Mobilegeddon can have a huge impact on local given that 94% of mobile searches have local intent.

A slide from Greg Gifford's presentation featuring a still from 'Moonraker' with a woman driving a white car. The text reads, "Y'all are lucky... you typically get a preview of major Google updates."

But if you live outside the US, you’re in luck (for once!): Google algorithm updates always hit the US before they roll out anywhere else, so by keeping an eye on what’s happening in the States, you can ‘future-proof’ your SEO tactics and know exactly what to do by the time the update comes to you.

And even if you live in the US, there’s still a way you can get ahead: Gifford recommends keeping an eye on the annual Local Search Ranking Factors research, an extensive survey conducted across SEO experts analysing the changes in ranking factors they have observed over the past year. This will give you the low-down on what changes search experts sense in the winds and how they recommend dealing with them.

Make your blog a local destination

Maintaining a blog is still an excellent content and SEO strategy, giving businesses a platform to publish regular, fresh and insightful content and build a relationship with their visitors.

But in the words of Greg Gifford, “Don’t just market your shit!” Make your blog a local destination; share all sorts of things that people want to read.

A slide featuring a still from the 1989 film 'License to Kill' with the words "A few important tips" in orange and then in white, "Make your blog a local destination". Below it is a URL to visit for ideas for local blog posts:

Visitors will be turned off by a blog that is clearly just another mouthpiece for the company to promote its products. By thoughtfully curating all sorts of valuable local content, you can turn your blog into a go-to destination, boost its visibility and build relationships and links with other local blogs and businesses.

And speaking of local businesses…

Get those local business links!

When it comes to inbound links to your website, businesses will fight tooth and nail to try and get links from sites with the most domain authority. But Greg Gifford’s tip is one that many businesses wouldn’t even consider: go after “crappy little church websites”. You know the ones, with Microsoft Word clipart and neon green Comic Sans font in the header.

These kinds of tiny hyper-local websites have a huge amount of local relevance, and so their links carry a lot of weight. Best of all, none of your competitors will be going after them, so you can snap them up and enjoy the boost.

A photograph of a village church with a tower and a spire, underneath a blue sky and surrounded by trees and gravestones.

Increase your local SEO with inbound links from highly hyperlocal websites – even if they aren’t always of the best quality. Photo by Lincolnian, made available via CC BY-SA 2.0

Build ‘local silos’ to show up in nearby cities

If you want your site to show up in search for a city you’re not located in, Gifford recommends building what he calls ‘local silos’ targeted at nearby cities.

A silo is the name given to a system or sector that operates in isolation from others. You’ve probably heard a lot about how we should be breaking down silos, but in this case, they can work to your advantage.

To build ‘local silos’, create little self-contained zones of information within your site that are based around the city or neighbourhood you want to target and optimise the heck out of them.

Publish blog posts about that city, get links back from local businesses, and make sure they point to pages within the silo; and before too long, you’ll see your silos start to rank in local searches for that area.

A photograph showing a row of grey silos against a blue sky, with hay bales piled at the foot of each.

Building silos can be a good thing, when it comes to local SEO. Photograph by Doc Searls, made available via CC BY 2.0.

Track your Google My Business clicks

In the midst of all the cool local SEO hacks, it pays to remember the basics, like making sure your business is listed on Google My Business and your profile is complete.

Gifford also advises adding tracking parameters to your Google My Business links in order to monitor the traffic coming to your site via that page. Local SEO Guide has a good guide on how to do this with Google’s URL builder tool.
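As an illustration of the tagging step, here is a minimal sketch that appends UTM parameters to a Google My Business website link. The parameter values and URL are examples only; Google’s Campaign URL Builder does the same thing interactively:

```python
# Hedged sketch: tagging a Google My Business website link with UTM
# parameters so that GMB traffic is attributable in Google Analytics.
# The URL and parameter values below are placeholders, not a convention
# Google mandates; pick your own naming scheme and keep it consistent.
from urllib.parse import urlencode, urlparse, urlunparse

def add_utm(url, source="google", medium="organic", campaign="gmb-listing"):
    """Append utm_source/utm_medium/utm_campaign to a URL's query string."""
    parts = urlparse(url)
    params = urlencode({"utm_source": source,
                        "utm_medium": medium,
                        "utm_campaign": campaign})
    query = f"{parts.query}&{params}" if parts.query else params
    return urlunparse(parts._replace(query=query))

print(add_utm("https://www.example.com/"))
# https://www.example.com/?utm_source=google&utm_medium=organic&utm_campaign=gmb-listing
```

Paste the tagged URL into the website field of your Google My Business profile, and the visits will show up under that campaign in your analytics.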

It’s also worth keeping an eye on developments with Google Posts, which Google seems to be prepping as a significant platform for business promotion, and which could possibly be the next major development in local search if Google rolls it out on a larger scale.

How to optimise your email marketing

So we’ve covered the ‘local SEO’ part of local visibility; how about the ‘beyond’? Greg Gifford’s first tip might seem a little old-school, but it’s still one of the most effective marketing tools at your disposal: email marketing.

Gifford advises making sure that you’re using a responsive email design. Brands who have fully embraced responsive emails see 55% more Clicks to Open (CTO) from mobile and 23% more from desktop compared to brands who haven’t, according to research by Yesmail.

A still from 'Fast & Furious' with Email Marketing at the top in orange text, and then in white, "Adding a video to an email results in up to a 300% increase in CTR." Below is a link to read more about videos in email:

Adding video to your emails can increase their attractiveness and interactivity, and a survey conducted by Forrester Marketing Group found that including a video in email marketing increased Click-Through Rate by 200 to 300%.

That statistic is from 2010, but more recent statistics have shown that just including the word “video” in an email subject line can raise open rates by 19% and Click-Through Rates by 65%.

Having a carefully curated list of email addresses to target can also come in handy when using Facebook Ads, as we’ll see later on.

Go beyond YouTube

If you’re going to commit to using video in your marketing strategy, Gifford has one key recommendation for you: don’t use YouTube. Instead, the video hosting service that Gifford recommends for keeping control of your content and tracking the important metrics is Wistia.

A still from the 1970 film 'Five Easy Pieces' showing a man who appears to be having an argument with a dog through a car window. The header at the top reads, "Videos" in orange. Below it, "Use Wistia to host your videos" is written in white. At the bottom it reads, "The ONLY choice for video hosting:"

Here are some of the reasons he lists for opting for Wistia instead of YouTube or another major video host:

  • Wistia provides detailed video analytics, including engagement statistics, trend graphs, and individual viewer ‘heatmaps’ which show the parts of a video each user watched, skipped and rewatched.
  • You can tie user information to email addresses, and also use Wistia’s ‘Turnstile’ tool to add a form that requires users to input their email address at any point before, after or during the video.
  • Wistia allows you to give your videos a custom play button, which according to Wistia’s former Director of Growth and Acquisition Casey Henry can increase your play rate by 19%.
  • Similarly, you can also add a custom thumbnail to your videos, which can potentially boost your play rate by as much as 35% (and often winds up looking much nicer).
  • Wistia embeds play directly in the Facebook news feed, and on Twitter they expand into a Twitter card that also allows users to play the video in-stream.

Facebook ads

“Facebook ads used to be the drunk guy that showed up late to the party; now, they’re the cool guy that everyone’s stoked to see,” says Gifford. And if you’re looking to gain local visibility, Facebook ads have a lot of valuable advantages.

  • Facebook’s demographic targeting is incredibly diverse and exact, allowing you to target users based on location, age, gender, interests, and some mind-bogglingly specific parameters like education level, device and mobile connection.
  • You can also load in email lists and use them to create a custom audience of Facebook users who have accounts registered with those addresses.
  • Facebook also allows you to create Lookalike Audiences which will target new groups of people who are similar to audiences you’re already targeting.
  • Facebook’s ‘local awareness ads’ are an incredibly powerful local advertising tool. Google research on local search showed that roughly 70% of users want ads customised to their city or zip code, and between 60 and 70% want ads customised to their immediate surroundings.

Facebook’s local awareness ads allow you to drop a pin on any point on a map, and ads will be shown on mobile devices within a certain radius of that point. Try dropping a map pin on your competitors, on an event or at a conference!

A still from the 1989 film 'Back to the Future Part II', showing a flying DeLorean in the rain, with the text "Facebook awareness ads are awesome!" across it.

Use Beacons

Gifford’s final hot tip for local visibility is to use Beacons. Beacons are “small, Bluetooth-enabled hardware devices that can be installed in physical locations, like retail stores. They silently broadcast a message to any Bluetooth-enabled devices in their proximity, kind of like a lighthouse with text”, as Dan Cristo writes.

Usually Beacons require a dedicated app to work, but Beacon providers have begun setting up app networks which will allow Beacons to pop up a message on someone’s phone as long as any app in the network is running.

A still from the 2014 film 'The Wolf of Wall Street' showing a man lying on the ground and clinging on to a car with one hand. The text in white reads, "We had a beacon in our booth at a recent conference, and tagged 687 unique users!"

The apps are location-aware, so phones can be tagged by the Beacon even if the app isn’t actively open at the time. You can then connect directly to Facebook’s API, allowing you to retarget ads at actual foot traffic.

DealerOn, Gifford’s marketing and advertising company, ran some tests with Beacons and found they led to anywhere from a 34.6% to a 45.7% increase in Click-Through Rate on ads. At a recent conference, the Beacon at DealerOn’s booth tagged 687 unique users.

Beacons are still an emerging technology, but they have the power to improve the customer experience and potentially revolutionise search – especially in a hyperlocal context. So watch out for opportunities and get creative with how you use them.

from Search Engine Watch

The convergence of SEO and UI goals for mobile users

One year after Google put an algorithmic premium on mobile experience, the so-called “Mobilegeddon,” Google is at it again. New tools are coming in late spring to help webmasters make their websites work better on mobile devices.

Mobilegeddon was the consequence of businesses not making their websites easier to use on smartphones and other mobile devices. Google’s updates incentivised webmasters to act by ranking mobile-friendly websites higher in search results.


However, there was fallout: websites that weren’t optimized for mobile saw as much as a 12% plunge in traffic, according to an Adobe study in July 2015.

Small businesses in particular suffered, because only a third of them had a mobile-optimized website, according to eMarketer.

Google must focus on mobile because more than half of all Google searches are from smartphones and mobile devices. If consumers do not have a good experience when they click on a mobile SERP link, they will likely leave the page (and Google).

Specifically, a Google survey from late 2015 reported that sites lose 29% of smartphone users when they fail to satisfy their needs (through a lack of information or slow load times).

Furthermore, studies show that even a one-second increase in load time can lead to an 11% decrease in page views, a 7% decrease in conversions and a 16% decrease in satisfaction.

Mobile-optimized sites are now central to a satisfying web experience.

So as the trend towards mobile continues, there are two reasons why businesses want to optimize their websites for mobile devices: superior customer experience and search engine ranking – and in fact, both are interconnected.

When Google rolled out Panda in 2011, it forever shifted its algorithmic signals to include both relevance and quality. Without a superior customer experience (read: good UI), Google will not give a website a shot at a top SERP ranking; and without a high ranking, a site won’t see increased user engagement.

But mobile information architecture (IA) and user interface (UI) design is hard. Because of the mobility, small screen size and cumbersome keyboard entry of mobile devices, consumers interact with them in an entirely different way than with laptops or desktops, and in a different context.

Mobile users do not have patience for anything that’s short of being intuitive, relevant and fast.

google mobile

Making your web page play nice across multiple devices, which has come to be labeled “responsive web design”, means that you can use one URL that adjusts to whatever device it’s being viewed on.

Many websites still require horizontal scrolling or zooming on mobile; layouts should be viewable without these shortcomings to the user experience. Responsive pages that can be viewed with ease on a desktop PC, tablet or smartphone are critical to making your webpage stickier for mobile users, and they save the cost of developing separate sites for multiple devices.

responsive design

Google is not, however, throwing businesses under the bus. The new tools will allow businesses to enter their website URL to get a diagnostic check explaining why their website isn’t mobile-friendly, as well as how to improve it. If you weren’t proactive in ensuring your site is mobile-friendly, the time to act is now (if not already past): stop everything and call a mobile-savvy partner to make it so.

The good news is that brands that invest in mobile optimization and mobile strategies will crush the competition. If there is one unifying theme across all demographics, millennials in particular, it is the pervasive use of mobile devices to research and shop.

In fact, 93% of people who use a mobile device for research make a purchase and according to internal ConsumerAffairs user data, mobile visitors purchase within 2-4 hours compared to 1-3 days for desktop visitors.

Any user interface should be developed considering the way users engage with their mobile devices. Hold it in your hand, and walk through some of the sites you frequent on mobile. Ask yourself these questions:

  • How long does it take the site to load?
  • How do your hands engage with the site?
  • Do you need to scroll far to get info?
  • Do you have trouble navigating to the point of a form fill or purchase?
  • Is it easy to click and call the business?
  • Where are menu items placed on the mobile UI, and where does your hand align with those menu items and calls to action?

Evaluate your customer journey pathways and imagine what your UI could do to give a smoother experience to mobile users.

To make sure you are converting web visits to sales, a mobile-friendly website must have text that is large enough to read without constant pinching and zooming; tasks need to be simplified; and links need to be clearly visible and spaced far enough apart that errant clicks don’t occur.

For example, we love the way Flipboard’s user interface aligns with their business objectives. Flipboard gives people a single place to follow all of their interests and then save or share stories, images, and videos into their own Flipboard magazines. To that end, their mobile site is fast, simple and intuitive. The reward? 82 million monthly readers.


To compete for mobile customers and higher search engine rankings, marketers must become mobile UI architects to design a more intuitive and relevant customer experience. Mobile is no longer the alternate platform, it is now the platform of choice.

Mobile devices are the new norm. We do everything on them, from making dinner reservations and finding a date to managing our bank accounts and hailing a ride. Mobile has changed the very foundation of how consumers communicate, connect, and discover online.

Not surprisingly, consumers expect brands to provide a superior mobile experience.

Zac Carman is CEO of ConsumerAffairs and contributor to Search Engine Watch.

from Search Engine Watch
via Auto Feed

How to optimise your page images to increase site speed

Google’s official slogan is “Don’t Be Evil”, but it’s long been rumoured that the company has a second, internal motto that they tend to keep under wraps:

“You’re either fast, or you’re f***ed.”

We’ve written about site-speed in the past, and there’s no doubt of its importance (if there is, stick around for the stats section of this post) but for content marketers, improving the speed of your website is often seen as a particularly arduous technical exercise that’s completely out of your control. Only a back-end full-stack engineer can speed things up significantly, right?

As it turns out, nothing could be further from the truth, as Tom Bennet from Builtvisible explained in his excellent recent talk at BrightonSEO. Here, I’ll run through some key points Tom addressed to show how and why you should concentrate on delivering a lightning-fast experience to users.

Why is site speed important?

Now, I mentioned stats didn’t I?

According to the official Google webmaster blog, site speed matters. Google itself spends an awful lot of time checking whether or not your site is keeping up with your competitors. If you are slower, then your place in the search results will suffer.

But that’s not the only important factor here. Site speed improves the overall User Experience. As a case in point, Tom mentioned this extraordinary stat from Firefox:


When Firefox reduced average page load time by 2.2 seconds, Firefox downloads increased by 15.4%. That equates to more than 10 million extra downloads per year.

Once you hear figures like that, the value starts to become clear. Tom also took time to quote Steve Souders, a pioneer of much modern web performance work:


So, we know we can do something about it. But where to concentrate our efforts?

What can we do about it?

To illustrate, Tom built a simple, fairly standard content page using Bootstrap and jQuery. The content marketing industry churns out thousands of these every day, so it should be fairly relevant:


Next, we fire up the page and measure it using a combination of Yahoo’s YSlow and Google PageSpeed rulesets. Here are the initial results:


That F grade is going to seriously hurt our credibility in Google’s eyes, and 3.9 seconds is going to seem like a grind for users. If you don’t believe me, count slowly to four. Would you be willing to wait that long for every page on a site to open?

But where should marketers focus their efforts to have the most impact?

On a typical page like this, images are by far the largest and most common element, so this is where we should be concentrating to start with.


Now, this isn’t just a case of opening up your images in Photoshop and making them smaller. Resolution does matter (we still want our pages to look beautiful), but only up to a certain point, so the first step is to check our image sizes:


As you can see from the page element, this image has been uploaded at 1024 x 683 pixels, but the user will only ever see it at a maximum of 420 x 289, less than half the upload size.

As always, it’s important to consider the User Experience, so let’s ask ourselves a few questions:

  • What formats should we be using for images? PNGs are great for images with fewer colours or transparency, while JPEGs are perfect for photos.
  • Dimensions: what is the maximum width and height at which the image will be displayed?
  • Finally, do you really need all of those images?

If you have text within an image, get rid of it and use an actual font instead, and use vector graphics or CSS for things like logos or shading on the page. As Tom put it:

 “The fastest HTTP request is the one not made.”

Google has a range of guidelines and advice on this available which you should check out.
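To make the resizing step concrete, here’s a minimal sketch in Python. It’s a hypothetical helper (not from Tom’s talk) that computes the largest dimensions fitting a display slot while preserving aspect ratio; an image library such as Pillow could then do the actual resample.

```python
def fit_within(src_w, src_h, max_w, max_h):
    """Largest (width, height) that fits inside (max_w, max_h)
    while preserving the source aspect ratio."""
    scale = min(max_w / src_w, max_h / src_h, 1.0)  # 1.0 cap: never upscale
    return round(src_w * scale), round(src_h * scale)

# The example image above: uploaded at 1024 x 683,
# but only ever displayed at up to 420 x 289.
print(fit_within(1024, 683, 420, 289))  # -> (420, 280)
```

Resaving the image at the computed size (with a sensible quality setting) is typically where the bulk of the byte savings comes from.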

So, Tom resized, reformatted or replaced his images. How did this affect the overall site speed?


Being diligent with images was enough to shave a whopping 1.2 seconds – or 30% – off the total page load time.

It’s still not rocket speed at this point, but it’s much, much better. Tom detailed several other useful tips during his presentation, which I will try to cover in the future as well, but for now – time to tighten up those images.

from Search Engine Watch
via Auto Feed

Understanding intent through voice search

It’s search, Jim, but not as we know it.

The dream of an ultimate personal assistant isn’t a far-fetched sci-fi fantasy like the interactive computing systems in Star Trek. It’s technology available today, already being applied to search engines.


Leading visionaries in search technology, from Google’s Behshad Behzadi in his keynote speech at SMX West to Satya Nadella at Microsoft’s Build conference, are articulating a vision of smarter and more capable personalized help that will drive efficiency, focus and, ultimately, happiness.

Nadella believes the next big bet for Microsoft is “conversation as a platform”: a more intuitive and accessible canvas integrating into apps, as well as artificial intelligence (A.I.) and bots that can interact with other bots. While the devices and technology used to access search are evolving, search will still be an increasingly integral part of everyday life.

The evolution of A.I. through voice search

Today’s digital assistants like Microsoft’s Cortana, Apple’s Siri and Google Now are voice-search enabled and growing smarter with every interaction. According to comScore, 50% of all searches will be voice searches by 2020.

Since voice search is more conversational and uses natural language, the A.I. is evolving to understand user intent and context based on previous search queries, multi-step queries and user behavior.


Words can provide invaluable substance to A.I. technology during the search process. For marketers, the longer query strings from voice search as compared to text provide richer user intent data. While a text query would typically be one to three words, a spoken query is often three or more.

For example, on my desktop I would search for “blue t-shirt.” But when it comes to a voice query, I might ask, “Hey Cortana, where can I find a cool blue t-shirt?” The conversational tone provides a signal of intent to purchase, style preference and, if I’ve granted access to my location, desired shopping locations. It permits marketers to:

  • Build user-intent models to understand where the user is in the customer journey.
  • Match advertising campaigns (messaging and landing pages) to the right stage of user intent.
  • Develop site content with a conversational tone, providing specific answers to users’ needs and top questions. Voice searchers are looking for quick answers. Content answering specific questions will make your site a go-to resource.
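As a toy illustration of the first bullet, the sketch below pulls crude intent signals out of a voice query with simple phrase matching. The cue lists are invented for this example; a real user-intent model would be trained on query data rather than hand-written rules.

```python
# Hypothetical cue lists, for illustration only.
PURCHASE_CUES = ("where can i buy", "where can i find", "how much is")
PREFERENCE_CUES = ("cool", "cheap", "best")

def extract_intent(query):
    """Return naive intent signals from a conversational query."""
    q = query.lower()
    return {
        "purchase_intent": any(cue in q for cue in PURCHASE_CUES),
        "preferences": [cue for cue in PREFERENCE_CUES if cue in q],
        "word_count": len(q.split()),
    }

signals = extract_intent("Hey Cortana, where can I find a cool blue t-shirt?")
# A voice query like this yields purchase intent, a style preference
# ("cool") and a 10-word query string rather than a 2-3 word one.
```

Even this crude pass shows why the longer, conversational query carries richer intent data than the terse typed version.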

AI, the ‘Added Ingredient’ for enhanced consumer experience and engagement

Technology giants like Microsoft, IBM and Google are focusing on new ways machine-based learning, A.I. and bots can analyze data. Personal assistants like Cortana, powered by Bing search intelligence, can request permission to gather data from email accounts, calendars, social networks, geo-locations and mobile apps to start learning about behaviors and preferences.

The A.I. engine analyzes the information to make recommendations before users even have a chance to ask a question. The more interactions users have with their assistant, the more accurate the predictive models become – and the more its serendipitous proposals will delight us and make life easier.

A screenshot of a conversation with Siri in which the user (our editor Christopher Ratcliff) tells Siri "Open the pod bay doors HAL", a reference to the film 2001: A Space Odyssey. Siri wearily replies, "Oh, not again."

For many, the end of the day reads like a frustrating laundry list of stuff that still needs to get done – including the laundry! Efficiency is now one of the keys to happiness, and technology gives us back time to be in the moment.

If I give my personal assistant access to my calendars and the locations that are important to me, it can send reminders when I need to leave to make it on time to my next appointment. The predictive component of the A.I. can monitor traffic and figure out if I need to leave work now to pick my son up from day care because of a freeway accident. It can deep-link into apps, such as Waze, and suggest the best alternative routes based on current road conditions.

Soon, this intelligence will be shared across A.I. bots, and tasks such as renewing my driver’s license will be done on my behalf, saving me time.

For marketers, it’s important to understand and adapt to this new technology to build immersive customer experiences. As deep-linking and intelligent agents are integrated into apps and products, consumer engagement with brands will reach the next evolution. This means there is more potential than ever to influence the path to purchase in the customer journey.


While “Beam me up Scotty” and journeys to the final frontier are not yet a reality for most of us, the capabilities and technology for building the ultimate digital assistant are almost here.

This new “other” way to get things done will make it more appealing for consumers to share personal data so that assistants can become more predictive and take actions on our behalf.

We’ll continue to use search, websites, and apps. But how we interact with them will provide more intent and context for A.I.s and bots to help us get things done in our daily lives. This way we can focus and be fully present in the moments that matter the most.

Steve Sirich is GM Marketing, Bing Ads, Microsoft and a contributor to Search Engine Watch. 

For lots more information, download our Marketer’s Guide to Artificial Intelligence report, which takes a look at how AI can be used for marketing, now and in the future.

from Search Engine Watch
via Auto Feed

What are featured snippets and how do I get them?

Rob Bucci, the CEO of STAT, delivered a fascinating talk at BrightonSEO last week about the mystery of featured snippets, using his observations after analysing one million queries.

Here’s a round-up of the talk, featuring Rob’s advice on why featured snippets are important and how to increase your chances of obtaining them.

What is a featured snippet?

A featured snippet is a summary of an answer to a user’s query, which is displayed on top of Google search results. It’s extracted from a webpage, and includes the page’s title and URL.

Screen Shot 2016-04-25 at 10.49.35

There are three types of snippets, depending on the query:

  • Paragraph
  • List
  • Table

According to Rob Bucci and STAT, paragraph snippets are the most common, accounting for 82% of featured snippets, with list snippets appearing in 10.8% of results and table snippets in 7.3%.

Screen Shot 2016-04-25 at 10.38.27

Why are featured snippets so important?

Featured snippets offer various benefits for any site that can use them effectively.

1. Maximum authority

By obtaining a featured snippet you prove that Google chose your page over others as the most useful one to users’ relevant queries.

2. Beating the competition

When Google chooses your site to be the quick answer to a specific question, the result is displayed above the organic results, which means that you beat the competition, including the site that ranks #1 for that particular search.

3. Increase of traffic

Users like featured snippets as they provide quick answers to their questions and this benefits the chosen site with an increase in traffic, which could be upwards of 20-30%.

Screen Shot 2016-04-25 at 12.35.54

How to earn a featured snippet

During his talk Rob Bucci offered practical advice regarding featured snippets and how to get them. Here are his basic steps that can bring you closer…

1. Analyse keyword opportunities

Use the right tools to start searching for keywords to target. Find the right keyword opportunity that could be ideal for your site.

2. Create new strategic content targeted at snippets

It’s a good idea to create new content while keeping featured snippets in mind, but it’s important that it doesn’t result in unnatural content. Always take into consideration user experience and use ideas that make sense to your vertical.

Screen Shot 2016-04-25 at 10.46.42

3. Bring in Q&A formatting

Devote a complete page to a single question, if possible, and find a way to incorporate FAQ into content.

4. Make it easier for Google with subheadings, lists, tables, etc

Screen Shot 2016-04-25 at 12.45.04

Help Google discover your content with basic on-page optimisation techniques.

5. Polish existing snippets for higher CTR

If you have existing snippets, then evaluate and edit them from time to time to ensure constant traffic back to your site.

Featured snippets in numbers

STAT analysed one million high-CPC queries for its latest study in order to take a deep dive into featured snippets. Here are the most interesting stats to consider:

  • Out of the one million queries that STAT analysed, 9.28% of them contained snippets
  • More than 70% of the featured snippets didn’t come from the very first organic result

Screen Shot 2016-04-25 at 10.38.49

  • Featured snippets appear with an image 27.58% of the time.

Screen Shot 2016-04-25 at 12.36.49

  • Keywords with high search volume show featured snippets twice as often.

Screen Shot 2016-04-25 at 10.42.49

  • Higher query word counts result in featured snippets more often.

Screen Shot 2016-04-25 at 10.43.12

  • Featured snippet URLs score slightly better on readability tests
  • Featured snippets had a 12.5% higher than average social share count (by examining Facebook, LinkedIn, and Pinterest)

Screen Shot 2016-04-25 at 10.45.14
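The readability point above refers to formulas such as Flesch reading ease (the study doesn’t specify which test STAT used). Here’s a rough sketch, with an intentionally crude syllable heuristic:

```python
import re

def crude_syllables(word):
    # Very rough heuristic: count vowel groups (not linguistically exact).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch reading ease: higher scores mean easier-to-read text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(crude_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

print(flesch_reading_ease("Featured snippets answer a question quickly. Short sentences help."))
```

Shorter sentences and shorter words push the score up, which matches the general advice here: snippet-friendly content is direct and plainly worded.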


You don’t have to come first in search to increase your authority and drive traffic to your site, provided that you start creating strategic content that may lead to well-earned featured snippets.

from Search Engine Watch
via Auto Feed

Five useful ways to create content for service and product pages

Content development knows no bounds!

As you’re likely aware, every bit of your website or blog needs content development at one point or another. However, I find that many people get stuck when it comes to developing content for their business’s service and product pages.

It’s easy to get creative when writing a blog or creating a guest blogging strategy, but product and service pages can scare people off. We end up seeing pages full of boring, traditional text because the content on those pages may have been deemed “unimportant.”

It’s true that this is a place where you might need to be more creative, and you need to invent your own ways and ideas to produce content. But the good news is that if you do, product pages have an even better chance of ranking highly and getting more traffic as landing pages.

So, if you need some help figuring out how to invent new ways to create content for these pages, check out some of the ideas below.

Ask: what do people want to know about our products and services?

When you are stuck trying to develop content, a good starting place is to ask what it is exactly that your audience wants to know. After all, your products and services should act as a solution to their needs. So do not leave your products and services as stand-alone documents; answer the questions that your customers are asking.

A really great way to do this is to have a creative Q&A section. You can either develop these questions (and answers) yourself based on what people ask you and your sales team the most, or you can turn to social media and emails where people have explicitly asked product- and service-related questions.

In any case, developing content around this notion of product Q&A will not only show your ability to anticipate the needs of your customers, but also give you an interface that you can constantly update, change, and revise for fresh content.

Below is an example from QA Flooring Underlay Accessories:

product page

In the example above you can see that under the various product and services tabs they have designed a Q&A section for customers to review. This is a great way to give them the “so-what” and the “why” explanations about your product.

An interactive sales person: be conversational

There are so many ways that you can be innovative in presenting these pages as a space for an interactive sales experience.

According to CopyPress, your website should act as an online salesperson – not just offering descriptions of products, but focusing on sales and giving your audience plenty of reasons to buy from your site.

This idea is most important when developing summaries for each product or service. You definitely need to have one, but you have to do it right.

Your description really should be more of a “brief summary.” While it doesn’t need to be lengthy, it does need to provide some explanation as to why the customer should choose to buy it from you.

Descriptions of the product itself are not enough. Take this as an opportunity to provide some information to your client so that they are not left with questions hanging over their heads before purchase. Here are some things your summaries should cover:

  • Why the product or service is useful
  • Why they have a need for your product (that they hadn’t considered before)
  • Why your company is the best option

Remember that you don’t want to overwhelm them with information – being concise and integrating this kind of content carefully is key. Use first-person language to help answer the questions above, but keep out all of the fluff.

Keep your content organized and then move on to other elements of the page.

Create a sense of urgency

Creating a sense of urgency with content on a product/service page is a balancing act. On the one hand, you don’t want a buyer to think that a product they fall in love with is going to disappear out of the blue before they can purchase; on the other hand, you don’t want them to put off purchasing your product or signing up for the service.

One of the ways you can create a sense of urgency is by making sure that you utilise content on these pages to send that message.

Develop content that is going to get visitors hooked and make them feel the need to buy or sign up ASAP. There are a variety of ways to do this: promotions, monthly offers, showcased products, all of which involve creatively developing content that can be updated on a regular basis!

The example below from Shoedazzle shows how they used a monthly offer to help keep visitors hooked with their content:

shoedazzle ad

Make content unique and capture attention

Since this is an opportunity for creative freedom and being innovative, push the limits with uniqueness.

Here are some ways you can make content more unique on these pages and also increase conversions:

  • Pay attention to aesthetics like layout, style, and color choices
  • Find ways to use appropriate images and video
  • Incorporate your brand and mission as often as possible
  • Utilize customer reviews to make a claim for success stories and satisfaction
  • Make customer reviews visual
  • Link to other related content on your site
  • Add a “suggested” section based on demographics and user interest

All of these points are becoming more commonplace on product and service pages, but they’re still great ways to offer your audience something more.

Take Just Fab for example. Their layout has different tabs you can click with different information, one tab being reviews. The layout is easy to understand and covers everything a customer would want to know without being overbearing:

just fab product page

Continue to brainstorm for new approaches

When you are trying to be innovative in developing your product and sales pages, it is always your best bet to turn to the people that know the product best.

Ask your employees which aspects of your products they have to explain over and over again. Ask your regular customers what it is that they love about your product. Ask members of your local community why they choose to shop at your store (online or in-person) over others.

While there is not necessarily a one size fits all plan for creatively developing these pages, figuring out what your brand wants to say and how these pages need to tell your story boldly to new clients is one step of the equation.

Remember, doing this market research will be an opportunity for you to continuously develop and grow the content on these pages, and this in turn will help your SEO. Don’t miss out on the opportunity to make your service and product pages as appealing and interesting as they can be.

The takeaway

When you start with the question “What do people want to know about our products and services?”, you are making this effort customer-centered, and that is going to be apparent to those who land on your product pages.

Consider your site an interactive salesperson, where your job is to convince people through summaries (rather than boring old product descriptions) why your products are the ones they should choose.

You also need to create a sense of urgency, so that people feel just the right pressure to sign up today rather than tomorrow.

In the end, you can do these things by making content unique and attempting to capture the attention of your audience from the very beginning. There are a lot of ways to do this, but beginning with brainstorming is just one way you can get the ball rolling!

Do you have experience developing your product and services pages with fresh content? Let us know in the comments section below.

from Search Engine Watch
via Auto Feed

Can We Predict the Google Weather?

Posted by Dr-Pete

Four years ago, just weeks before the first Penguin update, the MozCast project started collecting its first real data. Detecting and interpreting Google algorithm updates has been both a far more difficult and far more rewarding challenge than I ever expected, and I’ve learned a lot along the way, but there’s one nagging question that I’ve never been able to answer with any satisfaction. Can we use past Google data to predict future updates?

Before any analysis, I’ve always been a fan of using my eyes. What does Google algorithm “weather” look like over a long time-period? Here’s a full year of MozCast temperatures:

Most of us know by now that Google isn’t a quiet machine that hums along until the occasional named update happens a few times a year. The algorithm is changing constantly and, even if it wasn’t, the web is changing constantly around it. Finding the signal in the noise is hard enough, but what does any peak or valley in this graph tell you about when the next peak might arrive? Very little, at first glance.

It’s worse than that, though

Even before we dive into the data, there’s a fundamental problem with trying to predict future algorithm updates. To understand it, let’s look at a different problem — predicting real-world weather. Predicting the weather in the real world is incredibly difficult and takes a massive amount of data to do well, but we know that weather follows a set of natural laws. Ultimately, no matter how complex the problem is, there is a chain of causality between today’s weather and tomorrow’s and a pattern in the chaos.

The Google algorithm is built by people, driven by human motivations and politics, and is only constrained by the rules of what’s technologically possible. Granted, Google won’t replace the entire SERP with a picture of a cheese sandwich tomorrow, but they can update the algorithm at any time, for any reason. There are no natural laws that link tomorrow’s algorithm to today’s. History can tell us about Google’s motivations and we can make reasonable predictions about the algorithm’s future, but those future algorithm updates are not necessarily bound to any pattern or schedule.

What do we actually know?

If we trust Google’s public statements, we know that there are a lot of algorithm updates. The fact that only a handful get named is part of why we built MozCast in the first place. Back in 2011, Eric Schmidt testified before Congress, and his written testimony included the following data:

To give you a sense of the scale of the changes that Google considers, in 2010 we conducted 13,311 precision evaluations to see whether proposed algorithm changes improved the quality of its search results, 8,157 side-by-side experiments where it presented two sets of search results to a panel of human testers and had the evaluators rank which set of results was better, and 2,800 click evaluations to see how a small sample of real-life Google users responded to the change. Ultimately, the process resulted in 516 changes that were determined to be useful to users based on the data and, therefore, were made to Google’s algorithm.

I’ve highlighted one phrase — “516 changes”. At a time when we believed Google made maybe a dozen updates per year, Schmidt revealed that it was closer to 10X/week. Now, we don’t know how Google defines “changes,” and many of these changes were undoubtedly small, but it’s clear that Google is constantly changing.

Google’s How Search Works page reveals that, in 2012, they made 665 “improvements” or “launches” based on an incredible 118,812 precision evaluations. In August of 2014, Amit Singhal stated on Google+ that they had made “more than 890 improvements to Google Search last year alone.” It’s unclear whether that referred to the preceding 12 months or calendar year 2013.

We don’t have a public number for the past couple of years, but it is incredibly unlikely that the rate of change has slowed. Google is making changes to search on the order of 2X/day.

Of course, anyone who has experience in software development realizes that Google didn’t evenly divide 890 improvements over the year and release one every 9 hours and 51 minutes. That would be impractical for many reasons. It’s very likely that releases are rolled out in chunks and are tied to some kind of internal process or schedule. That process or schedule may be irregular, but humans at Google have to approve, release, and audit every change.
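For the record, that figure is just the year divided evenly — a back-of-the-envelope check, not anything Google published:

```python
# 890 improvements spread evenly across a 365-day year.
hours_between = 365 * 24 / 890          # ~9.84 hours between releases
whole_hours = int(hours_between)
minutes = round((hours_between - whole_hours) * 60)
print(f"{whole_hours} hours {minutes} minutes")  # -> 9 hours 51 minutes
```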

In March of 2012, Google released a video of their weekly Search Quality meeting, which, at the time, they said occurred “almost every Thursday”. This video and other statements since reveal a systematic process within Google by which updates are reviewed and approved. It doesn’t take very advanced math to see that there are many more updates per year than there are weekly meetings.

Is there a weekly pattern?

Maybe we can’t predict the exact date of the next update, but is there any regularity to the pattern at all? Admittedly, it’s a bit hard to tell from the graph at the beginning of this post. Analyzing an irregular time series (where both the period between spikes and intensity of those spikes changes) takes some very hairy math, so I decided to start a little simpler.

I started by assuming that a regular pattern was present and looking for a way to remove some of the noise based on that assumption. The simplest analysis that yielded results involved taking a 3-day moving average and calculating the Mean Squared Error (MSE). In other words, for each day's temperature, take the mean of that day and the day on either side of it (a 3-day window) and square the difference between that day's temperature and the 3-day mean. This exaggerates stand-alone peaks and smooths some of the noisier sequences, resulting in the following graph:
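The calculation above can be sketched in a few lines (a minimal illustration; `temps` stands for the list of daily MozCast temperatures, a name assumed here):

```python
def spike_signal(temps):
    """Squared error of each day's temperature vs. its 3-day moving average.

    Exaggerates stand-alone peaks and damps noisy runs; the first and last
    days are skipped because they lack a full 3-day window.
    """
    signal = []
    for i in range(1, len(temps) - 1):
        window_mean = (temps[i - 1] + temps[i] + temps[i + 1]) / 3
        signal.append((temps[i] - window_mean) ** 2)
    return signal

# A flat series produces no signal; a lone spike stands out sharply.
flat = spike_signal([70, 70, 70, 70])
spiky = spike_signal([70, 70, 95, 70, 70])
```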

This post was inspired in part by February 2016, which showed an unusually high signal-to-noise ratio. So, let’s zoom in on just the last 90 days of the graph:

See peaks 2–6 (starting on January 21)? The space between them, respectively, is 6 days, 7 days, 7 days, and 8 days. Then, there’s a 2-week gap to the next, smaller spike (March 3) and another 8 days to the one after that. While this is hardly proof of a clear regular pattern, it’s hard to believe the weekly pacing is entirely a coincidence, given what we know about the algorithm update approval process.

This pattern is less clear in other months, and I’m not suggesting that a weekly update cycle is the whole picture. We know Google also does large data refreshes (including Penguin) and sometimes rolls updates out over multiple days (or even weeks). There’s a similar, although noisier, pattern in April 2015 (the first part of the 12-month MSE graph). It’s also interesting to note the activity levels around Christmas 2015:

Despite all of our conspiracy theories, there really did seem to be a 2015 Christmas lull in Google activity, lasting approximately 4 weeks, followed by a fairly large spike that may reflect some catch-up updates. Engineers go on vacation, too. Notice that the first January spike is followed by a roughly 2-week gap and then two 1-week gaps.

The most frequent day of the week for these spikes seems to be Wednesday, which is odd, if we believe there’s some connection to Google’s Thursday meetings. It’s possible that these approximately weekly cycles are related to naturally occurring mid-week search patterns, although we’d generally expect less pronounced peaks if change were related to something like mid-week traffic spikes or news volume.

Did we win Google yet?

I’ve written at length about why I think algorithm updates still matter, but, tactically speaking, I don’t believe we should try to plan our efforts around weekly updates. Many updates are very small, and even some that are large on average may not affect our employers or clients.

I view the Google weather as a bit like the unemployment rate. It’s interesting to know whether that rate is, say, 5% or 7%, but ultimately what matters to you is whether or not you have a job. Low or high unemployment is a useful economic indicator and may help you decide whether to risk finding a new job, but it doesn’t determine your fate. Likewise, measuring the temperature of the algorithm can teach us something about the system as a whole, but the temperature on any given day doesn’t decide your success or failure.

Ultimately, instead of trying to predict when an algorithm update will happen, we should focus on the motivations behind those updates and what they signal about Google’s intent. We don’t know exactly when the hammer will fall, but we can get out of the way in plenty of time if we’re paying attention.

Sign up for The Moz Top 10, a semimonthly mailer updating you on the top ten hottest pieces of SEO news, tips, and rad links uncovered by the Moz team. Think of it as your exclusive digest of stuff you don’t have time to hunt down but want to read!

from The Moz Blog

The dangers of Googlizing your site

By optimizing your website for Google, you could be sabotaging your site for Baidu in China and Yandex in Russia and Eastern Europe.

It is an undeniable fact that Google has the largest search market footprint in the world, so there is an endless supply of resources and information about how to optimize for Google. I understand how important it is for website owners to make sure that their sites are optimized and perform well in Google search results.

However, by optimizing a site solely based on Google’s algorithm changes and abilities, you may actually be de-optimizing a site for other search engines. This de-optimization could devastate your site’s performance in some critical markets.

If a market like China is important to your business, you need to ensure that the changes you are making to improve Google performance also work well for a search engine like Baidu.


For Russia and some eastern European countries, it needs to work for Yandex.

[Screenshot: Yandex search results]

There are many differences in SEO best practices among search engines, spanning everything from local regulations to domain choice and hosting location. Below are some of the updates that you may have implemented, or are planning to implement, that could ravage your organic search traffic from Baidu.

JavaScript and AJAX

Last year, Google confirmed it could crawl and index links and content within JavaScript and AJAX. This was of course great news to many website owners as it would help improve their site experiences.

Unfortunately, Baidu is not yet good at crawling and indexing content within JavaScript and AJAX. Using JavaScript for site navigation can kill traffic from Baidu immediately after a new site launch: in one example, the number of Chinese pages indexed by Baidu dropped from 86,300 to 174 shortly after relaunch.

In November last year, Yandex announced it would start to crawl JavaScript and AJAX, and warned site owners not to block their JavaScript and CSS files. However, just because Yandex can now crawl and index those links and content, doesn’t mean pages will start to show up higher in search results. Yandex measures the value of the pages based on not only the content, but also the incoming links and other statistical data.


Where possible, do not use JavaScript and AJAX for navigation and content that needs to be crawled and indexed by Baidu. If you still wish to use JavaScript and AJAX, create a separate Chinese site with static links in the navigation and important content in HTML.
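A crawlable navigation for the Chinese site keeps real anchor tags with static hrefs, along these lines (the paths below are illustrative, not from the original article):

```html
<!-- Static links Baidu can follow without executing any script -->
<nav>
  <a href="/cn/products/">产品</a>
  <a href="/cn/support/">支持</a>
</nav>

<!-- By contrast, script-only navigation like this is invisible to a
     crawler that does not run JavaScript: -->
<!-- <span onclick="loadPage('products')">Products</span> -->
```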

Yandex has created special mechanisms to give greater visibility in the search results for pages with JavaScript and AJAX.

Meta keywords, meta description and header tags

Content entered in the meta keywords, meta description and header tags might not be considered as important as it used to be for Google, but it still plays an important role in SEO for Baidu.

[Screenshot: meta tags in a page's head section]


Therefore, place meta keywords and meta description tags in the <head> section on all pages. You can leave them blank for other country/language sites, but be sure to fill them out on Chinese pages. Also use <h1>–<h6> tags on pages. They may not help as much with Google, but they won’t hurt, either. It’s best to use them in your webpage templates to help your Chinese pages.
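A head section for a Chinese page might look like the sketch below (the site name, keywords, and copy are placeholders, not real recommendations):

```html
<head>
  <meta charset="utf-8">
  <title>登山鞋 | Example Store</title>
  <!-- Largely ignored by Google, but still weighted by Baidu -->
  <meta name="keywords" content="登山鞋, 徒步鞋, 户外装备">
  <meta name="description" content="Example Store 的登山鞋与户外装备系列。">
</head>
```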

Subdomain vs. sub-directory

While most performance tests show that segmenting the country or language using a subdirectory performs better for a site without a country code top-level domain, some international SEO experts still advocate the use of subdomains.

Google’s Webmaster Help Guide allows businesses to use either method. However, Baidu specifically suggests using subdirectories.


If you wish your Chinese site to perform well in Baidu’s search results, set it up as a subdirectory (for example, example.com/cn/) rather than a subdomain (for example, cn.example.com).

Content placement within the web page

Google and most major search engines have become really good at crawling and indexing entire pages of content.

However, Baidu is not yet as good at the job. It is also known that when Baidu re-crawls pages that have been indexed before, it only crawls the first 1,000 bytes or so of the content to see if it has new information. If it doesn’t find anything new, it stops indexing the rest of the page and moves on to the next one.
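Given that reported behaviour, a quick way to check whether an edit would even be seen on re-crawl is to test whether the changed text falls inside the first 1,000 bytes of the page's HTML (a rough illustrative check, not a Baidu tool):

```python
def in_recrawl_window(html, changed_text, limit=1000):
    """Return True if `changed_text` appears within the first `limit` bytes
    of the page -- the region Baidu is reported to re-check on re-crawl.
    (Rough check: text straddling the byte boundary counts as outside.)
    """
    head = html.encode("utf-8")[:limit]
    return changed_text.encode("utf-8") in head

# Content placed early is seen; content pushed below ~1,000 bytes is not.
early = in_recrawl_window("<p>spring sale</p>" + "x" * 2000, "spring sale")
late = in_recrawl_window("x" * 2000 + "<p>spring sale</p>", "spring sale")
```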


Always place important content (including keywords) at the beginning of the page. If you update any content in the bottom half of the page, submit the URLs of those pages to Baidu using Baidu’s Webmaster Tool for re-indexing.

Yandex also has its own Webmaster Tools, where you can review your web site performance, submit XML sitemaps, and take other actions to improve your website.

There are many other algorithmic differences among Google, Baidu, and Yandex. If China and Eastern Europe are important markets for your business, make sure you adopt balanced SEO strategies that work for all of your target search engines.

from Search Engine Watch