Reinventing SEO

Back in the Day…

If you are new to SEO it is hard to appreciate how easy SEO was, say, 6 to 8 years ago.

Almost everything worked quickly, cheaply, and predictably.

Go back a few years earlier and you could rank a site without even looking at it. 😀

Links, links, links.

Meritocracy to Something Different

Back then sharing SEO information acted like a meritocracy. If you had something fantastic to share & it worked great you were rewarded. Sure you gave away some of your competitive advantage by sharing it publicly, but you would get links and mentions and recommendations.

These days most of the best minds in SEO don’t blog often. And some of the authors who frequently publish literally everywhere are actually fronted by a series of ghostwriters.

Further, most of the sharing has shifted to channels like Twitter, where the half-life of the share is maybe a couple hours.

Yet if you share something which causes search engineers to change their relevancy algorithms in response the half-life of that algorithm shift can last years or maybe even decades.

Investing Big

These days breaking in can be much harder. I see sites that are only 3 or 4 months old, with over 1,000 high-quality links and clearly deep six-figure investments, which appear to be getting about 80 organic search visitors a month.

Over a short enough timeframe it appears nothing works, even if you are using a system which has worked, should work, and is currently working on other existing & trusted projects.

Time delays have an amazing impact on our perceptions and how our reward circuitry is wired.

Most of the people who have the confidence and knowledge to invest deep into six figures on a brand new project aren’t creating “how to” SEO information and giving it away for free. Doing so would only harm their earnings and lower their competitive advantage.

Derivatives, Amplifications & Omissions

Most of the info created about SEO today is derivative (written by people who write about SEO but don’t practice it), or comes from people overstating the risks and claiming x and y and z don’t work, can’t work, and will never work.

And then from there you get the derivative amplifications of don’t, can’t, won’t.

And then there are people who read an old blog post about how things were x years ago and write as though everything is still the same.

Measuring the Risks

If you are using lagging knowledge from derivative “experts” to drive strategy you are most likely going to lose money.

  • First, if you are investing in conventional wisdom then there is little competitive advantage to that investment.
  • Secondly, as techniques become more widespread and widely advocated Google is more likely to step in and punish those who use those strategies.
  • It is when a strategy is most widely used and seems safest that the risk is at its peak and the rewards are de minimis.

With all the misinformation, how do you find out what works?

Testing

You can pay for good advice. But most people don’t want to do that, they’d rather lose. 😉

The other option is to do your own testing. Then when you find somewhere conventional wisdom is wrong, invest aggressively.

“To invent you have to experiment, and if you know in advance that it’s going to work, it’s not an experiment. Most large organizations embrace the idea of invention, but are not willing to suffer the string of failed experiments necessary to get there. Outsized returns often come from betting against conventional wisdom, and conventional wisdom is usually right.” – Jeff Bezos

That doesn’t mean you should try to go against consensus view everywhere, but wherever you are investing the most it makes sense to invest in something that is either hard for others to do or something others wouldn’t consider doing. That is how you stand out & differentiate.

But to do your own testing you need to have a number of sites. If you have one site that means everything to you and you get wildly experimental then the first time one of those tests goes astray you’re hosed.

False Positives

And, even if you do nothing wrong, if you don’t build up a stash of savings you can still get screwed by a false positive. Even having a connection in Google may not be enough to overcome a false positive.

Cutts said, “Oh yeah, I think you’re ensnared in this update. I see a couple weird things. But sit tight, and in a month or two we’ll re-index you and everything will be fine.” Then like an idiot, I made some changes but just waited and waited. I didn’t want to bother him because he’s kind of a famous person to me and I didn’t want to waste his time. At the time Google paid someone to answer his email. Crazy, right? He just got thousands and thousands of messages a day.

I kept waiting. For a year and a half, I waited. The revenues kept trickling down. It was this long terrible process, losing half overnight but then also roughly 3% a month for a year and a half after. It got to the point where we couldn’t pay our bills. That’s when I reached out again to Matt Cutts, “Things never got better.” He was like, “What, really? I’m sorry.” He looked into it and was like, “Oh yeah, it never reversed. It should have. You were accidentally put in the bad pile.”

“How did you go bankrupt?”
“Two ways. Gradually, then suddenly.”
― Ernest Hemingway, The Sun Also Rises

True Positives

A lot of SEMrush charts look like the following

What happened there?

Well, obviously that site stopped ranking.

But why?

You can’t be certain why without doing some investigation. And even then you can never be 100% certain, because you are dealing with a black box.

That said, there are constant shifts in the algorithms across regions and across time.

Paraphrasing quite a bit here, but in this video Search Quality Senior Strategist at Google Andrey Lipattsev suggested…

He also explained the hole Google has in their Arabic index, where spam is much more effective due to there being little useful content to index and rank, and due to Google modeling their ranking algorithms largely on publishing strategies in the western world. Fixing many of these holes is also less of a priority because they view evolving with mobile friendly, AMP, etc. as a higher priority. They algorithmically ignore many localized issues & try to clean up some aspects manually. But even whoever is winning via spam at the moment might not only lose to an algorithm update or manual clean-up; once Google has something great to rank there it will eventually win, displacing some of the older spam on a near-permanent basis. The new entrant raises the barrier to entry for the lower-quality stuff that was winning via sketchy means.

Over time the relevancy algorithms shift. As new ingredients get added to the algorithms & old ingredients get used in new ways it doesn’t mean that a site which once ranked

  • deserved to rank
  • will keep on ranking

In fact, sites which don’t get a constant stream of effort & investment are more likely to slide than have their rankings sustained.

The above SEMrush chart is for a site which uses the following as their header graphic:

When there is literally no competition and the algorithms are weak, something like that can rank.

But if Google looks at how well people respond to what is in the result set, a site as ugly as that is going nowhere fast.

Further, a site like that would struggle to get any quality inbound links or shares.

If nobody reads it then nobody will share it.

The content on the page could be Pulitzer Prize-level writing and few would take it seriously.

With that design, death is certain in many markets.

Many Ways to Become Outmoded

The above ugly header design with no taste and a really dumb condescending image is one way to lose. But there are also many other ways.

  • Excessive keyword repetition, like a footer with the target phrase repeated 100 times.
  • Excessive focus on monetization, to where most visitors quickly bounce back to the search results to click on a different listing.
  • Ignoring the growing impact of mobile.
  • Blowing out the content footprint with pagination and tons of lower-quality backfill content.
  • Stale content full of outdated information and broken links.
  • A lack of investment in new content creation AND promotion.
  • Aggressive link anchor text combined with low-quality links.

Investing in Other Channels

The harder & more expensive Google makes it to enter the search channel the greater incentive there is to spend elsewhere.

Why is Facebook doing so well? In part because Google did the search equivalent to what Yahoo! did with their web portal. The rich diversity in the tail was sacrificed to send users down well worn paths. If Google doesn’t want to rank smaller sites, their associated algorithmic biases mean Facebook and Amazon.com rank better, thus perhaps it makes more sense to play on those platforms & get Google traffic as a free throw-in.

Of course aggregate stats are useless and what really matters is what works for your business. Some may find Snapchat, Instagram, Pinterest or even long forgotten StumbleUpon as solid traffic drivers. Other sites might do well with an email newsletter and exposure on Twitter.

Each bit of exposure (anywhere) leads to further awareness. Which can in turn bleed into aggregate search performance.

People can’t explicitly look for you in a differentiated way unless they are already aware you exist.

Some amount of remarketing can make sense because it helps elevate the perceived status of the site, so long as it is not overdone. However, if you are selling a product the customer already bought, or you are marketing to marketers, there is a good chance such investments will be money wasted while you alienate past customers.

Years ago people complained about an SEO site being far too aggressive with ad retargeting. And while surfing today I saw that same site running retargeting ads so heavily that you can’t scroll far enough down the page for one of their ads to disappear before seeing another one.

If you don’t have awareness in channels other than search it is easy to get hit by an algorithm update if you rank in competitive markets, particularly if you managed to do so via some means which is the equivalent of, erm, stuffing the ballot box.

And if you get hit and then immediately run off to do disavows and link removals, and then only market your business in ways that are passively driven & tied to SEO, you’ll likely stay penalized for a long, long time.

While waiting for an update, you may find you are Waiting for Godot.


from SEO Book http://www.seobook.com/reinventing-seo
via KCG Auto Feed

Why you NEED to raise organic CTRs (and how to do it)

Does organic click-through rate (CTR) data impact page rankings on Google? This has been a huge topic of speculation for years within the search industry.

Why is there such a debate? Well, often people get hung up on details and semantics (are we talking about a direct or indirect ranking factor?), Google patents (which may or may not even be in use), and competing theories (everyone’s got an opinion based off something they heard or read). To make matters more confusing, Google is less than forthcoming about the secrets of their algorithm.

But if CTR truly does impact Google’s organic search rankings, shouldn’t we be able to measure it? Yes!

In this post, I’ll share some intriguing data on the relationship between Google CTR and rankings. I’ll also share four tips for making sure your Google click-through rates on the organic SERPs are where they need to be.

[Image: four organic CTR hacks]

To be clear: my goal with this post is to provide just a brief background and some actionable insights about the topic of organic click-through rates on Google. We won’t dissect every tweet or quote ever made by anyone at Google, dive deep into patents, or refute all the SEO theories about whether CTR is or isn’t a ranking factor. I’m sharing my own theory based on what I’ve seen, and my recommendations on how to act on it.

Google CTR & rankings: Yes! No! Who bloody knows!

Eric Enge of Stone Temple Consulting recently published a post with a headline stating that CTR isn’t a ranking factor. He clarifies within that post that Google doesn’t use CTR as a direct ranking factor.

What’s the difference between a direct and indirect ranking factor? Well, I suggest you watch Rand Fishkin’s awesome video on this very topic.

Basically, we know certain things directly impact rankings (I got a link from a reputable website, hooray!), but there are many other things that don’t have a direct impact, but nevertheless do impact ranking (some big-time influencer tweeted about my company and now tons of people are searching for us and checking out our site, awesome!).

It’s essentially the same issue as last touch attribution, which assigns all the credit to the last interaction. But in reality, multiple channels (PPC, organic, social, email, affiliates, etc.) can play important roles in the path to conversion.

The same is true with ranking. Many factors influence ranking.

So here’s my response: Direct, indirect, who cares? CTR might not be a “direct core ranking signal,” but if it impacts rank (and I believe it does), then it matters. Further, even if it doesn’t impact rank, you should still care!

But don’t take my word for it that Google has the technology. Check out these slides from Google engineer Paul Haahr, who spoke at SMX:

[Image: slide, “How does Google use click data?”]

[Image: slide, “Google CTR live experiments”]

Also, AJ Kohn put together a good post about Google click-through rate as a ranking signal last year. He included a couple of eye-opening quotes that I’ll share here because they are important. The first is from Edmond Lau, a former Google engineer:

“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. Infrequently clicked results should drop toward the bottom because they’re less relevant, and frequently clicked results bubble toward the top. Building a feedback loop is a fairly obvious step forward in quality for both search and recommendations systems, and a smart search engine would incorporate the data.”

The second is from Marissa Mayer in 2007, talking about how Google used CTR as a way to determine when to display a OneBox:

“We hold them to a very high click-through rate expectation and if they don’t meet that click-through rate, the OneBox gets turned off on that particular query. We have an automated system that looks at click-through rates per OneBox presentation per query. So it might be that news is performing really well on Bush today but it’s not performing very well on another term, it ultimately gets turned off due to lack of click-through rates. We are authorizing it in a way that’s scalable and does a pretty good job enforcing relevance.”

Also, check out this amazing excerpt from an FTC document that was obtained by the WSJ:

“In addition, click data (the website links on which a user actually clicks) is important for evaluating the quality of the search results page. As Google’s former chief of search quality Udi Manber testified:

‘The ranking itself is affected by the click data. If we discover that, for a particular query, hypothetically, 80 percent of people click on Result No. 2 and only 10 percent click on Result No. 1, after a while we figure out, well, probably Result 2 is the one people want. So we’ll switch it.’

Testimony from Sergey Brin and Eric Schmidt confirms that click data is important for many purposes, including, most importantly, providing ‘feedback’ on whether Google’s search algorithms are offering its users high quality results.”

Why organic Google CTR matters

If you have great positions in the SERPs, that’s awesome. But even high rankings don’t guarantee visits to your site.

What really matters is how many people are clicking on your listing (and not bouncing back immediately). You want to attract more visitors who are likely to stick around and then convert.

In 2009, the head of Google’s webspam team at the time, Matt Cutts, was asked about the importance of maximizing your organic CTR. Here’s a key quote that says it all:

“It doesn’t really matter how often you show up. It matters how often you get clicked on and then how often you … convert those to whatever you really want (sales, purchases, subscriptions)… Do spend some time looking at your title, your URL, and your snippet that Google generates, and see if you can find ways to improve that and make it better for users because then they’re more likely to click. You’ll get more visitors, you’ll get better return on your investment.”

In another video, he talked about the importance of titles, especially on your important web pages: “you want to make something that people will actually click on when they see it in the search results – something that lets them know you’re gonna have the answer they’re looking for.”

Bottom line: Google cares a lot about overall user engagement with the results they show in the SERPs. So if Google is testing your page for relevancy to a particular keyword search, and you want that test to go your way, you better have a great CTR (and great content and great task completion rates). Otherwise, you’ll fail the quality test and someone else will get chosen.

Testing the real impact of organic CTR on Google

Rand Fishkin conducted one of the most popular tests of the influence of CTR on Google’s search results. He asked people to do a specific search and click on the link to his blog (which was in 7th position). This impacted the rankings for a short period of time, moving the post up to 1st position.

[Image: IMEC Lab Google CTR test]

But these are all temporary changes. The rankings don’t persist because the inflated CTRs aren’t natural.

It’s like how you can’t increase your AdWords Quality Scores simply by clicking on your own ads a few times. This is the oldest trick in the book and it doesn’t work. (Sorry.)

Isn’t CTR too easy to game?

The results of another experiment appeared on Search Engine Land last August and concluded that CTR isn’t a ranking factor. But this test had a pretty significant flaw: it relied on bots artificially inflating CTRs and search volume (and the test was only for a single two-word keyword: “negative SEO”). So essentially, this test was the organic search equivalent of click fraud.

I’ve seen a lot of people saying Google will never use CTR in organic rankings because “it’s too easy to game” or “too easy to fake.” I disagree. Google AdWords has been fighting click fraud for 15 years and they can easily apply these learnings to organic search. There are plenty of ways to detect unnatural clicking. What did I just say about old tricks?

Before we look at the data, a final “disclaimer.” I don’t know if what this data reveals is due to RankBrain, or another machine-learning-based ranking signal that’s already part of the core Google algorithm. Regardless, there’s something here – and I can most certainly say with confidence that CTR is impacting rank.

NEW DATA: Does organic CTR impact SEO rankings?

Google has said that RankBrain is being tested on long-tail terms, which makes sense. Google wants to start testing its machine-learning system with searches they have little to no data on – and 99% of pages have zero external links pointing to them.

How is Google able to tell which pages should rank in these cases?

By examining engagement and relevance. CTR is one of the best indicators of both.

High-volume head terms, as far as we know, aren’t being exposed to RankBrain right now. So by observing the differences between the organic search CTRs of long-tail terms versus head terms, we should be able to spot the difference:

[Image: Google CTR vs. organic search position data]

So here’s what we did: We looked at 1,000 keywords in the same keyword niche (to isolate external factors like Google shopping and other SERP features that can alter CTR characteristics). The keywords are all from my own website: wordstream.com.

I compared CTR versus rank for one- or two-word search terms, and did the same thing for long-tail keywords (search terms between 4 to 10 words).

Notice how the long-tail terms get much higher average CTRs for a given position. For example, in this data set, the head term in position 1 got an average CTR of 17.5%, whereas the long-tail term in position 1 had a remarkably high CTR, at an average of 33%.
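Grouping queries by word count and position, the head-versus-long-tail comparison can be sketched roughly as below. The field names (`query`, `position`, `clicks`, `impressions`) and the sample rows are assumptions modeled on a Search Console export, not the article’s actual dataset:

```python
from collections import defaultdict

def avg_ctr_by_position(rows, min_words, max_words):
    """Average CTR per rounded organic position, restricted to queries
    whose word count falls in [min_words, max_words]."""
    clicks, imps = defaultdict(int), defaultdict(int)
    for r in rows:
        words = len(r["query"].split())
        if min_words <= words <= max_words:
            pos = round(float(r["position"]))
            clicks[pos] += int(r["clicks"])
            imps[pos] += int(r["impressions"])
    return {p: clicks[p] / imps[p] for p in sorted(imps) if imps[p]}

rows = [
    {"query": "loans", "position": "1",
     "clicks": "175", "impressions": "1000"},
    {"query": "how to get a small personal loan fast", "position": "1",
     "clicks": "330", "impressions": "1000"},
]
head = avg_ctr_by_position(rows, 1, 2)       # head terms (1-2 words)
longtail = avg_ctr_by_position(rows, 4, 10)  # long-tail terms (4-10 words)
```

With a real export, comparing the two dictionaries position by position reproduces the head-versus-long-tail curves discussed above.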

You’re probably thinking: “Well, that makes sense. You’d expect long-tail terms to have stronger query intent, thus higher CTRs.” That’s true, actually.

But why is it that long-tail terms with high CTRs are so much more likely to be in top positions versus bottom-of-page organic positions? That’s a little weird, right?

OK, let’s do an analysis of paid search queries in the same niche. We use organic search to come up with paid search keyword ideas and vice versa, so we’re looking at the same keywords in many cases.

[Image: Google CTR vs. position in paid search]

Long-tail terms in this same vertical get higher CTRs than head terms. However, the difference between long-tail and head term CTR is very small in positions 1–2, and becomes huge as you go out to lower positions.

So in summary, something unusual is happening:

  • In paid search, long-tail and head terms do roughly the same CTR in high ad spots (1–2) and see huge differences in CTR for lower spots (3–7).
  • But in organic search, the long-tail and head terms in spots 1–2 have huge differences in CTR and very little difference as you go down the page.

Why are the same keywords behaving so differently in organic versus paid?

The difference (we think) is that pages with higher organic click-through rates are getting a search ranking boost.

How to beat the expected organic search CTR

CTR and ranking are codependent variables. There’s obviously a relationship between the two, but which is causing what? In order to get to the bottom of this “chicken versus egg” situation, we’re going to have to do a bit more analysis.

The following graph takes the difference between an observed organic search CTR minus the expected CTR, to figure out if your page is beating — or being beaten by — the expected average CTR for a given organic position.

By only looking at the extent by which a keyword beats or is beaten by the predicted CTR, you are essentially isolating the natural relationship between CTR and ranking in order to get a better picture of what’s going on.
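As a minimal sketch of that calculation, assuming an illustrative expected-CTR curve (these numbers are invented for the example, not the article’s measured averages):

```python
# Illustrative expected average CTR per organic position (assumed values,
# not the article's measured curve).
EXPECTED_CTR = {1: 0.33, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.04, 7: 0.035, 8: 0.03, 9: 0.025, 10: 0.02}

def ctr_delta(position, observed_ctr):
    """Observed minus expected CTR for a position; a positive value
    means the page is beating the average for that slot."""
    return observed_ctr - EXPECTED_CTR[position]

ctr_delta(2, 0.27)  # ~+0.12: beating the expected CTR for position 2
ctr_delta(5, 0.02)  # ~-0.03: underperforming position 5
```

Plotting that delta against ranking position is what isolates the CTR/rank relationship from the baseline curve.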

[Image: Google CTR RankBrain rewards and penalties]

We found that, on average, if you beat the expected CTR, then you’re far more likely to rank in more prominent positions. Failing to beat the expected CTR makes it more likely you’ll appear in positions 6–10.

So, based on our example of long-tail search terms for this niche, if a page:

  • Beats the expected CTR for a given position by 20 percent, it’s likely to appear in position 1.
  • Beats the expected CTR for a given position by 12 percent, it’s likely to appear in position 2.
  • Falls below the expected CTR for a given position by 6 percent, it’s likely to appear in position 10.

And so on.

Here’s a greatly simplified rule of thumb:

The more your pages beat the expected organic CTR for a given position, the more likely you are to appear in prominent organic positions.

If your pages fall below the expected organic Google search CTR, then you’ll find your pages in lower organic positions on the SERP.

Want to move up by one position in Google’s rankings? Increase your CTR by 3%. Want to move up another spot? Increase your CTR by another 3%.

If you can’t beat the expected click-through rate for a given position, you’re unlikely to appear in positions 1–5.

Essentially, you can think of all of this as though Google is giving bonus points to pages that have high click-through rates. The fact that it looks punitive is just a natural side effect.

If Google gives “high CTR bonus points” to other websites, then your relative performance will decline. It’s not that you got penalized; it’s just that you didn’t get the rewards.

Four crucial ways to raise your Google CTRs

Many “expert” SEOs will tell you not to waste time trying to maximize your CTRs since it’s supposedly “not a direct ranking signal.” “Let’s build more links and make more infographics,” they say.

I couldn’t disagree more. If you want to rank better, you need to get more people to your website. (And getting people to your website is the whole point of ranking anyway!)

AdWords and many other technologies look at user engagement signals to determine page quality and relevance. We’ve already seen evidence that CTR is important to Google.

So how do you raise your Google CTRs – not just for a few days, but in a sustained way? You should focus your efforts in four key areas:

  1. Optimize pages with low “organic Quality Scores.” Download all of your query data from the Google Search Console. Sort your data, figure out which of your pages have below-average CTRs, and prioritize those. Don’t risk turning one of your unicorn pages with an awesome CTR into a donkey with a terrible CTR! It’s far less risky to turn a donkey into a unicorn!
  2. Combine your SEO keywords with emotional triggers to create irresistible headlines. Emotions like anger, disgust, affirmation, and fear are proven to increase click-through rates and conversion rates. If everyone who you want to beat already has crafted optimized title tags, then packing an emotional wallop will give you the edge you need and make your listing stand out.
  3. Work to improve other user engagement metrics. Like click-through rate, we believe you need to have better-than-expected engagement metrics (e.g. time on site and bounce rate). This is a critical relevance signal! Google has enough data to know the expected conversion and engagement rates based on a variety of factors (e.g. industry, query, location, time of day, device type). If your content performs well, you’re likely going to get a rankings boost. If your content does poorly, there’s not necessarily a penalty, but you definitely won’t get any bonus points.
  4. Use social media ads and remarketing to increase search volume and CTR. Paid social ads and remarketing display ads can generate serious awareness and exposure for a reasonable cost (no more than $50 a day). If people aren’t familiar with your brand, bombard your target audience with Facebook and Twitter ads. People who are familiar with your brand are 2x more likely to click through and to convert!

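Step 1 above can be sketched in a few lines. This assumes a CSV export with `query`, `clicks`, and `impressions` columns; the exact Search Console export format may differ, and `underperformers` is a hypothetical helper name:

```python
import csv

def underperformers(csv_path):
    """Flag queries whose CTR falls below the average CTR of all
    exported queries, sorted so the highest-impression (highest
    potential gain) pages come first."""
    with open(csv_path, newline="") as f:
        rows = [r for r in csv.DictReader(f) if int(r["impressions"]) > 0]
    for r in rows:
        r["ctr"] = int(r["clicks"]) / int(r["impressions"])
    avg = sum(r["ctr"] for r in rows) / len(rows)
    flagged = [r for r in rows if r["ctr"] < avg]
    return sorted(flagged, key=lambda r: int(r["impressions"]), reverse=True)
```

The highest-impression underperformers surface first, since fixing those titles and snippets offers the largest potential click gains.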
Just say no to low Google CTRs!

You want to make sure your pages get as many organic search clicks as possible. Doing so means more people are visiting your site, which will send important signals to Google that your page is relevant and awesome.

Our research also shows that above-expected user engagement metrics result in better organic rankings, which results in even more clicks to your site.

Don’t settle for average CTRs. Be a unicorn in a sea of donkeys! Raise your CTRs and engagement rates! Get optimizing now!

This article was originally published on the WordStream blog, reprinted with permission.

from Search Engine Watch https://searchenginewatch.com/2016/05/12/why-you-need-to-raise-organic-ctrs-and-how-to-do-it/
via Auto Feed

Google Rethinking Payday Loans & Doorway Pages?

Nov 12, 2013 WSJ: Google Ventures Backs LendUp to Rethink Payday Loans

Google Ventures Partner Blake Byers joined LendUp’s board of directors with his firm’s investment. The investor said he expects LendUp to make short-term lending reasonable and favorable for the “80 million people banks won’t give credit cards to,” and help reshape what had been “a pretty terrible industry.”

What sort of strategy is helping to drive that industry transformation?

How about doorway pages.

These sorts of doorway pages are still live to this day. Simply look at the footer area of lendup.com/payday-loans

This is in spite of Google going out of their way last year to say they were going to kill those sorts of strategies.

March 16, 2015 Google To Launch New Doorway Page Penalty Algorithm

Google does not want to rank doorway pages in their search results. The purpose behind many of these doorway pages is to maximize their search footprint by creating pages both externally on the web or internally on their existing web site, with the goal of ranking multiple pages in the search results, all leading to the same destination.

Today we get journalists acting as conduits for Google’s public relations efforts, writing headlines like: Google: Payday Loans Are Too Harmful to Advertise.

Today those sorts of stories are literally everywhere.

Tomorrow the story will be over.

And when it is.

Precisely zero journalists will have covered the above contrasting behaviors.

As they weren’t in the press release.

Best yet, not only does Google maintain their investment in payday loans via LendUp, but there is also a bubble in the personal loans space, so Google will be able to show effectively the same ads for effectively the same service. And by the time the P2P loan bubble pops, some of the payday lenders will have followed LendUp’s lead in re-branding their offers as something else in name.

Meanwhile, off to revolutionize the next industry by claiming everyone else is greedy and scummy and there is a wholesome way to do the same thing leveraging new technology, when in reality the primary difference between the business models is simply a thin veneer of tech utopian PR misinformation.

Don’t expect to see a link to this blog post on TechCrunch.

There you’ll read some hard-hitting cutting edge tech news like:

Banks are so greedy that LendUp can undercut them, help people avoid debt, and still make a profit on its payday loans and credit card.

#MomentOfZeroTruth #ZMOT


from SEO Book http://www.seobook.com/google-rethinking-payday-loans
via KCG Auto Feed

SEO 101: eight simple ways to optimise your blog posts for search

You don’t have to be an SEO expert to optimise your content in just a few steps and improve your page’s ranking on search engine results pages (SERPs).

SEO might sound complicated for beginners, but in fact, everyone can start applying a few basic tips that will affect a post’s performance and eventually its ranking.

1. WordPress

WordPress can be very helpful for easy SEO optimisation, as it allows everyone to perform a series of quick steps to help search engines find your content. There are many plugins that can guide you with the optimisation of your content, while they can also measure your post’s performance in terms of SEO success.

  • How often do you use your focus keywords?
  • Does your content pass the readability test?
  • Should you use more headings?

‘All in One SEO pack’, ‘SEO by SQUIRRLY’, and ‘Yoast SEO’ are among the most popular SEO plugins, but you can find numerous others to suit your needs and simplify the process of optimisation.

[Image: WordPress SEO plugin]

From page analysis to a sitemap generator, WordPress plugins can help you understand how SEO works, which may eventually help you improve your content to make it more appealing, both for your audience and for search engines.

2. Headline and title tags

A post’s headline gives users their first impression of your content, and it will affect whether that exposure leads to an actual click or not.

The title tag should be an accurate description of your content, aiming to capture the audience’s attention while, with the right optimisation, helping search engines discover your content.

The title tag has always been an important part of on-page SEO, but Google’s semantic search has changed the rules of the game, encouraging people to think more about their audience and less about keywords.

It’s no longer necessary to use a specific keyword in your title tag, although it may be useful if you manage to use it in context, helping users understand more about the topic you’ll be covering.

According to research by Backlinko, keyword-optimised title tags may still be associated with better rankings, though not to the degree they once were.

Thus, it is important that every title is:

  • Concise (no more than 60 characters)
  • Relevant (it tells readers what the post is about)
  • Readable (is your title appealing to readers? What will make them click?)
  • Emotionally engaging (the best titles build emotional triggers, which favours virality; it’s the instant emotional impact in the reader’s mind that makes the click easier)
  • Keyword-optimised (provided the keyword is used in context)

A compelling title attracts both readers and search engines in fewer than 60 characters, which is more challenging than it seems, but it can also lead to increased traffic.

coschedule headline analyzer tool

If you want to hone the craft of coming up with great headlines, CoSchedule’s Headline Analyzer can help: it shows what makes a headline effective for both humans and search engines, scoring your titles based on their length and the types of words you use.
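The length rule above is easy to automate. Here is a minimal, hypothetical pre-publish check (the function name and messages are my own, not part of any plugin) that flags a title over 60 characters or one missing its focus keyword:

```python
def check_title(title, keyword=None):
    """Return a list of issues found with a proposed post title."""
    issues = []
    if len(title) > 60:
        issues.append(f"too long: {len(title)} characters (aim for 60 or fewer)")
    if keyword and keyword.lower() not in title.lower():
        issues.append(f"focus keyword '{keyword}' not present")
    return issues

# An empty list means the title passes both checks.
print(check_title("SEO 101: eight simple ways to optimise your blog posts", "SEO"))
```

You could run a check like this in your editorial workflow before hitting publish, rather than relying on eyeballing the character count.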

3. Formatting

Good formatting is appreciated both by humans and search engines, as it makes the content more appealing.

Headings

Headings improve a text’s readability by dividing it into smaller blocks, with <h1> marking the most prominent element and <h2> through <h6> creating descending levels of importance above the normal paragraph text.

Each heading has a different size so it can be easily distinguished, creating a hierarchical structure that increases the chances of users spending more time reading, and makes it easy to skip to the content that is relevant to them.

Headings should follow the guidelines of the titles, be appealing and descriptive, and separate long blocks of content by creating a visual appeal.

In terms of SEO optimisation, headings help search engines spot the most important parts of your content and discover the topic that you’re writing about.

In fact, according to Searchmetrics, pages have tended to use more H1 tags since 2014, while the presence of H1 and H2 tags also aids the user experience.
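To see your page the way a crawler does, you can extract the heading hierarchy with a few lines built on Python’s standard html.parser; the class and function here are my own illustration, not part of any SEO tool:

```python
from html.parser import HTMLParser

HEADINGS = ("h1", "h2", "h3", "h4", "h5", "h6")

class HeadingOutline(HTMLParser):
    """Collects (level, text) pairs for every heading in a page."""
    def __init__(self):
        super().__init__()
        self.outline = []
        self._level = None  # heading level we are currently inside, if any

    def handle_starttag(self, tag, attrs):
        if tag in HEADINGS:
            self._level = int(tag[1])

    def handle_data(self, data):
        if self._level is not None and data.strip():
            self.outline.append((self._level, data.strip()))

    def handle_endtag(self, tag):
        if tag in HEADINGS:
            self._level = None

def outline(html):
    parser = HeadingOutline()
    parser.feed(html)
    return parser.outline

page = "<h1>SEO 101</h1><p>Intro text</p><h2>Headings</h2><h2>URL</h2>"
print(outline(page))   # → [(1, 'SEO 101'), (2, 'Headings'), (2, 'URL')]
```

If the printed outline doesn’t read like a sensible table of contents, your heading structure probably needs work.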

URL

URLs should be simple and clear both for humans and search engines. Although search engines are able to crawl even the most complex URL, it is still preferable to keep the URL simple, useful and relevant.

How URL length affects ranking position

Image source: Backlinko

As the URL is displayed in the SERPs along with the title and the meta description, it needs to convey the necessary information about your content, and a shorter URL may also encourage sharing.

WordPress offers several options for generating URLs automatically, but once again simplicity is preferred, as a shorter URL helps readers (and search engines) discover your post’s actual content.
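As a sketch of what “simple and clear” means in practice, here is a hypothetical slug generator in the spirit of WordPress’s post-name permalinks; the function name and the five-word cap are my own choices, not WordPress behaviour:

```python
import re

def slugify(title, max_words=5):
    """Turn a post title into a short, lowercase, hyphenated URL slug."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words[:max_words])

print(slugify("8 Simple Ways to Optimise Your Blog Posts for Search!"))
# → 8-simple-ways-to-optimise
```

A real implementation might also strip stopwords like “to”, but even this crude version yields something far more readable than an auto-generated query string.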

Permalink-Settings Yoast SEO

Meta description

Meta descriptions give search engines a summary of the page’s content and should offer a concise, relevant description of your post, serving as a preview that helps readers decide whether or not to visit the page.

According to SurveyMonkey, 43.2% of people click on a result based on its meta description, which means you need to use the 160-character maximum wisely.

A meta description should follow the rules of your actual content, be descriptive and well written, without overusing keywords for the sake of search engine optimisation. Even if you include your targeted keyword, make sure it is provided in context, always thinking of your audience first.

How often is keyword used in description?

Image Source: Searchmetrics

Semantic search has affected the impact of keywords in the description, but this doesn’t mean they are no longer used. Back in 2009 Google announced that meta descriptions and meta keywords don’t contribute to its ranking algorithms for search, but we still need to remember their importance as part of the preview snippet in SERPs, which is another case of putting the audience first when optimising.
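Since the snippet is cut off around 160 characters, it can help to preview the truncation before publishing. This is only a rough sketch, not how Google actually measures snippets (which depends on pixel width, not character count):

```python
def serp_preview(description, limit=160):
    """Roughly approximate how a search snippet truncates a long meta description."""
    if len(description) <= limit:
        return description
    # Cut at the limit, then back up to the last whole word.
    return description[:limit].rsplit(" ", 1)[0] + " ..."

print(serp_preview("A short description survives intact."))
```

If the preview chops your description mid-thought, the most persuasive part of it should be moved earlier in the sentence.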

4. Linking

Links have always been important for SEO, but in the past this led to many manipulative techniques. Search engines have moved into the age of semantic context, which means that links are still significant, but in a more relevant and useful way.

Internal links can enhance the user experience, as they help the audience navigate through your site to read further relevant posts. A well-written, useful link allows readers to continue their navigation through the site, boosting the content’s authority by tying together a series of quality posts.

This also affects the crawlability of your content, as search engines perceive your posts as informative and relevant.

External linking has often been used cautiously, out of fear that such a link only favours the linked source and not your own content, but this is not the case.

By adding links to external sources that are relevant to your content you are boosting your own authority, helping search engines understand your niche topic and reward you for your linking that aims to add value to your own post.

You don’t have to link to many external sites, only to the most important ones, ideally those that are:

  • popular
  • trustworthy
  • relevant

External linking helps search engines learn more about your content, improving your credibility and even your ranking.

Backlinks have always been an integral part of SEO, as they serve as proof that your content is appreciated by others, improving your authority in a particular field.

Although Google is not keen on “unnatural” linking that serves no particular purpose, a backlink of high quality is always welcome, as it contributes both to your site’s ranking factors and to your content’s authority.

Backlinks with keyword in anchor text saw a drop in usage recently

Image Source: Searchmetrics

Over the past year we’ve seen a decrease in backlinks using a keyword in the anchor text, and this is related once again to Google’s attempt to combat any kind of manipulative link with no context.

Any backlink from a highly trusted source may eventually lead to an increase in traffic and a boost in search rankings, and the best way to earn one is to keep producing quality, informative content that offers a unique perspective on its field.

5. Images

Images do not just enhance the reading experience for your audience, but they are also important in your SEO optimisation.

As users can find your images directly through Google’s Image Search, it’s important to pay attention to their naming, in order to increase the chances of bringing traffic back to your site or boost your site’s ranking.

Image optimisation for SEO is simple, but it’s sometimes overlooked as a boring task. However, it’s great when a user discovers your content through image search, associating an image with your content, and that’s why you should start spending a few minutes optimising your images from now on.

As well as the filename, which serves as the image’s title, it is also crucial to add “alt” text, which is essentially the description of your image. This section is about providing alternative text for your image, which will be displayed to a user’s browser if there is a problem with the actual image.

A WordPress window showcasing the fields that enhance image optimisation

As search engines can only read and not ‘see’ an image, the alt text should be indicative of your file, trying to describe it in the best possible way.

Your description should be clear and concise and if you still need help with finding the right text to describe your image, keep in mind that alt text is used by screen reader software to describe an image to people with visual impairments.
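Auditing a page for images without alt text is straightforward with the standard library. This sketch (the class and function names are my own, not part of any tool) lists the src of every <img> lacking an alt attribute:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Records the src of every <img> tag that has no (or empty) alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "?"))

def images_missing_alt(html):
    audit = AltAudit()
    audit.feed(html)
    return audit.missing

page = '<img src="map.png" alt="Road trip map"><img src="photo.jpg">'
print(images_missing_alt(page))   # → ['photo.jpg']
```

Running something like this over your archive is a quick way to find old posts worth revisiting for image optimisation.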

6. How metrics affect SEO

It’s always useful to analyse the performance of your posts, with metrics and conversions helping you understand your content at a deeper level.

However, there is an indication that metrics such as the bounce rate and the SERP click-through-rate affect ranking, which makes their analysis even more important.

SEO optimisation can be improved with the analysis of your data

According to Backlinko, websites with a low average bounce rate rank higher in the search results; although correlation does not prove causation here, it is a reminder that engaging content offers numerous benefits for a site.

It’s the time spent on site along with the bounce rate that can lead to very interesting insights for your content, in order to discover what your audience likes and what needs to be improved.

Furthermore, the click-through rate has an even closer relationship with SEO, as it affects the content’s ranking position in the SERPs, with the number of clicks serving as an indication of the content’s popularity, validation that comes directly from your readers.

7. Readability over keyword-stuffing

“Keywords” may be the word most commonly associated with SEO, but in 2016 it’s more important to focus on the quality of the content than on the target keyword.

Readability will affect your search rankings more than the precise use of a keyword: the former improves the user experience, the level of engagement and the time spent on the site, while keywords stuffed into low-quality content simply read like automatically generated text that ignores the human factor.

Search engines value the human factor and thus, the level of readability can be improved with:

  • Easy-to-read text
  • Short sentences
  • Short paragraphs
  • Organised structure
  • In-depth knowledge of the topic
  • Focus on humans, not search engines

You can measure the readability of your content with WordPress plugins such as Yoast SEO and FD Word Statistics, while there are also many online tools, including Readability-Score and the Readability Test Tool.

Beyond readability, you still need to add relevant keywords to your text, as they still affect search rankings, provided they are added naturally and add value and context.
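Readability tools like the ones above are largely built on formulas such as Flesch reading ease. The sketch below implements that formula with a crude vowel-group syllable counter, so treat its scores as indicative only:

```python
import re

def syllables(word):
    """Very rough syllable count: number of vowel groups, minimum one."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch reading ease: higher scores mean easier text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z]+", text)
    if not words:
        return 0.0
    return (206.835
            - 1.015 * (len(words) / sentences)
            - 84.6 * (sum(syllables(w) for w in words) / len(words)))

easy = "The cat sat. The dog ran. We had fun."
hard = "Comprehensive optimisation necessitates systematically prioritising readability."
print(flesch_reading_ease(easy) > flesch_reading_ease(hard))   # → True
```

The formula rewards short sentences and short words, which is exactly why the bullet list above recommends both.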

8. Search engines love new content

Everyone loves fresh content, including search engines. By regularly creating new content you are increasing the chances to become an authority in your field, which may favour your ranking and boost your traffic.

You don’t have to create new content daily, but consistency and relevance will be highly appreciated, both by your audience and the search engines.

As you keep adding more valuable content, visitors will keep coming back, increasing the engagement while building your credibility.

Never sacrifice quality for quantity though, as this won’t be appreciated by readers or by search engines.

from Search Engine Watch https://searchenginewatch.com/2016/05/11/seo-101-eight-ways-to-optimise-your-blog-posts-for-search/
via Auto Feed

Why should you focus on multiple keywords?

In Yoast SEO Premium you’re able to focus on multiple keywords. If you use our tool correctly, your text can be optimized for up to five keywords. In this post, I’ll explain to you why it’s important to use the multiple focus keyword functionality while optimizing your text.

how to use multiple focus keywords

Explaining (multiple) focus keywords

The Yoast SEO plugin helps you to optimize each and every post (or page) you write. Imagine you have a travel blog, and you’re writing a blog post about a road trip through California. The focus keyword is the word or phrase your audience will use in the search engines and for which you want your post to rank. In order to choose your focus keyword wisely, you should do some research! In our example, the most important keyword would be ‘road trip California’.

Sometimes it’s hard to choose one keyword, because you want a post to rank for more than one specific focus keyword. Perhaps you would also like to rank for a synonym or for a slightly different keyword. That’s when multiple focus keywords come in handy! Let’s look at four examples in which optimizing for multiple keywords is the best strategy.

Synonyms

People search for different things. While some people will use the term road trip when searching for their vacation, others could very well use vacation, holiday or trip. To reach different groups of people, you should make sure that your post will rank for these different keywords.

More than one topic

Sometimes a post is about more than one topic or has a few subtopics. Our article about the road trip to California could be about planning for the road trip, as well as sightseeing in California. These two topics could very well fit into one article. In this case, you would like your article to rank for ‘sightseeing California’ as well as for ‘planning road trip’. And, you’d also like to rank for your most important keyword ‘road trip California’.

multiple focus keywords: multiple topics shown in google trends

Long tail keyword variants

A great strategy to get your content to rank in Google is to focus on long tail keywords. Long tail keywords will have far less competition and will be relatively easy to rank for.

If you were able to rank for multiple long tail keywords with one post, that would make it even more fruitful. Addressing multiple long tail variants of your focus keyword will be a great strategy. Optimizing your post for different long tail variants will give you the opportunity to be found for more search terms. In our example, one could, for instance, focus on road trip California and on two long tail variants: ‘road trip southern California’ and ‘road trip northern California’.

multiple focus keywords: long-tail keyword variants shown in Google trends

Key phrases

If people seek something rather specific, they tend to use key phrases. Sometimes, the word order of the words within these key phrases (and the use of stopwords) is important. If the word order and the use of stopwords is important, we would advise you to optimize your post on different variations of your focus keyword.

While investigating how Google handles stopwords, we found that a search term like ‘road trip California’ is handled in exactly the same manner as ‘California road trip’: the order of the words is irrelevant to Google. However, for the search [road trip in California], Google tries to find the exact match (and the order of the words is important). So, search queries with stopwords seem to be handled a bit differently by Google.

multiple focus keywords: key phrases difference shown in google trends

How to use multiple focus keywords

Optimizing your post for multiple focus keywords is really easy! You should purchase Yoast SEO Premium and click on the tab in the Yoast SEO Premium box to add a new keyword:

multiple focus keywords: click plus sign to add a focus keyword

A new box will open and you can enter the second focus keyword you’d like to optimize your post for:

multiple focus keyword: input field

The plugin will run a check on the content to see if your post is optimized for all the focus keywords you entered.
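The kind of check being described can be sketched in a few lines: count how often each keyword variant appears as an exact phrase in the text. This is my own illustration, not Yoast’s implementation; note how the word-order point from the key-phrases section shows up in the counts:

```python
def keyword_coverage(text, keywords):
    """Count exact-phrase occurrences of each keyword in the text."""
    text = text.lower()
    return {kw: text.count(kw.lower()) for kw in keywords}

post = ("Planning a road trip through California? Our California road trip "
        "guide covers sightseeing in both southern and northern California.")

print(keyword_coverage(post, ["road trip california",
                              "california road trip",
                              "sightseeing california"]))
```

Here only ‘california road trip’ matches as an exact phrase, even though the post clearly covers all three topics, which is why a real analysis needs to be smarter than simple substring counting.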

Read more: ‘Blog SEO: befriend the long tail’ »

from Yoast • The Art & Science of Website Optimization https://yoast.com/focus-multiple-keywords/
via KCG Auto Feed

How Reposting Old Blog Content Will Boost Your Traffic

“Content is king” is an adage that has been adopted and practiced by every successful blogger across the web for years. So practiced that there’s a definite surplus in content. While content continues to flood the internet at an increasing rate, users can only consume so much of it. This creates a problem (or opportunity, depending on how you look at it) for blogging strategy. In an attempt to stay at the surface in such a competitive landscape, bloggers are cranking out content at a rate that takes away from the quality of their posts and contributes to the excess information put out there for users.

The effort of content production can be subsidized by reposting old blog content, an effort that can drastically increase your site’s traffic, keep usable content readily available for your readers, and make your writing last longer. Content takes time to produce, so you might as well squeeze as much use out of it as you can. Part of doing that means upcycling and reposting old content, specifically the content that has performed well.

For nearly all blogs, a very small percentage of high-performing posts generates the majority of a site’s traffic. Take the prominent website HubSpot, for example. An analysis of their blog determined that 76% of their monthly blog views came from old posts, and similar analyses have revealed the same trend across the board. Knowing that the majority of blog views come from a handful of your older, most successful posts rather than the new stuff you’re racing the clock to finish, you can plot your next move: capitalizing on things you’ve already written by reposting old blog content.

How to Do It

  • Start by identifying your top posts (most blogging platforms and websites make this information available when you log in through the back end). Look for what’s called “evergreen” content: pieces that remain relevant and fresh in concept for users, even though time has passed since they were originally published.
  • Update the content. Stay on topic with what the post is about, but update images, add a little more information, or rewrite a paragraph.
  • Evaluate your keywords. Did you miss any the first time? Can you add a little to the content to include some new keywords?
  • Don’t change the URL, but refresh the time stamp to reflect the current date.
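The first step, identifying old posts that still pull traffic, can be expressed as a simple filter over your analytics export. The post records, thresholds and field names below are invented for illustration:

```python
from datetime import date

# Invented analytics export: one record per post.
posts = [
    {"url": "/seo-101",         "published": date(2014, 3, 1), "monthly_views": 4200},
    {"url": "/april-news",      "published": date(2016, 4, 2), "monthly_views": 150},
    {"url": "/evergreen-guide", "published": date(2013, 7, 9), "monthly_views": 2800},
]

def repost_candidates(posts, min_views=1000, min_age_days=365,
                      today=date(2016, 5, 11)):
    """Old posts that still attract traffic are reposting candidates."""
    return sorted(p["url"] for p in posts
                  if p["monthly_views"] >= min_views
                  and (today - p["published"]).days >= min_age_days)

print(repost_candidates(posts))   # → ['/evergreen-guide', '/seo-101']
```

The thresholds are arbitrary; the point is that the candidates are old *and* still performing, which is precisely the “evergreen” test described above.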

Why It Works

It may seem like a practice that’s too easy to work, but it does. If the content is evergreen, it will still create a lot of user engagement by striking a relevant nerve with your audience. By tweaking your content just a little and reposting it at a later date, you can reach new users that didn’t see the post the first time who will like, link, share, and comment.

Not to be confused with duplicate content, reposted content will yield positive effects on search engine rankings. Google values quality content that performs well and offers real value to users. When you have content that has performed well, offers value, but gets buried by constant content production, reposting is a way to resurrect and reuse it.

from HigherVisibility http://www.highervisibility.com/blog/how-reposting-old-blog-content-will-boost-your-traffic/
via KCG Auto Feed

Here’s a new content marketing strategy documentation map

The majority of enterprise content marketers don’t have a documented strategy, according to recent research. The CMI found that almost two thirds of professional content folk haven’t yet bothered to write down their strategy.

In some circles that’s akin to not having a strategy at all, but I don’t find it particularly surprising. Plenty of experienced, established teams seem to work without documentation in place, but it seems to me that content marketing has evolved to the point where it’s really easy to lose focus.

I’m currently going through the process of establishing a content strategy from scratch and thought I’d share what I’m doing, which I’ve summarised in the visualisation below. I guess each of these areas could be a chapter heading in a handy reference guide for the team.

Content marketing strategy document map

All pretty top level, you understand, but I’ll explain a bit about each area below.

What do we mean by ‘document’?

There’s no standard template for this, so far as I know. In any event, what’s right for me might be wrong for you. But I would say that your ‘documentation’ should amount to more than a simple mission statement.

Big picture strategy slogans are one thing, but to actually make things happen, you need a lot of detail. Your team needs to know where to look – or who to look to – when they want something.

If you don’t have proper documentation in place, then they will look to you, and you will turn into a repetitive answer machine. Heavy bummer.

It might be that a lot of the supporting documentation already exists, in some shape or form. It’s just that it is unfinished, or out of date, or unstructured, and very possibly unshared. Why not put some time aside to get things together?

Assembling a collection of useful documents – alongside a goal-orientated series of targets – will help you to keep things on track. Your team will thank you for it, especially newcomers.

Let’s go through the four key areas to think about (Goals, Tactics, People, Processes), and the three others that are loosely filed under productivity (Assets, Tools and Tech).

In business, everything revolves around goals, unless you’re batshit crazy, so I’ll start there.

Goals

Content marketing teams exist to support all kinds of business goals. Some are more important than others. Goals can be strategic, tactical or based around task completion. Macro, micro, nano. Company, department, team member. Or mission, campaign, task.

Goals should be written down and ideally visible across teams, since you rarely work in a vacuum. Performance stats should be visible too, because transparency is a winning ticket.

Note that you always, always, always need a feedback loop, to measure what works and what doesn’t. Without that you cannot hope to function properly, nor maximise your chances of success. Nor, for that matter, demonstrate ROI (or the lack of it).


When it comes to goals, there are three main things to sort out…

Mission statement. This is your elevator pitch, and can probably be condensed into a sentence or so. You want to cram as much meaning and clarity in these few words as possible, to quickly answer questions such as “why are we investing in content marketing?”

Targets. You can use tools to set, assign and monitor goals, or just put something together in Google Docs and share it with whoever needs to see it.

Metrics. Once clearly defined goals and targets have been set you can take some measurements and track metrics as you move forwards. Set up your analytics reports and monitor performance as you progress.

Tactics

Once you know what your goals are, you can figure out how to go about achieving them. This is where tactics come into play.


Research. Gut feel is a fine place to start, but tactics should be based around insight, rather than opinions. This calls for some research. Use whatever sources of data and information you can to build up a picture of the world according to your target audience.

Audience research. Figure out needs, where they like to hang out, what makes them tick, what they respond to, which competitors they talk about, who their friends are, who they respect… that kind of stuff.

Customer research. You need to know who your existing customers are before finding similar people. How do customers interact with your brand? What works?

Competitor research. It’s worth having a sniff around but there’s no need to obsess over competitor activity. Worry about your own game. Planning is a natural extension of worrying.

Personas and user journeys. Put together some personas, user stories, customer journeys, and make sure everybody is aware of the paths you want visitors to take.

Keyword research. This is rather more audience-centric than the foundational technical SEO basics, such as making your site fast. Search queries reflect consumer intent, and it is your job to create the kind of content that ranks well for target phrases.

Keyword research works best when it is truly strategic, with content mapped to specific business goals. Your content comes only after you have defined and prioritised your keyword wishlist. Or you’re doing it wrong.

Incidentally, I pretty much live by Dan Shure’s brilliant article on using the Keyword Planner in a creative way.

Funnel. How do your customers actually become customers? Understand the various journeys through the funnel. See what’s working, and think about how your content can play a role at each stage.

Content mapping. Great, you’ve mapped content throughout the funnel, but what happens after somebody has become a customer? Increasing retention and customer advocacy are two of the best things you can do in business, and your content can go a long way in supporting these primary business goals. Take ownership, if necessary.

Content mapping - funnel

Formats. After you’ve done your homework, you can start to think about the actual content. Thoughts will turn to the type of content you might create, and the formats you can use. What is possible, given your team, your budget and your platforms?

Distribution. Hold up, cowboy. Don’t let the tail wag the dog. In this case the tail is content. And the dog, well, that’s distribution. Simply put, why are you making a video when you haven’t given a moment’s thought to YouTube? And what feeds YouTube? Ah yeah, reddit does.

These channels are potentially going to be the difference between a small win in local circles and a global viral. Why wouldn’t you want to optimise your distribution channel strategy?

How will people find your article? What’s your social strategy? Are you going to do any paid distribution? How are you going to nail down some excellent Google placements?

Figure out how to get the best out of your main channels, and you’ll get way more bang for your buck from each piece of content.

SEO. The devs probably need to be informed about your preferred search setup. Right?

Once you’ve got this together it becomes much easier to direct your efforts, and change tack if necessary.

The main success factor will be linked to the quality of content you create, and that’s something that you can also provide documented guidance on. Share internal and external knowledge, and make it easily accessible across teams.

People

You may know who everybody is and why they matter, but does the rest of your team? Think about the various people who stand to benefit from your success, and always remember the ones who took on some risk when you started out.


Stakeholders. Not just the boss, but heads of other teams that will be affected by your efforts. Who are they? What do they need to be effective in their jobs? How can content marketing support their primary goals? Also, what have you promised? Make the business case readily available to your team so they know what is expected of them. Share presentations and team goals.

Content team. Who’s in the dream team? Who is your star player? What is everybody focusing on? How does everybody communicate? That might be as straightforward as sharing a simple organogram and a bunch of invitations to Slack.

Other teams. Who do you ask for a new button to be designed? Where do the company mugshots live? Is there a shared Dropbox folder? What are the guidelines around using this stuff?

External talent. Maybe you hired a PR agency, who should be kept in the loop about major content campaigns on the horizon. Maybe you have three freelance writers who don’t work in your office. And that weird guy who makes kickass videos from a shady basement. How will these people work together, and where do their contact details live for when somebody needs something?

Influencers. This is really important: know who you want to get friendly with. These community-anointed leaders of tribes can help you in a big way. What can you do to attract their attention? I tend to store influencers (including media lists) in a spreadsheet. Other people’s Twitter Lists can be a goldmine.

Processes

This is where the action happens. What are the things to do before and after publication? What do you need to do to get a piece of content over the line?


Brainstorming. Where do we store our ideas? How often do we get together? What tools do we use? What’s the formula for deciding what kind of content to create? Whiteboard sessions and mindmaps all play their part.

Workflow. How do we operate as a team, and as team members? How should we work with other teams? What is the process for submitting work?

What tools do we use? I’ve played around with everything from Trello to Basecamp to Google Docs but have never settled on one goal-orientated platform (so I’m actually building one). People use tools differently and there is often some kind of protocol to follow; otherwise your world looks like to-do-list spaghetti.

Taxonomy. What’s that you say about metadata? What is the common vocabulary for labels, tags and categorisation? Should I write Ecommerce or ecommerce or e-commerce or E-commerce or eCommerce? How does the tech work to support this kind of thing? If you have some rules in place, then you should police them.
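A taxonomy rule is easiest to police when it’s enforced in code. This sketch normalises whatever variant an author types into one canonical tag; the mapping is an invented example, not anyone’s actual house style:

```python
# Invented house-style mapping: lowercase variant → canonical form.
CANONICAL_TAGS = {
    "e-commerce": "ecommerce",
    "ecommerce": "ecommerce",
    "seo": "SEO",
    "content-marketing": "content marketing",
    "content marketing": "content marketing",
}

def normalise_tag(tag):
    """Map any variant an author types to the canonical tag."""
    key = tag.strip().lower()
    return CANONICAL_TAGS.get(key, key)

print([normalise_tag(t) for t in ["E-Commerce", "eCommerce", "seo"]])
# → ['ecommerce', 'ecommerce', 'SEO']
```

Wiring a check like this into the publish workflow beats asking everyone to memorise the rule.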

Checklists. What needs to be done prior to publication? Or sign off? What boxes need to be ticked? Did you sense check everything?

Sign Off. Is there a sign off process? Who has final say? Do we really need to run everything past the PR agency? Who has publishing rights? Who is allowed to edit?

The Other Stuff

Assets, tech and tools pretty much sit between the three key areas and the goals. I see them as living at the heart of the practical work: used, referenced and updated during the production phase.

Assets are things like brand guidelines, which should cover all of the dos and don’ts you need to know before publishing even the smallest status update. Authors must know your brand inside out before they represent it, right?

You’ll also need a house style guide, for content creators and editors. And ideally some pointers about things like when to publish, or how to write amazing headlines.

You’ll also be primed for success if you go to the trouble of creating (and maintaining) a schedule, be that a shared calendar or a loose, spreadsheet-based plan of action. Put some dates in the diary, get some targets in place, and watch out for the things on the verge of falling off the radar (or worse, the dreaded blockers).

Tech covers off the various platforms you will use (owned, earned, rented, paid, etc). That might mean a blog, a YouTube channel or a paid media channel. It’s probably all three.

Tech also points to your kit, and how the tech team can help improve efficiency and performance. For example, if you’re blogging, what are your CMS needs? How could the editing interface be improved? How should you report bugs? This might mean JIRA tickets, or something similar, so let your team know about how best to wave flags.

Platforms and technology can be optimised, which is where UX comes into play. Content lives at the heart of UX, but there are obviously factors outside of the content team’s control. Be sure to bang the drum if your site is slow, or if something is broken.

UX also covers persuasion, which is something of an artform among switched-on content marketers.

Then we have Tools, which primarily sit between people and process, and should help you to get things done. Pretty self-explanatory.

In summary

It’s worth pausing for thought if you are part of an existing team and you don’t have the right documentation in place. Where should I look for that style guide? Exactly what kind of person am I writing for, and why? Who should sign this off? These are questions that no right-minded team leader wants to answer on a daily basis.

Or maybe, like me, you’re starting something up, or you have a new client and a blank page. It’s tempting to jump straight into content creation, but in the long run it’s going to be way better to put a well-documented plan of attack in place, with goals and supporting assets all neatly lined up.

Either way, it’s worth regularly reviewing your strategy and updating your documentation, especially when adjusting course. To that end, I created The Content Strategy Canvas to help you put together a top-level picture of what you have going on.

Content Strategy Canvas - half

The canvas appears overly simplistic, but it is meant to be that way. It is a visual tool to help quickly communicate the key aspects of strategy on one page. No fluff required. The other documentation you might assemble having read this post will fill in the gaps.

And lo, you will become a cherished hero.

Anyway, that’s where I’m at. I’ll share a few specific content marketing templates in the future. I’d certainly love to hear any feedback and other approaches, so do leave a comment below or get in touch.

from Search Engine Watch https://searchenginewatch.com/2016/05/11/heres-a-new-content-marketing-strategy-documentation-map/
via Auto Feed

rel=canonical: the ultimate guide

The canonical URL allows you to tell search engines that certain similar URLs are actually one and the same. Sometimes you have products or content that is accessible under multiple URLs, or even on multiple websites. Using rel=canonical you can have this without harming your rankings.

History of rel=canonical

In February 2009, Google, Bing and Yahoo! introduced the canonical link element. Matt Cutts’ post is probably the easiest reading if you want to learn about its history. While the idea is simple, the specifics of how to use it turn out to be complex.

The rel=canonical element, often called the “canonical link”, is an HTML element that helps webmasters prevent duplicate content issues. It does this by specifying the “canonical”, or “preferred”, version of a web page. Using it well improves a site’s SEO.

The idea is simple: if you have several similar versions of the same content, you pick one “canonical” version and point the search engines at that. This solves the duplicate content problem where search engines don’t know which version of the content to show. This article takes you through the use cases and the anti-use cases.

The SEO benefit of rel=canonical

Choosing a proper canonical URL for every set of similar URLs improves the SEO of your site. Because the search engine knows which version is canonical, it can count the links pointing at all the different versions as links to that single canonical version. Basically, setting a canonical is similar to doing a 301 redirect, but without actually redirecting.

The process of canonicalization

When you have several possible URLs for a product, canonicalization is the process of picking one. In many cases, it’ll be obvious: one URL will be better than the others. In some cases, it might not be as obvious, but then it’s still rather easy: pick one! Not canonicalizing your URLs is always worse than picking one, even arbitrarily.

canonical graphic 1024x630

How to set canonical URLs

Correct example of using rel=canonical

Let’s assume you have two versions of the same page. Exactly, 100% the same content. They differ in that they’re in separate sections of your site and because of that the background color and the active menu item differ. That’s it. Both versions have been linked from other sites, the content itself is clearly valuable. Which version should a search engine show? Nobody knows.

For example’s sake, these are their URLs:

  • http://example.com/wordpress/seo-plugin/
  • http://example.com/wordpress/plugins/seo/

This is what rel=canonical was invented for. Especially in a lot of e-commerce systems, this (unfortunately) happens fairly often. A product has several different URLs depending on how you got there. You would apply rel=canonical as follows:

  1. You pick one of your two pages as the canonical version. It should be the version you think is the most important one. If you don’t care, pick the one with the most links or visitors. If all of that’s equal: flip a coin. You need to choose.
  2. Add a rel=canonical link from the non-canonical page to the canonical one. So if we picked the shortest URL as our canonical URL, the other URL would link to the shortest URL like so in the <head> section of the page:
    <link rel="canonical" href="http://example.com/wordpress/seo-plugin/">

    That’s it. Nothing more, nothing less.

What this does is “merge” the two pages into one from a search engine’s perspective. It’s basically a “soft redirect”, without redirecting the user. Links to both URLs now count for the single canonical version of the URL.

Setting the canonical in Yoast SEO

If you use Yoast SEO, you can change the canonical of several page types using the plugin. You only need to do this if you want to change the canonical to something different than the current page’s URL. Yoast SEO already renders the correct canonical URL for almost any page type in a WordPress install.

For posts, pages and custom post types, you can edit the canonical in the advanced tab of the Yoast SEO metabox:

canonical-in-yoast-seo

For categories, tags and other taxonomy terms, you can change them in the Yoast SEO metabox too, in the same spot. If you have other advanced use cases, you can always use the wpseo_canonical filter to change the Yoast SEO output.

When should you use canonical URLs?

301 redirect or canonical?

If you have the choice of doing a 301 redirect or setting a canonical, what should you do? The answer is simple: if there are no technical reasons not to do a redirect, you should always do a redirect. If you cannot redirect because that would break the user experience or be otherwise problematic: set a canonical URL.
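
The two options can be sketched side by side. Here’s a minimal, WSGI-style Python illustration; the handler names and the URL are hypothetical, not from any particular CMS:

```python
# Sketch: a true 301 redirect vs. serving the page with a canonical tag.
# Handler names and the URL below are hypothetical, for illustration only.

CANONICAL = "http://example.com/wordpress/seo-plugin/"

def respond_with_redirect(environ, start_response):
    """Preferred: a 301 sends both users and crawlers to the canonical URL."""
    start_response("301 Moved Permanently", [("Location", CANONICAL)])
    return [b""]

def respond_with_canonical(environ, start_response):
    """Fallback: serve the page, but tell search engines which URL counts."""
    body = (
        "<html><head>"
        f'<link rel="canonical" href="{CANONICAL}">'
        "</head><body>...</body></html>"
    ).encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/html; charset=utf-8")])
    return [body]
```

If the redirect is technically possible, the first handler is the one you want; the second only consolidates ranking signals while still serving the duplicate page.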

Should a page have a self-referencing canonical URL?

In the example above, we made the non-canonical page link to the canonical version. But should a page set a rel=canonical for itself? This is a highly debated topic amongst SEOs. At Yoast, we have a strong preference for having a canonical link element on every page, and Google has confirmed that’s best. The reason is that most CMSes will allow URL parameters without changing the content. So all of these URLs would show the same content:

  • http://example.com/wordpress/seo-plugin/
  • http://example.com/wordpress/seo-plugin/?isnt=it-awesome
  • http://example.com/wordpress/seo-plugin/?cmpgn=twitter
  • http://example.com/wordpress/seo-plugin/?cmpgn=facebook

The issue: if you don’t have a self-referencing canonical on the page pointing to the cleanest version of the URL, you risk duplicate content issues caused by parameterized URLs like these. Even if you don’t create them yourself, someone else could link to you with added parameters and cause the problem. So adding a self-referencing canonical to URLs across your site is a good “defensive” SEO move. Luckily for you, our Yoast SEO plugin does this for you.
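
The “cleanest version” logic can be sketched in a few lines of Python. The set of tracking parameters below is just an example; your own list will differ:

```python
# A minimal sketch of computing the clean canonical URL for any
# parameterized variant. The tracking-parameter names are examples only.

from urllib.parse import urlsplit, urlunsplit

TRACKING_PARAMS = {"cmpgn", "isnt", "utm_source", "utm_medium"}

def clean_canonical(url: str) -> str:
    """Drop known tracking parameters so all variants share one canonical."""
    parts = urlsplit(url)
    kept = [
        pair for pair in parts.query.split("&")
        if pair and pair.split("=")[0] not in TRACKING_PARAMS
    ]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "&".join(kept), ""))

# All of the URL variants above resolve to the same canonical:
# clean_canonical("http://example.com/wordpress/seo-plugin/?cmpgn=twitter")
# → "http://example.com/wordpress/seo-plugin/"
```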

Cross-domain canonical URLs

You might have the same piece of content on several domains. For instance, SearchEngineJournal regularly republishes articles from Yoast.com (with explicit permission). Look at any of those articles and you’ll see a rel=canonical link pointing right back at our original article. This means all the links pointing at their version of the article count towards the ranking of our canonical version. They get to use our content to please their audience, and we get a clear benefit from it too. Everybody wins.

Faulty canonical URLs: common issues

There is a multitude of cases out there showing that a wrong rel=canonical implementation can lead to huge issues. I know of several sites that had the canonical on their homepage point to an article, and completely lost their home page from the search results. There are more things you shouldn’t do with rel=canonical. Let me list the most important ones:

  • Don’t canonicalize a paginated archive to page 1. The rel=canonical on page 2 should point to page 2. If you point it to page 1 search engines will actually not index the links on those deeper archive pages…
  • Make them 100% specific. For various reasons, many sites use protocol-relative links, meaning they leave the http / https bit out of their URLs. Don’t do this for your canonicals. You have a preference. Show it.
  • Don’t base your canonical on the request URL. If you use variables like the domain or request URI of the current page while generating your canonical, you’re doing it wrong. Your content should be aware of its own URL. Otherwise, you could still have the same piece of content on, for instance, example.com and www.example.com, with both versions canonicalizing to themselves.
  • Multiple rel=canonical links on a page causing havoc. Sometimes the developer of a plugin or extension thinks they know best how to add a canonical to the page. Sometimes, that developer is right. Inevitably, though, sometimes they’re wrong. When we encounter this in WordPress plugins, we try to reach out to the developer and teach them not to do it, but it happens. And when it does, the results are wholly unpredictable.
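
Some of these pitfalls are easy to check automatically. Here’s a rough Python sketch that flags protocol-relative, non-absolute and duplicate canonical tags; the function name and rule set are my own, for illustration:

```python
# Rough sketch of a canonical-tag audit using only the standard library.

from html.parser import HTMLParser

class CanonicalCollector(HTMLParser):
    """Collect the href of every <link rel="canonical"> on a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonicals.append(a.get("href", ""))

def audit_canonicals(html: str) -> list:
    """Return human-readable problems found in the page's canonical tags."""
    parser = CanonicalCollector()
    parser.feed(html)
    problems = []
    if len(parser.canonicals) > 1:
        problems.append("multiple rel=canonical links on one page")
    for href in parser.canonicals:
        if href.startswith("//"):
            problems.append("protocol-relative canonical: " + href)
        elif not href.startswith(("http://", "https://")):
            problems.append("canonical is not an absolute URL: " + href)
    return problems
```

Run against a clean page, it returns an empty list; run against a page with two canonicals or a protocol-relative href, it names each issue.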

rel=canonical and social networks

Facebook and Twitter honor rel=canonical too. This might lead to weird situations. If you share a URL on Facebook that has a canonical pointing elsewhere, Facebook will share the details from the canonical URL. In fact, if you add a like button on a page that has a canonical pointing elsewhere, it will show the like count for the canonical URL, not for the current URL. Twitter works in the same way.

Advanced uses of rel=canonical

Canonical link HTTP header

Google also supports a canonical link HTTP header. The header looks like this:

Link: <http://www.example.com/white-paper.pdf>; 
  rel="canonical"

Canonical link HTTP headers can be very useful when canonicalizing files like PDFs, so it’s good to know that the option exists.
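
As a sketch, building that header in Python is a one-liner (the helper name is mine):

```python
# Build the canonical Link header as a (name, value) pair, ready to be
# attached to any HTTP response - useful for PDFs and other non-HTML files.

def canonical_link_header(canonical_url: str) -> tuple:
    return ("Link", f'<{canonical_url}>; rel="canonical"')

# e.g. on the response serving a duplicate copy of the white paper:
# headers = [("Content-Type", "application/pdf"),
#            canonical_link_header("http://www.example.com/white-paper.pdf")]
```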

Using rel=canonical on not so similar pages

While I don’t recommend this, you can definitely use rel=canonical very aggressively. Google honors it to an almost ridiculous extent, where you can canonicalize a very different piece of content to another piece of content. But if Google catches you doing this, it will stop trusting your site’s canonicals and thus cause you more harm…

Using rel=canonical in combination with hreflang

In our ultimate guide on hreflang, we talk about canonical. It’s very important that when you use hreflang, each language’s canonical points to itself. Make sure that you understand how to use canonical well when you’re implementing hreflang as otherwise you might kill your entire hreflang implementation.

Conclusion: rel=canonical is a power tool

Rel=canonical is a powerful tool in an SEO’s toolbox, but like any power tool, you should use it wisely as it’s easy to cut yourself. For larger sites, the process of canonicalization can be very important and lead to major SEO improvements.

Read more: ‘Duplicate content: causes and solutions’ »

from Yoast • The Art & Science of Website Optimization https://yoast.com/rel-canonical/
via KCG Auto Feed

Adopting a consumer mindset for your SEO strategy

Search engines continue to adjust algorithms to better match how consumers actually conduct their searches. How should this affect our optimization efforts?

Consumers aren’t born in a vacuum, nor do they live in one. They’re inundated with a brand’s messages, ideas, discussions, and controversy across all of their digital and offline encounters.

As businesses and marketers, we must not only educate novice buyers on the benefits of a product, but also engage educated potential customers who already have some knowledge of their choices and are making final decisions between products.

Identify consumer needs

The first step in this process is identifying the needs of the consumer at each of the stages where they’ll be searching for products.

What drives their curiosity? What is creating their need? A strategic response to this question can be pursued in several different ways.

A good way to begin is surveying your website visitors.

Onsite surveys can be a powerful way to gain an understanding of customers and what information they’re after. Think of engaging ways to talk to your audience, meeting them wherever they are on their journey to becoming a customer.

NPS-survey

Customer tracking

You may have already started tracking visitor activities in your analytics platform – activities that you can tie to a stage in your customer’s lifecycle.

In the B2B sector, this can mean a visitor downloading a whitepaper that gives some entry-level explanation of your product category; on the B2C side, it could include a customer adding a specific product to their shopping cart on your website but not completing the purchase.

These are all opportunities to re-engage audiences by understanding how they arrived at their current stage.

The next step is employing a marketing automation system that helps communicate with customers throughout their journey. Revising and fine-tuning this system can pay extensive dividends. Take the time to review your customer touchpoints and make sure customers at every leg of the journey are getting the information they need at that stage.

Competitor analysis

You should also take time to understand your competitors’ offerings, how they are presented, and decipher how they are interpreting the customer journey.

Through awareness and insight into your competitors, you can really recognize communication opportunities for those educated consumers that have a keen understanding of the marketplace.

From there, you can offer them valuable resources that may not be available anywhere else.

Content and the customer journey

So you now have all of these data points, and can identify needs and curiosities across your customer journey – how does that play into a search engine optimization strategy?

The next phase is evaluating your content and aligning it to the customer journey. This should uncover any gaps in your content where adding helpful resources will likely prove beneficial to your consumers.

Are there pieces of content that would make your customers’ decision process quicker or smoother? These resources may provide value simply by giving your customers more assurance that they are making the right decision (remember that competitor analysis you did?).

With content gaps identified, is there a variation of that content theme that prospective buyers are searching for more than something else? This is where you start your keyword research.

Search Console Content Keywords

As you are working through that keyword research, though, always keep customer intent in mind. Just because a keyword has a high search volume and low competition does not necessarily mean that the related content is right for your audience.

Don’t be afraid to test content and get feedback from your customers (and even from your potential customers). Understanding how your audience uses and digests your content will help inform the shape that your future content should take.

Use your analytics to continually gauge the effectiveness of your content. Try changing up headlines, where the links to content pieces are located, and which pieces of content get priority placement on the page.

All of these tests can expose those hidden gems of design wisdom that make your site the resource your potential and existing customers turn to when they need answers.

The bottom line

You need to really get to know how your customers think, develop content that they need across their entire journey, and test to see what works best at each stage.

Your success will come not only from customers that experience satisfying visits to your site, but also from delivering better experiences for potentially your most influential site visitors of all – the search engines.

Kevin Gamache is Senior Search Strategist at Wire Stone, an independent digital marketing agency.

from Search Engine Watch https://searchenginewatch.com/2016/05/10/adopting-a-consumer-mindset-for-your-seo-strategy/

How Nordstrom strategically beat Zappos in Google Search

Five years ago, in April 2011, Zappos’ market share in Google was more than three times as large as Nordstrom‘s. 

Today, Nordstrom has twice the market share on Google as Zappos.

visibility-index-zappos.com-vs-nordstrom.com_.png

Between April 2011 and December 2012, Zappos.com managed to increase its market share by 51% (going from a visibility score of 42.9 to 63.42 points), while Nordstrom climbed from roughly 13 points to 54.9 – a huge 302% jump in market share.

At this point, they became a direct competitor to Zappos, with both domains having 50% of their keywords in Google in common.

In September 2013, Nordstrom.com took off, leaving Zappos.com in the dust. Since then, Nordstrom.com has continuously increased its market share, climbing by 65% from September 2013 until today, with a visibility score of 90.78 points.

During the same time, Zappos.com continuously lost market share and ended up at a -37.32% loss, dropping from 63.42 to 39.75 points.

What happened?

If I see something like the above, I like to quickly compare both domains. You can run such a comparison in the Toolbox by simply typing the domains into the search bar, separated by a comma: zappos.com,nordstrom.com

We can see that links are not a problem for Zappos: they actually have nearly one and a half million links from about 5,000 domains – more than Nordstrom.

zappos vs nordstrom

Competitive comparison of Zappos.com and Nordstrom.com

The thing that quickly catches the eye is the large discrepancy in the number of indexed pages between the two domains. Zappos has a whopping 56 million pages indexed, and that can be a huge problem.

If we look at the number of keywords for which Zappos has a Top 100 ranking (17,000) and compare it to the number of indexed pages, we get a ratio of 3,310 indexed pages for every keyword in the Top 100.

If we compare this number to some other domains, we see how inefficient this is:

  • Wikipedia has a ratio of 107
  • Walmart has a ratio of 135
  • Nordstrom has a ratio of 182
  • Amazon has a ratio of 250
  • Zappos has a ratio of 3,310
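
The ratio itself is simple arithmetic: indexed pages divided by the number of keywords ranking in the Top 100. A quick sketch, using rounded figures:

```python
# Indexed pages per top-100 keyword: lower means a leaner, more efficient index.

def pages_per_keyword(indexed_pages: int, top100_keywords: int) -> int:
    return round(indexed_pages / top100_keywords)

# e.g. a site with 1,000,000 indexed pages ranking for 5,000 keywords:
# pages_per_keyword(1_000_000, 5_000) → 200
```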

Take a look at this:

Comparing the indexed pages for a specific product

This huge number of indexed pages is a big problem for Zappos’ crawling and indexing budget. The real enemies of both the crawling and indexing of large websites are web developers, JavaScript and chaos in general.

Let me show you some additional examples:

Zappos has quite a large number of product pages in the Google index which are not available anymore. At the same time, these pages are set to index/follow.

zappos see robots no follow

Ironically, they also have popular products, which ARE available, set to noindex/follow.

products set to noindex/follow

All these problems together will cause Google to crawl unnecessary URLs, which depletes the crawl budget for the domain. And this crawling power will be sorely missed, especially on such an extensive project.

Additionally, this crawling budget will define how often Googlebot crawls the first few levels of the domain and how often a deep crawl will take place.

We see something similar with the indexing budget: this budget determines the maximum number of URLs that will be added to the Google index.

It is important to keep in mind that only URLs which are crawled regularly will stay within the index.

It could all be so easy. In theory, every piece of content should have a unique, logical, easy to understand URL, which stays exactly the same over the decades.

Sadly, this utopia does not hold up in the real world: web developers decide to create a third print version of a page, Googlebot learns a bit more JavaScript and suddenly invents completely new URLs, and the website gets its third CMS relaunch in two years, leaving the original URL concept in tatters.

All of this will end the same way: Google will crawl unnecessary URLs and waste the domain’s crawling budget.

Conclusion

We can see that Nordstrom decided to compete with Zappos on about 50% of its keywords. For quite a while, both domains competed directly at the same level of visibility. In the end, though, Zappos’ onpage problems and a change in user behaviour have led to a stark contrast in visibility for the two domains.

If we look at which keywords both domains rank for, we notice that, in the beginning, Nordstrom only ranked for 23% of the keywords which Zappos had. Only three years later, Nordstrom already managed to rank for 50% of Zappos’ keywords.

This change shows us that Nordstrom actually decided to actively work on competing with Zappos. Today the tables have turned and Nordstrom directly competes on 67% of Zappos’ keywords.

Number of keywords which Zappos.com and Nordstrom.com have in common

When we talk about user behaviour, we mean that, if the user has a choice between a result on Nordstrom and one on Zappos, they will decide to go to Nordstrom.com. We can see this thanks to Google Trends: user interest in the two domains parted ways in mid-2012, just as the direct competition started.

This post was originally published on the Sistrix blog, reprinted by permission.

from Search Engine Watch https://searchenginewatch.com/2016/05/10/how-nordstrom-strategically-beat-zappos-in-google-search/