SEO 101: eight simple ways to optimise your blog posts for search

You don’t have to be an SEO expert to optimise your content in just a few steps and improve your page’s ranking on search engine results pages (SERPs).

SEO might sound complicated to beginners, but in fact anyone can start applying a few basic tips that will improve a post’s performance and, eventually, its ranking.

1. WordPress

WordPress can be very helpful for easy SEO, as it allows anyone to perform a series of quick steps to help search engines find your content. There are many plugins that can guide you through optimising your content and also measure your post’s SEO performance, prompting questions such as:

  • How often do you use your focus keywords?
  • Does your content pass the readability test?
  • Should you use more headings?

‘All in One SEO pack’, ‘SEO by SQUIRRLY’, and ‘Yoast SEO’ are among the most popular SEO plugins, but you can find numerous others to suit your needs and simplify the process of optimisation.

[Image: WordPress SEO plugin]

From page analysis to sitemap generation, WordPress plugins can help you understand how SEO works, which may eventually help you make your content more appealing, both to your audience and to search engines.

2. Headline and title tags

A post’s headline gives users their first impression of your content, and that impression determines whether an appearance in the results leads to an actual click.

The title tag should be an accurate description of your content, capturing the audience’s attention while, with the right optimisation, helping search engines discover the page.

The title tag has always been an important part of on-page SEO, but Google’s semantic search has changed the rules of the game, encouraging people to think more about their audience and less about keywords.

It’s no longer necessary to use a specific keyword in your title tag, although it may help if you can use one in context, giving users a clearer idea of the topic you’ll be covering.

According to research by Backlinko, keyword-optimised title tags may still be associated with better rankings, but not to the degree they were in the past.

Thus, it is important that every title is:

  • Concise (no more than 60 characters)
  • Relevant (informing readers what the post is about)
  • Readable (Is your title appealing to readers? What will make them click?)
  • Emotionally engaging (the best titles build emotional triggers that favour virality; the emotional impact formed instantly in the reader’s mind makes the click easier)
  • Keyword-optimised (provided the keyword is used in context)

A compelling title manages to attract both readers and search engines in less than 60 characters, which is more challenging than it seems, but it may also lead to increased traffic.
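
As a minimal sketch (the wording is hypothetical), a concise, descriptive title tag might look like this:

    <title>SEO 101: Eight Ways to Optimise Blog Posts for Search</title>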

[Image: CoSchedule Headline Analyzer tool]

If you want to improve your craft of writing great headlines, CoSchedule may help with its Headline Analyzer, which helps you understand what makes a headline effective for both humans and search engines, scoring your titles based on their length and the types of words you use.

3. Formatting

Good formatting is appreciated both by humans and search engines, as it makes the content more appealing.

Headings

Headings improve a text’s readability by dividing it into smaller blocks, with <h1> marking the element that should be highlighted above all the rest and <h2> through <h6> creating descending layers of importance above the normal paragraph text.

Each heading level has a different size, so that it is easily distinguished, creating a hierarchical structure that makes users more likely to spend time reading (or to skip straight to the content they need).

Headings should follow the same guidelines as titles: be appealing and descriptive, and break up long blocks of content to create visual appeal.

In terms of SEO optimisation, headings help search engines spot the most important parts of your content and discover the topic that you’re writing about.

In fact, according to Searchmetrics, pages have tended to use more H1 tags since 2014, and the presence of H1 and H2 tags also aids the user experience.
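
To make the hierarchy concrete, here is a minimal sketch of a post’s heading structure (the headings themselves are illustrative):

    <h1>SEO 101: eight ways to optimise your blog posts</h1>
    <h2>1. WordPress</h2>
    <h2>2. Headline and title tags</h2>
    <h3>Why titles should stay under 60 characters</h3>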

URL

URLs should be simple and clear both for humans and search engines. Although search engines are able to crawl even the most complex URLs, it is still preferable to keep each URL simple, useful and relevant.

[Image: How URL length affects ranking position]

Image source: Backlinko

As the URL is displayed in the SERPs along with the title and the meta description, it needs to convey the necessary information about your content, and a manageable length may also encourage sharing.

WordPress offers many options for generating a URL automatically, but once again simplicity is preferred, as a shorter URL helps readers (and search engines) discover what your post is actually about.
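
As a hypothetical illustration (example.com stands in for your domain), compare WordPress’s default query-string permalink with a post-name permalink:

    http://example.com/?p=123                (default: tells readers nothing)
    http://example.com/seo-101-for-blogs/    (post name: short and descriptive)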

[Image: Permalink settings with Yoast SEO]

Meta description

Meta descriptions summarise a page’s content for search engines. A meta description should be a concise and relevant description of your post, serving as a preview that helps readers decide whether or not to visit the page.

According to SurveyMonkey, 43.2% of people click on a result based on its meta description, which means you need to use the 160-character maximum length wisely.

A meta description should follow the rules of your actual content, be descriptive and well written, without overusing keywords for the sake of search engine optimisation. Even if you include your targeted keyword, make sure it is provided in context, always thinking of your audience first.
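
A minimal sketch of a meta description that stays within the limit (the copy is hypothetical):

    <meta name="description" content="Eight simple ways to optimise your blog posts for search, from title tags and headings to images and linking.">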

[Image: How often is the keyword used in the description?]

Image Source: Searchmetrics

Semantic search has reduced the impact of keywords in the description, but this doesn’t mean they are no longer used. Back in 2009 Google announced that meta descriptions and meta keywords don’t contribute to its ranking algorithms for search, but we still need to remember their importance as part of the preview snippet in SERPs, which is another case of putting the audience first when optimising.

4. Linking

Links have always been important for SEO, but in the past this led to many manipulative techniques. Search engines have since moved to the age of semantic context, which means links can still be significant, but in a more relevant and useful way.

Internal links can enhance the user experience, as they help the audience navigate through your own site to further relevant posts. Any well-written, useful link allows a reader to continue their journey through the site, boosting the content’s authority by connecting a series of quality posts.

This also affects the crawlability of your content, as search engines perceive your posts as informative and relevant.

External linking has often been used cautiously, out of fear that such a link only benefits the linked source and not your own content, but this is not the case.

By adding links to external sources that are relevant to your content, you boost your own authority, helping search engines understand your niche topic and rewarding linking that adds value to your post.

You don’t have to link to many external sites; focus on the most important ones, which should be:

  • popular
  • trustworthy
  • relevant

External linking helps search engines learn more about your content, improving your credibility and even your ranking.

Backlinks have always been an integral part of SEO, as they serve as proof that your content is appreciated by others, improving your authority in a particular field.

Although Google is not keen on “unnatural” linking that serves no particular purpose, a high-quality backlink is always welcome, as it contributes both to your site’s ranking factors and to your content’s authority.

[Image: Backlinks with keyword in anchor text saw a drop in usage recently]

Image Source: Searchmetrics

Over the past year we’ve seen a decrease in backlinks using a keyword in the anchor text, and this once again relates to Google’s attempt to combat any kind of manipulative link with no context.

Any backlink from a highly trusted source may eventually lead to an increase in traffic and a boost in search rankings, and the best way to earn one is to keep producing quality, informative content that offers a unique perspective on its field.

5. Images

Images do not just enhance the reading experience for your audience, but they are also important in your SEO optimisation.

As users can find your images directly through Google’s Image Search, it’s important to pay attention to how they are named, to increase the chances of bringing traffic back to your site or boosting your site’s ranking.

Image optimisation for SEO is simple, but it’s often overlooked as a boring task. Yet it’s a win when a user discovers your content through image search and associates an image with your post, which is why you should start spending a few minutes optimising your images from now on.

As well as the filename, which serves as the image’s title, it is also crucial to add “alt” text: an alternative description of your image, which a user’s browser displays if there is a problem loading the actual image.

[Image: A WordPress window showing the fields that support image optimisation]

As search engines can only read and not ‘see’ an image, the alt text should be indicative of your file, trying to describe it in the best possible way.

Your description should be clear and concise and if you still need help with finding the right text to describe your image, keep in mind that alt text is used by screen reader software to describe an image to people with visual impairments.
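
A minimal sketch of an optimised image tag (the filename and description are hypothetical):

    <img src="yoast-seo-metabox.png" alt="Yoast SEO content analysis panel in the WordPress post editor">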

6. How metrics affect SEO

It’s always useful to analyse the performance of your posts, with metrics and conversions helping you understand your content at a deeper level.

However, there are indications that metrics such as bounce rate and SERP click-through rate affect ranking, which makes analysing them even more important.

[Image: SEO can be improved by analysing your data]

According to Backlinko, websites with a low average bounce rate rank higher in the search results, and despite uncertainty over whether the relationship is causal, it is a reminder that engaging content offers numerous benefits for a site.

Time spent on site, along with bounce rate, can lead to very interesting insights about your content, helping you discover what your audience likes and what needs to be improved.

Furthermore, click-through rate has an even closer relationship with SEO, as it affects the content’s ranking position on SERPs: the number of clicks serves as an indication of a page’s popularity, and this validation comes directly from your readers.

7. Readability over keyword-stuffing

Keywords may be among the words most commonly associated with SEO, but in 2016 it’s more important to focus on the quality of the content than on the target keyword.

Readability will affect your search rankings more than the ‘right’ use of a keyword: readable content improves the user experience, the level of engagement and the time spent on site, while keywords dropped into low-quality content simply read like automatically generated text that ignores the human factor.

Search engines value the human factor and thus, the level of readability can be improved with:

  • Easy-to-read text
  • Short sentences
  • Short paragraphs
  • Organised structure
  • In-depth knowledge of the topic
  • Focus on humans, not search engines

You can measure the readability of your content with WordPress plugins such as Yoast SEO and FD Word Statistics, while there are also many online tools, including Readability-Score and the Readability Test Tool.

Beyond the goal of readability, you still need to add relevant keywords to your text, as they still affect search rankings, provided they are added naturally and bring value and context to the piece.

8. Search engines love new content

Everyone loves fresh content, including search engines. By regularly creating new content you are increasing the chances to become an authority in your field, which may favour your ranking and boost your traffic.

You don’t have to create new content daily, but consistency and relevance will be highly appreciated, both by your audience and the search engines.

As you keep adding more valuable content, visitors will keep coming back, increasing the engagement while building your credibility.

Never sacrifice quality for quantity though, as this won’t be appreciated by either readers or search engines.

from Search Engine Watch https://searchenginewatch.com/2016/05/11/seo-101-eight-ways-to-optimise-your-blog-posts-for-search/

Why should you focus on multiple keywords?

In Yoast SEO Premium you’re able to focus on multiple keywords. If you use our tool correctly, your text can be optimized for up to five keywords. In this post, I’ll explain why it’s important to use the multiple focus keyword functionality while optimizing your text.

[Image: how to use multiple focus keywords]

Explaining (multiple) focus keywords

The Yoast SEO plugin helps you to optimize each and every post (or page) you write. Imagine yourself having a travel blog. For your travel blog, you’re writing a blog post about a road trip through California. The focus keyword is the word or phrase your audience will use in the search engines and for which you want your post to rank. In order to choose your focus keyword wisely, you should do some research! In our example, the most important keyword would be ‘road trip California’. Sometimes it’s hard to choose one keyword because you want a post to rank for more than one specific focus keyword. Perhaps you would also like to rank for a synonym or for a slightly different keyword. That’s when the multiple focus keywords come in handy! Let’s look at 4 examples in which optimizing for multiple keywords is the best strategy.

Synonyms

People search for different things. While some people will use the term road trip when searching for their vacation, others could very well use vacation, holiday or trip. To reach different groups of people, you should make sure that your post will rank for these different keywords.

More than one topic

Sometimes a post is about more than one topic or has a few subtopics. Our article about the road trip to California could be about planning for the road trip, as well as sightseeing in California. These two topics could very well fit into one article. In this case, you would like your article to rank for ‘sightseeing California’ as well as for ‘planning road trip’. And, you’d also like to rank for your most important keyword ‘road trip California’.

[Image: multiple topics shown in Google Trends]

Long tail keyword variants

A great strategy to get your content to rank in Google is to focus on long tail keywords. Long tail keywords will have far less competition and will be relatively easy to rank for.

If you were able to rank for multiple long tail keywords with one post, that would make it even more fruitful. Addressing multiple long tail variants of your focus keyword is a great strategy: optimizing your post for different variants gives you the opportunity to be found for more search terms. In our example, one could, for instance, focus on ‘road trip California’ and on two long tail variants: ‘road trip southern California’ and ‘road trip northern California’.

[Image: long-tail keyword variants shown in Google Trends]

Key phrases

If people seek something rather specific, they tend to use key phrases. Sometimes the order of the words within these key phrases (and the use of stopwords) is important; in that case, we would advise you to optimize your post for different variations of your focus keyword.

While investigating how Google handles stopwords, we found that a search term like ‘road trip California’ is handled in exactly the same manner as ‘California road trip’. The order of the words is irrelevant to Google. However, for the search [road trip in California], Google tries to find the exact match (and there the order of the words is important). So search queries with stopwords seem to be handled a bit differently by Google.

[Image: key phrase differences shown in Google Trends]

How to use multiple focus keywords

Optimizing your post for multiple focus keywords is really easy! You should purchase Yoast SEO Premium and click on the tab in the Yoast SEO Premium box to add a new keyword:

[Image: click the plus sign to add a focus keyword]

A new box will open and you can enter the second focus keyword you’d like to optimize your post for:

[Image: focus keyword input field]

The plugin will run a check on the content to see if your post is optimized for all the focus keywords you entered.

Read more: ‘Blog SEO: befriend the long tail’ »

from Yoast • The Art & Science of Website Optimization https://yoast.com/focus-multiple-keywords/

How Reposting Old Blog Content Will Boost Your Traffic

“Content is king” is an adage that has been adopted and practiced by every successful blogger across the web for years. So practiced, in fact, that there’s a definite surplus of content. While content continues to flood the internet at an increasing rate, users continue to consume only so much of it. This creates a problem (or opportunity, depending on how you look at it) with blogging strategy. In an attempt to stay at the surface in such a competitive landscape, bloggers are cranking out content at a rate that takes away from the quality of their posts and contributes to the excess information put out there for users.

The effort of content production can be subsidized by reposting old blog content, a practice that can drastically increase your site’s traffic, keep usable content readily available for your readers, and make your writing last longer. Content takes time to produce, so you might as well squeeze as much use out of it as you can. Part of doing that means upcycling and reposting old content, specifically the content that has performed well.

For nearly all blogs, a very small percentage of high-performing posts generates the majority of a site’s traffic. Take the prominent website HubSpot, for example. An analysis of its blog determined that 76% of monthly blog views came from old posts, and similar analyses have revealed the same trend across the board. Knowing that the majority of blog views come from a handful of your older, most successful posts rather than the new material you’re racing the clock to finish, you can plot your next move: capitalizing on things you’ve already written by reposting old blog content.

How to Do It

  • Start by identifying your top posts (most blogging platforms and websites have this information available when you log in through the back end). Look for what’s called “evergreen” content: pieces that remain relevant and fresh in concept for users, even though time has passed since they were originally published.
  • Update the content. Stay on topic with what the post is about, but update images, add a little more information, or rewrite a paragraph.
  • Evaluate your keywords. Did you miss any the first time? Can you add a little to the content to include some new keywords?
  • Don’t change the URL, but refresh the time stamp to reflect the current date.

Why It Works

It may seem like a practice that’s too easy to work, but it does. If the content is evergreen, it will still create a lot of user engagement by striking a relevant nerve with your audience. By tweaking your content just a little and reposting it at a later date, you can reach new users who didn’t see the post the first time and who will like, link, share, and comment.

Not to be confused with duplicate content, reposted content will yield positive effects on search engine rankings. Google values quality content that performs well and offers real value to users. When you have content that has performed well, offers value, but gets buried by constant content production, reposting is a way to resurrect and reuse it.

from HigherVisibility http://www.highervisibility.com/blog/how-reposting-old-blog-content-will-boost-your-traffic/

Here’s a new content marketing strategy documentation map

The majority of enterprise content marketers don’t have a documented strategy, according to recent research. The CMI found that almost two thirds of professional content folk haven’t yet bothered to write down their strategy.

In some circles that’s akin to not having a strategy at all, but I don’t find it particularly surprising. Plenty of experienced, established teams seem to work without documentation in place, but it seems to me that content marketing has evolved to the point where it’s really easy to lose focus.

I’m currently going through the process of establishing a content strategy from scratch and thought I’d share what I’m doing, which I’ve summarised in the visualisation below. I guess each of these areas could be a chapter heading in a handy reference guide for the team.

[Image: Content marketing strategy document map]

All pretty top level, you understand, but I’ll explain a bit about each area below.

What do we mean by ‘document’?

There’s no standard template for this, so far as I know. In any event, what’s right for me might be wrong for you. But I would say that your ‘documentation’ should amount to more than a simple mission statement.

Big picture strategy slogans are one thing, but to actually make things happen, you need a lot of detail. Your team needs to know where to look – or who to look to – when they want something.

If you don’t have proper documentation in place, then they will look to you, and you will turn into a repetitive answer machine. Heavy bummer.

It might be that a lot of the supporting documentation already exists, in some shape or form. It’s just that it is unfinished, or out of date, or unstructured, and very possibly unshared. Why not put some time aside to get things together?

Assembling a collection of useful documents – alongside a goal-orientated series of targets – will help you to keep things on track. Your team will thank you for it, especially newcomers.

Let’s go through the four key areas to think about (Goals, Tactics, People, Processes), and the three others that are loosely filed under productivity (Assets, Tools and Tech).

In business, everything revolves around goals, unless you’re batshit crazy, so I’ll start there.

Goals

Content marketing teams exist to support all kinds of businesses goals. Some are more important than others. Goals can be strategic, tactical or based around task completion. Macro, micro, nano. Company, department, team member. Or mission, campaign, task.

Goals should be written down and ideally visible across teams, since you rarely work in a vacuum. Performance stats should be visible too, because transparency is a winning ticket.

Note that you always, always, always need a feedback loop, to measure what works and what doesn’t. Without that you cannot hope to function properly, nor maximise your chances of success. Nor, for that matter, demonstrate ROI (or the lack of it).


When it comes to goals, there are three main things to sort out…

Mission statement. This is your elevator pitch, and can probably be condensed into a sentence or so. You want to cram as much meaning and clarity in these few words as possible, to quickly answer questions such as “why are we investing in content marketing?”

Targets. You can use tools to set, assign and monitor goals, or just put something together in Google Docs and share it with whoever needs to see it.

Metrics. Once clearly defined goals and targets have been set you can take some measurements and track metrics as you move forwards. Set up your analytics reports and monitor performance as you progress.

Tactics

Once you know what your goals are, you can figure out how to go about achieving them. This is where tactics come into play.


Research. Gut feel is a fine place to start, but tactics should be based around insight, rather than opinions. This calls for some research. Use whatever sources of data and information you can to build up a picture of the world according to your target audience.

Audience research. Figure out needs, where they like to hang out, what makes them tick, what they respond to, which competitors they talk about, who their friends are, who they respect… that kind of stuff.

Customer research. You need to know who your existing customers are before finding similar people. How do customers interact with your brand? What works?

Competitor research. It’s worth having a sniff around but there’s no need to obsess over competitor activity. Worry about your own game. Planning is a natural extension of worrying.

Personas and user journeys. Put together some personas, user stories, customer journeys, and make sure everybody is aware of the paths you want visitors to take.

Keyword research. This is rather more audience-centric than the foundational technical SEO basics, such as making your site fast. Search queries reflect consumer intent, and it is your job to create the kind of content that ranks well for target phrases.

Keyword research works best when it is truly strategic, with content mapped to specific business goals. Your content comes only after you have defined and prioritised your keyword wishlist. Or you’re doing it wrong.

Incidentally, I pretty much live by Dan Shure’s brilliant article on using the Keyword Planner in a creative way.

Funnel. How do your customers actually become customers? Understand the various journeys through the funnel. See what’s working, and think about how your content can play a role at each stage.

Content mapping. Great, you’ve mapped content throughout the funnel, but what happens after somebody has become a customer? Increasing retention and customer advocacy are two of the best things you can do in business, and your content can go a long way in supporting these primary business goals. Take ownership, if necessary.

[Image: Content mapping across the funnel]

Formats. After you’ve done your homework, you can start to think about the actual content. Thoughts will turn to the type of content you might create, and the formats you can use. What is possible, given your team, your budget and your platforms?

Distribution. Hold up, cowboy. Don’t let the tail wag the dog. In this case the tail is content. And the dog, well, that’s distribution. Simply put, why are you making a video when you haven’t given a moment’s thought to YouTube? And what feeds YouTube? Ah yeah, reddit does.

These channels are potentially going to be the difference between a small win in local circles and a global viral. Why wouldn’t you want to optimise your distribution channel strategy?

How will people find your article? What’s your social strategy? Are you going to do any paid distribution? How are you going to nail down some excellent Google placements?

Figure out how to get the best out of your main channels, and you’ll get way more bang for your buck from each piece of content.

SEO. The devs probably need to be informed about your preferred search setup. Right?

Once you’ve got this together it becomes much easier to direct your efforts, and change tack if necessary.

The main success factor will be linked to the quality of content you create, and that’s something that you can also provide documented guidance on. Share internal and external knowledge, and make it easily accessible across teams.

People

You may know who everybody is and why they matter, but does the rest of your team? Think about the various people who stand to benefit from your success, and always remember the ones who took on some risk when you started out.


Stakeholders. Not just the boss, but heads of other teams that will be affected by your efforts. Who are they? What do they need to be effective in their jobs? How can content marketing support their primary goals? Also, what have you promised? Make the business case readily available to your team so they know what is expected of them. Share presentations and team goals.

Content team. Who’s in the dream team? Who is your star player? What is everybody focusing on? How does everybody communicate? That might be as straightforward as sharing a simple organogram and a bunch of invitations to Slack.

Other teams. Who do you ask for a new button to be designed? Where do the company mugshots live? Is there a shared Dropbox folder? What are the guidelines around using this stuff?

External talent. Maybe you hired a PR agency, who should be kept in the loop about major content campaigns on the horizon. Maybe you have three freelance writers who don’t work in your office. And that weird guy who makes kickass videos from a shady basement. How will these people work together and where do their contact details live for when somebody needs something?

Influencers. This is really important: know who you want to get friendly with. These community-anointed leaders of tribes can help you in a big way. What can you do to attract their attention? I tend to store influencers (including media lists) in a spreadsheet. Other people’s Twitter Lists can be a goldmine.

Processes

This is where the action happens. What are the things to do before and after publication? What do you need to do to get a piece of content over the line?


Brainstorming. Where do we store our ideas? How often do we get together? What tools do we use? What’s the formula for deciding what kind of content to create? Whiteboard sessions and mindmaps all play their part.

Workflow. How do we operate as a team, and as team members? How should we work with other teams? What is the process for submitting work?

What tools do we use? I’ve played around with Trello to Basecamp to Google Docs but have never settled on one goal-orientated platform (so I’m actually building one). People use tools differently and there is often some kind of protocol to follow, otherwise your world looks like to-do list spaghetti.

Taxonomy. What’s that you say about metadata? What is the common vocabulary for labels, tags and categorisation? Should I write Ecommerce or ecommerce or e-commerce or E-commerce or eCommerce? How does the tech work to support this kind of thing? If you have some rules in place, then you should police them.

Checklists. What needs to be done prior to publication? Or sign off? What boxes need to be ticked? Did you sense check everything?

Sign Off. Is there a sign off process? Who has final say? Do we really need to run everything past the PR agency? Who has publishing rights? Who is allowed to edit?

The Other Stuff

Assets, tech and tools pretty much sit between the three key areas and the goals. I see these things as being very much at the heart of the practical: used, referenced and updated during the production phase.

Assets are things like brand guidelines, which should cover all of the dos and don’ts you need to know before publishing even the smallest status update. Authors must know your brand inside out before they represent it, right?

You’ll also need a house style guide, for content creators and editors. And ideally some pointers about things like when to publish, or how to write amazing headlines.

You’ll also be primed for success if you go to the trouble of creating (and maintaining) a schedule, be that a shared calendar or a loose, spreadsheet-based plan of action. Put some dates in the diary, get some targets in place, and watch out for the things on the verge of falling off the radar (or worse, the dreaded blockers).

Tech covers off the various platforms you will use (owned, earned, rented, paid, etc). That might mean a blog, a YouTube channel or a paid media channel. It’s probably all three.

Tech also points to your kit, and how the tech team can help improve efficiency and performance. For example, if you’re blogging, what are your CMS needs? How could the editing interface be improved? How should you report bugs? This might mean JIRA tickets, or something similar, so let your team know about how best to wave flags.

Platforms and technology can be optimised, which is where UX comes into play. Content lives at the heart of UX, but there are obviously factors outside of the content team’s control. Be sure to bang the drum if your site is slow, or if something is broken.

UX also covers persuasion, which is something of an artform among switched-on content marketers.

Then we have Tools, which primarily sit between people and process, and should help you to get things done. Pretty self-explanatory.

In summary

It’s worth pausing for thought if you are part of an existing team and you don’t have the right documentation in place. Where should I look for that style guide? Exactly what kind of person am I writing for, and why? Who should sign this off? These are questions that no right-minded team leader wants to answer on a daily basis.

Or maybe, like me, you’re starting something up, or you have a new client and a blank page. It’s tempting to jump straight into content creation, but in the long run it’s going to be way better to put a well-documented plan of attack in place, with goals and supporting assets all neatly lined up.

Either way, it’s worth regularly reviewing your strategy and updating your documentation, especially when adjusting course. To that end, I created The Content Strategy Canvas to help you get together a top level picture of what you have going on (click the pic for a big, hi-res version).

[Image: The Content Strategy Canvas]

The canvas appears overly simplistic, but it is meant to be that way. It is a visual tool to help quickly communicate the key aspects of strategy on one page. No fluff required. The other documentation you might assemble having read this post will fill in the gaps.

And lo, you will become a cherished hero.

Anyway, that’s where I’m at. I’ll share a few specific content marketing templates in the future. I’d certainly love to hear any feedback and other approaches, so do leave a comment below or get in touch.

from Search Engine Watch https://searchenginewatch.com/2016/05/11/heres-a-new-content-marketing-strategy-documentation-map/

rel=canonical: the ultimate guide

The canonical URL allows you to tell search engines that certain similar URLs are actually one and the same. Sometimes you have products or content that is accessible under multiple URLs, or even on multiple websites. Using rel=canonical you can have this without harming your rankings.

History of rel=canonical

In February 2009 Google, Bing and Yahoo! introduced the canonical link element. Matt Cutts’ post is probably the easiest reading if you want to learn about its history. While the idea is simple, the specifics of how to use it turn out to be complex.

The rel=canonical element, often called the “canonical link”, is an HTML element that helps webmasters prevent duplicate content issues. It does this by specifying the “canonical”, or “preferred”, version of a web page. Using it well improves a site’s SEO.

The idea is simple: if you have several similar versions of the same content, you pick one “canonical” version and point the search engines at that. This solves the duplicate content problem where search engines don’t know which version of the content to show. This article takes you through the use cases and the anti-use cases.

The SEO benefit of rel=canonical

Choosing a proper canonical URL for every set of similar URLs improves the SEO of your site. Because the search engine knows which version is canonical, it can count all the links towards all the different versions, as links to that single version. Basically, setting a canonical is similar to doing a 301 redirect, but without actually redirecting.

The process of canonicalization

When you have several choices for a product’s URL, canonicalization is the process of picking one. In many cases, it’ll be obvious: one URL will be better than others. In some cases, it might not be as obvious, but then it’s still rather easy: pick one! Not picking a canonical URL is always worse than picking one.


How to set canonical URLs

Correct example of using rel=canonical

Let’s assume you have two versions of the same page. Exactly, 100% the same content. They differ in that they’re in separate sections of your site and because of that the background color and the active menu item differ. That’s it. Both versions have been linked from other sites, the content itself is clearly valuable. Which version should a search engine show? Nobody knows.

For example’s sake, these are their URLs:

  • http://example.com/wordpress/seo-plugin/
  • http://example.com/wordpress/plugins/seo/

This is what rel=canonical was invented for. Especially in a lot of e-commerce systems, this (unfortunately) happens fairly often. A product has several different URLs depending on how you got there. You would apply rel=canonical as follows:

  1. You pick one of your two pages as the canonical version. It should be the version you think is the most important one. If you don’t care, pick the one with the most links or visitors. If all of that’s equal: flip a coin. You need to choose.
  2. Add a rel=canonical link from the non-canonical page to the canonical one. So if we picked the shortest URL as our canonical URL, the other URL would link to the shortest URL like so in the <head> section of the page:
    <link rel="canonical" href="http://example.com/wordpress/seo-plugin/">

    That’s it. Nothing more, nothing less.

What this does is “merge” the two pages into one from a search engine’s perspective. It’s basically a “soft redirect”, without redirecting the user. Links to both URLs now count for the single canonical version of the URL.

Setting the canonical in Yoast SEO

If you use Yoast SEO, you can change the canonical of several page types using the plugin. You only need to do this if you want to change the canonical to something different than the current page’s URL. Yoast SEO already renders the correct canonical URL for almost any page type in a WordPress install.

For posts, pages and custom post types, you can edit the canonical in the advanced tab of the Yoast SEO metabox:

[Image: the canonical field in Yoast SEO]

For categories, tags and other taxonomy terms, you can change them in the Yoast SEO metabox too, in the same spot. If you have other advanced use cases, you can always use the wpseo_canonical filter to change the Yoast SEO output.

When should you use canonical URLs?

301 redirect or canonical?

If you have the choice of doing a 301 redirect or setting a canonical, what should you do? The answer is simple: if there are no technical reasons not to do a redirect, you should always do a redirect. If you cannot redirect because that would break the user experience or be otherwise problematic: set a canonical URL.

Should a page have a self-referencing canonical URL?

In the example above, we make the non-canonical page link to the canonical version. But should a page set a rel=canonical for itself? This is a highly debated topic amongst SEOs. At Yoast we have a strong preference for having a canonical link element on every page, and Google has confirmed that’s best. The reason is that most CMSes will allow URL parameters without changing the content. So all of these URLs would show the same content:

  • http://example.com/wordpress/seo-plugin/
  • http://example.com/wordpress/seo-plugin/?isnt=it-awesome
  • http://example.com/wordpress/seo-plugin/?cmpgn=twitter
  • http://example.com/wordpress/seo-plugin/?cmpgn=facebook

The issue: if you don’t have a self-referencing canonical on the page that points to the cleanest version of the URL, you risk being hit by this stuff. Even if you don’t do it yourself, someone else could do this to you and cause a duplicate content issue. So adding a self-referencing canonical to URLs across your site is a good “defensive” SEO move. Luckily for you, our Yoast SEO plugin does this for you.
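
To make that concrete, a minimal sketch (example.com is illustrative): every one of the URLs above, including the parameterized variants, would carry the same self-referencing canonical, so only the clean version gets indexed:

    <link rel="canonical" href="http://example.com/wordpress/seo-plugin/">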

Cross-domain canonical URLs

You might have the same piece of content on several domains. For instance, SearchEngineJournal regularly republishes articles from Yoast.com (with explicit permission). Look at every one of those articles and you’ll see a rel=canonical link point right back at our original article. This means all the links pointing at their version of the article count towards the ranking of our canonical version. They get to use our content to please their audience, we get a clear benefit from it too. Everybody wins.

Faulty canonical URLs: common issues

There is a multitude of cases out there showing that a wrong rel=canonical implementation can lead to huge issues. I know of several sites that had the canonical on their homepage point to an article, and completely lost their home page from the search results. There are more things you shouldn’t do with rel=canonical. Let me list the most important ones:

  • Don’t canonicalize a paginated archive to page 1. The rel=canonical on page 2 should point to page 2. If you point it to page 1 search engines will actually not index the links on those deeper archive pages…
  • Make them 100% specific. For various reasons, many sites use protocol-relative links, meaning they leave the http / https bit off their URLs. Don’t do this for your canonicals. You have a preference. Show it.
  • Base your canonical on the request URL. If you use variables like the domain or request URI used to access the current page while generating your canonical, you’re doing it wrong. Your content should be aware of its own URLs. Otherwise, you could still have the same piece of content on for instance example.com and www.example.com and have them both canonicalize to themselves.
  • Multiple rel=canonical links on a page causing havoc. Sometimes a developer of a plugin or extensions thinks that he’s God’s greatest gift to mankind and he knows best how to add a canonical to the page. Sometimes, that developer is right. But since you can’t all be me, they’re inevitably wrong too sometimes. When we encounter this in WordPress plugins we try to reach out to the developer doing it and teach them not to, but it happens. And when it does, the results are wholly unpredictable.
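
As an illustration of that last point, a page left with conflicting canonical elements by two different plugins might look like this (the URLs are hypothetical), with the wholly unpredictable results described above:

    <link rel="canonical" href="http://example.com/page-a/">
    <link rel="canonical" href="http://example.com/page-b/">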

rel=canonical and social networks

Facebook and Twitter honor rel=canonical too. This might lead to weird situations. If you share a URL on Facebook that has a canonical pointing elsewhere, Facebook will share the details from the canonical URL. In fact, if you add a like button on a page that has a canonical pointing elsewhere, it will show the like count for the canonical URL, not for the current URL. Twitter works in the same way.

Advanced uses of rel=canonical

Canonical link HTTP header

Google also supports a canonical link HTTP header. The header looks like this:

Link: <http://www.example.com/white-paper.pdf>; 
  rel="canonical"

Canonical link HTTP headers can be very useful when canonicalizing files like PDFs, so it’s good to know that the option exists.

Using rel=canonical on not so similar pages

While I won’t recommend this, you can definitely use rel=canonical very aggressively. Google honors it to an almost ridiculous extent, where you can canonicalize a very different piece of content to another piece of content. If Google catches you doing this, it will stop trusting your site’s canonicals and thus cause you more harm…

Using rel=canonical in combination with hreflang

In our ultimate guide on hreflang, we talk about canonical. It’s very important that when you use hreflang, each language’s canonical points to itself. Make sure that you understand how to use canonical well when you’re implementing hreflang as otherwise you might kill your entire hreflang implementation.
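
A minimal sketch of a correct combination (the URLs are hypothetical): on the English page, the canonical points to the English page itself, never to the other language:

    <!-- on http://example.com/en/seo-plugin/ -->
    <link rel="canonical" href="http://example.com/en/seo-plugin/">
    <link rel="alternate" hreflang="en" href="http://example.com/en/seo-plugin/">
    <link rel="alternate" hreflang="nl" href="http://example.com/nl/seo-plugin/">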

Conclusion: rel=canonical is a power tool

Rel=canonical is a powerful tool in an SEO’s toolbox, but like any power tool, you should use it wisely as it’s easy to cut yourself. For larger sites, the process of canonicalization can be very important and lead to major SEO improvements.

Read more: ‘Duplicate content: causes and solutions’ »

from Yoast • The Art & Science of Website Optimization https://yoast.com/rel-canonical/

Adopting a consumer mindset for your SEO strategy

Search engines continue to adjust algorithms to better match how consumers actually conduct their searches. How should this affect our optimization efforts?

Consumers aren’t born in a vacuum, nor do they live in one. They’re inundated with a brand’s messages, ideas, discussions, and controversy across all of their digital and offline encounters.

As businesses and marketers, we must not only educate novice buyers on the benefits of a product, but also engage educated potential customers who already have some knowledge of their choices and are making final decisions between products.

Identify consumer needs

The first step in this process is identifying the needs of the consumer at each of the stages where they’ll be searching for products.

What drives their curiosity? What is creating their need? A strategic response to this question can be pursued in several different ways.

A good way to begin is surveying your website visitors.

Onsite surveys can be a powerful way to gain an understanding of customers and what information they’re after. Think of engaging ways to talk to your audience in this way, putting them on their journey to being a customer.

[Image: NPS survey]

Customer tracking

You may have already started tracking visitor activities in your analytics platform – analytics that you can tie to a stage in your customer’s lifecycle.

In the B2B sector, this can mean a visitor downloading a whitepaper that gives some entry-level explanation of your product category, or, on the B2C side, could include a customer adding a specific product to his shopping cart on your website but not completing the purchase.

These are all opportunities to re-engage audiences by understanding how they arrived at their current stage.

The next step is employing a marketing automation system that helps communicate with customers throughout their journey. Revising and fine-tuning this system can pay extensive dividends. Take the time to review your customer touchpoints and make sure customers at every leg of the journey are getting the information they need at that stage.

Competitor analysis

You should also take time to understand your competitors’ offerings, how they are presented, and decipher how they are interpreting the customer journey.

Through awareness and insight into your competitors, you can recognize communication opportunities for those educated consumers who have a keen understanding of the marketplace.

From there, you can offer them valuable resources that may not be available anywhere else.

Content and the customer journey

So you now have all of these data points, and can identify needs and curiosities across your customer journey – how does that play into a search engine optimization strategy?

The next phase is evaluating your content and aligning it to the customer journey. This should uncover any gaps in your content where adding helpful resources will likely prove beneficial to your consumers.

Are there pieces of content that would make your customers’ decision process quicker or smoother? These resources may provide value simply by giving your customers more assurance that they are making the right decision (remember that competitor analysis you did?).

With content gaps identified, is there a variation of that content theme that prospective buyers are searching for more than something else? This is where you start your keyword research.

[Image: Content Keywords in Google Search Console]

As you are working through that keyword research, though, always keep customer intent in mind. Just because a keyword has a high search volume and low competition does not necessarily mean that the related content is right for your audience.

Don’t be afraid to test content and get feedback from your customers (and even from your potential customers). Understanding how your audience uses and digests your content will help inform the shape that your future content should take.

Use your analytics to continually gauge the effectiveness of your content. Try changing up headlines, where the links to content pieces are located, and which pieces of content get priority placement on the page.

All of these tests can expose those hidden gems of design wisdom that make your site the resource your potential and existing customers turn to when they need answers.

The bottom line

You need to really get to know how your customers think, develop content that they need across their entire journey, and test to see what works best at each stage.

Your success will come not only from customers that experience satisfying visits to your site, but also from delivering better experiences for potentially your most influential site visitors of all – the search engines.

Kevin Gamache is Senior Search Strategist at Wire Stone, an independent digital marketing agency.

from Search Engine Watch https://searchenginewatch.com/2016/05/10/adopting-a-consumer-mindset-for-your-seo-strategy/

How Nordstrom strategically beat Zappos in Google Search

Five years ago, in April 2011, Zappos’ market share in Google was more than three times as large as Nordstrom’s.

Today, Nordstrom has twice the market share on Google as Zappos.

[Image: Visibility index of zappos.com vs. nordstrom.com]

Between April 2011 and December 2012, Zappos.com managed to increase its market share by 51% (going from a visibility score of 42.9 to 63.42 points), while Nordstrom grew its visibility from roughly 13 points to 54.9, a huge 302% jump in market share.

At this point, they became a direct competitor to Zappos, with both domains having 50% of their keywords in Google in common.

In September 2013, Nordstrom.com took off, leaving Zappos.com in the dust. Since then, Nordstrom.com has continuously increased its market share, climbing by 65% from September 2013 until today, with a visibility score of 90.78 points.

During the same time, Zappos.com continuously lost market share and ended up with a 37.32% loss, dropping from 63.42 to 39.75 points.

What happened?

If I see something like the above, I like to quickly compare both domains. You can run such a comparison in the Toolbox by simply typing the domains into the search bar, separated by a comma: zappos.com,nordstrom.com

We can see that links are not a problem for Zappos. They actually have nearly a million and a half links from about 5,000 domains, more than Nordstrom has.

[Image: Competitive comparison of Zappos.com and Nordstrom.com]

The thing that quickly catches the eye is the large discrepancy in the number of indexed pages for the two domains. Zappos has a whopping 56 million pages indexed, and that can be a huge problem.

If we look at the number of keywords for which Zappos has a Top 100 ranking (17,000) and compare it to the number of indexed pages, we get a ratio of 3,310 indexed pages for every keyword in the Top 100.
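
Roughly, with the reported (rounded) figures:

    56,000,000 indexed pages ÷ ~17,000 top-100 keywords ≈ 3,300 indexed pages per keyword

(the more precise 3,310 presumably comes from the unrounded counts).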

If we compare this number to some other domains, we see how inefficient this is:

  • Wikipedia has a ratio of 107
  • Walmart has a ratio of 135
  • Nordstrom has a ratio of 182
  • Amazon has a ratio of 250
  • Zappos has a ratio of 3,310

Take a look at this:

[Image: Comparing the indexed pages for a specific product]

This huge number of indexed pages is a big problem for Zappos’ crawling and indexing budget. The real enemies of both the crawling and indexing of large websites are web developers, JavaScript and chaos in general.

Let me show you some additional examples:

Zappos has quite a large number of product pages in the Google index which are no longer available. At the same time, these pages are set to index/follow.

[Image: robots meta tag on an unavailable Zappos product page]

Ironically, they also have popular products, which ARE available, set to noindex/follow.

[Image: available products set to noindex/follow]

Together, these problems cause Google to crawl unnecessary URLs, which depletes the crawl budget for the domain. And this crawling power will be sorely missed, especially on such extensive projects.
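
For reference, these directives live in the robots meta tag; a minimal sketch of the two states described above (page markup omitted):

    <!-- a discontinued product page, still allowed into the index -->
    <meta name="robots" content="index,follow">

    <!-- an available product page, excluded from the index -->
    <meta name="robots" content="noindex,follow">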

Additionally, this crawling budget will define how often Googlebot crawls the first few levels of the domain and how often a deep crawl will take place.

We see something similar with the indexing budget: this budget decides on the maximum number of URLs which will be added to the Google index.

It is important to keep in mind that only URLs which are crawled regularly will stay within the index.

It could all be so easy. In theory, every piece of content should have a unique, logical, easy to understand URL, which stays exactly the same over the decades.

Sadly, this utopia does not hold up in the real world: web developers decide to create a third print version of a page, Googlebot learns a bit more JavaScript and suddenly invents completely new URLs, and the website gets its third CMS relaunch in two years, which leaves the original URL concept in tatters.

All of this will end the same way: Google will crawl unnecessary URLs and waste the domain’s crawling budget.

Conclusion

We can see that Nordstrom decided to compete with Zappos on about 50% of its keywords. For quite a while, both domains competed directly at the same level of visibility. In the end, though, Zappos’ on-page problems and a change in user behaviour have led to a stark contrast in visibility between the two domains.

If we look at which keywords both domains rank for, we notice that, in the beginning, Nordstrom only ranked for 23% of the keywords which Zappos had. Only three years later, Nordstrom already managed to rank for 50% of Zappos’ keywords.

This change shows us that Nordstrom actually decided to actively work on competing with Zappos. Today the tables have turned and Nordstrom directly competes on 67% of Zappos’ keywords.

[Image: Number of keywords which Zappos.com and Nordstrom.com have in common]

When we talk about user behaviour, we mean that, if a user has a choice between a result on Nordstrom and one on Zappos, they will decide to go to Nordstrom.com. We can see this thanks to Google Trends. User interest in the two domains parted ways in mid-2012, just as the direct competition started.

This post was originally published on the Sistrix blog, reprinted by permission.

from Search Engine Watch https://searchenginewatch.com/2016/05/10/how-nordstrom-strategically-beat-zappos-in-google-search/

The (Hollow) Soul of Technology

The Daily Obituary

As far as being an investable business goes, news is horrible.

And it is getting worse by the day.

Look at these top performers.

The above chart looks ugly, but in reality it puts an optimistic spin on things…

  • it has survivorship bias
  • the Tribune Company has already gone through bankruptcy
  • the broader stock market is up huge over the past decade after many rounds of quantitative easing and zero (or even negative) interest rate policy
  • the debt carrying costs of the news companies are also artificially low due to the central banking bond market manipulation
  • the Tribune Company recently got a pop on a buy out offer

Selling The Story

Almost all the solutions to the problems faced by the mainstream media are incomplete and ultimately will fail.

That doesn’t stop the market from selling magic push button solutions. The worse the fundamentals get, the more incentive (need) there is to sell the dream.

Video

Video will save us.

No it won’t.

Video is expensive to do well, and almost nobody at any sort of scale on YouTube has an enviable profit margin. Even the successful individuals who are held up as examples of success are being squeezed out, and Google is pushing to make the site more like TV. As it gets buy-in from big players it will further squeeze out the indie players – just like general web search.

Even if TV shifts to the web, along with chunks of the associated ad budget, most of the profits will be kept by Google & ad tech management rather than flowing to publishers.

Some of the recent acquisitions are more about having more scale on an alternative platform or driving offline commerce rather than hoping for online ad revenue growth.

Expand Internationally

The New York Times is cutting back on their operations in Paris.

Spread Across Topics

What impact does it have on MarketWatch’s brand if you go there for stock information and they advise you on weight-loss tips?

And, once again, when everyone starts doing that it is no longer a competitive advantage.

There have also been cases where newspapers like The New York Times acquired About.com only to later sell it for a loss. And now even About.com is unbundling itself.

Native Ads

The more companies who do them & the more places they are seen, the lower the rates go, the less novel they will seem, and the greater the likelihood a high-spending advertiser decides to publish it on their own site & then drive the audience directly to their site.

When it is rare or unique it stands out and is special, justifying the extra incremental cost. But when it is a scaled process it is no longer unique enough to justify the vastly higher cost.

Further, as it gets more pervasive it will lead to questions of editorial integrity.

Get Into Affiliate Marketing

It won’t scale across all the big publishers. It only works well at scale in select verticals and as more entities test it they’ll fill up the search results and end up competing for a smaller slice of attention. Further, each new affiliate means every other affiliate’s cookie lasts for a shorter duration.

It is unlikely news companies will be able to create commercially oriented review content at scale while having the depth of Wirecutter.

“We move as much product as a place 10 times bigger than us in terms of audience,” Lam said in an interview. “That’s because people trust us. We earn that trust by having such deeply-researched articles.”

Further, as it gets more pervasive it will lead to questions of editorial integrity.

Charging People to Comment

It won’t work, as it undermines the social proof of value the site would otherwise have from having many comments on it.

Meal Delivery Kits

Absurd. And a sign of extreme desperation.

Trust Tech Monopolies

Here is Doug Edwards on Larry Page:

He wondered how Google could become like a better version of the RIAA – not just a mediator of digital music licensing – but a marketplace for fair distribution of all forms of digitized content. I left that meeting with a sense that Larry was thinking far more deeply about the future than I was, and I was convinced he would play a large role in shaping it.

If we just give Google or Facebook greater control, they will save us.

No they won’t.

You are probably better off selling meal kits.

As time passes, Google and Facebook keep getting a larger share of the pie, growing their rake faster than the pie is growing.

Here is the RIAA’s Cary Sherman on Google & Facebook:

Just look at Silicon Valley. They’ve done an extraordinary job, and their market cap is worth gazillions of dollars. Look at the creative industries — not just the music industry, but all of them. All of them have suffered.

Over time media sites are becoming more reliant on platforms for distribution, with visitors having fleeting interest: “bounce rates on media sites having gone from 20% of visitors in the early 2000s to well over 70% of visitors today.”

Accelerated Mobile Pages and Instant Articles?

These are not solutions. They are only a further acceleration of the problem.

How will giving greater control to monopolies that are displacing you (while investing in AI) lead to a more sustainable future for copyright holders? If they host your content and you are no longer even a destination, what is your point of differentiation?

If someone else hosts your content & you are dependent on them for distribution, you are competing against yourself with an entity that can arbitrarily shift the terms on you whenever they feel like it.

“The cracks are beginning to show, the dependence on platforms has meant they are losing their core identity,” said Rafat Ali. “If you are just a brand in the feed, as opposed to a brand that users come to, that will catch up to you sometime.”

Do you think you gain leverage over time as they become more dominant in your vertical? Not likely. Look at how Google’s redesigned image search shunted traffic away from photographers. Google’s remote rater guidelines even mentioned giving lower ratings to images with watermarks on them. So if you protect your works you are punished & if you don’t, good luck negotiating with a monopoly. You’ll probably need the EU to see any remedy there.

When something is an embarrassment to Google & can harm their PR, fixing it becomes a priority; otherwise most of the costs of rights management fall on the creative industry & Google will go out of their way to add cost to that process. Facebook is, of course, playing the same game with video freebooting.

Algorithms are not neutral and platforms change what they promote to suit their own needs.

As the platforms aim to expand into new verticals they create new opportunities, but those opportunities are temporal.

Whatever happened to Zynga?

Even Buzzfeed, the current example of success on Facebook, missed their revenue target badly, even as they become more dependent on the Facebook feed.

“One more implication of aggregation-based monopolies is that once competitors die the aggregators become monopsonies — i.e. the only buyer for modularized suppliers. And this, by extension, turns the virtuous cycle on its head: instead of more consumers leading to more suppliers, a dominant hold over suppliers means that consumers can never leave, rendering a superior user experience less important than a monopoly that looks an awful lot like the ones our antitrust laws were designed to eliminate.” – Ben Thompson

Long after benefit stops passing to the creative person the platform still gets to re-use the work. The Supreme Court only recently refused to hear the ebook scanning case & Google is already running stories about using romance novels to train their AI. How long until Google places their own AI-driven news rewrites in front of users?

Who then will fund journalism?

Dumb it Down

Remember how Panda was going to fix crap content for the web? eHow has removed literally millions of articles from their site & still has not recovered in Google. Demand Media’s bolt-on articles published on newspaper sites still rank great in Google, but that will at some point get saturated and stop being a growth opportunity, shifting from growth to zero sum to a negative sum market, particularly as Google keeps growing their knowledge scraper graph.

Now maybe if you dumb it down with celebrity garbage you get quick clicks from other channels and long-term SEO traffic doesn’t matter as much.

But if everyone is pumping the same crap into the feed it is hard to stand out. When everyone starts doing it the strategy is no longer a competitive advantage. Further, a business that is algorithmically optimized for short-term clicks is also optimizing for its own long-term irrelevancy.

Yahoo’s journalists used to joke amongst themselves about the extensive variety of Kind bars provided, but now the snacks aren’t being replenished. Instead, employees frequently remind each other that there is little reason to bother creating quality work within Yahoo’s vast eco-system of middle-brow content. “You are competing against Kim Kardashian’s ass,” goes a common refrain.

Yahoo’s billion-person-a-month home page is run by an algorithm that pulls in the best-performing content from across the site, with only a spare editorial staff. Yahoo engineers generally believed that these big names should have been able to support themselves and garner their own large audiences without relying on home page placement. As a result, they were expected to sink or swim on their own.

“Yahoo is reverting to its natural form,” a former staffer told me, “a crap home page for the Midwest.”

That is why Yahoo! ultimately had to shut down almost all their verticals. They were optimized algorithmically for short-term wins rather than building things with long-term resonance.

Death by bean counter.

The above also has an incredibly damaging knock-on effect on society.

People miss the key news. “What articles got the most views, and thus ‘clicks’? Put bluntly, it was never the articles on my catching Bernanke pulling system liquidity into the maw of the collapse in 2008, while he maintained to Congress he had done the opposite.” – Karl Denninger

The other issue is PR outright displacing journalism. As bad as that is at creating general disinformation, it gets worse when people presume diversity of coverage means a diversity of thought process, a diversity of work, and a diversity of sources. Even people inside the current presidential administration state how horrible this trend is for society:

“All these newspapers used to have foreign bureaus,” he said. “Now they don’t. They call us to explain to them what’s happening in Moscow and Cairo. Most of the outlets are reporting on world events from Washington. The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns. That’s a sea change. They literally know nothing.” … “We created an echo chamber,” he told the magazine. “They [the seemingly independent experts] were saying things that validated what we had given them to say.”

That is basically the government complaining to the press about it being “too easy” to manipulate the press.

Adding Echo to the Echo

Much of what “seems” like an algorithm on the tech platforms is actually a bunch of lowly paid humans pretending to be an algorithm.

This goes back to the problem of the limited diversity in original sources and rise of thin “take” pieces. Stories with an inconvenient truth can get suppressed, but “newsworthy” stories with multiple sources covering them may all use the same biased source.

After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm. … A topic was often blacklisted if it didn’t have at least three traditional news sources covering it.

As algorithms take over more aspects of our lives and eat more of the media ecosystem, the sources they feed upon will consistently lose quality until some sort of major reset happens.

The strategy to keep sacrificing the long term to hit the short term numbers can seem popular. And then, suddenly, death.

You can say the soul is gone
And the feeling is just not there
Not like it was so long ago.
– Neil Young, Stringman

Micropayments & Paywalls

It is getting cheap enough that just about anyone can run a paid membership site, but it is quite hard to create something worth paying for on a recurring basis.

There are a few big issues with paywalls:

  • If you have something unique and don’t market it aggressively then nobody will know about it. And, in fact, in some businesses your paying customers may have no interest in sharing your content because they view it as one of their competitive advantages. This was one of the big reasons I ultimately had to shut down our membership site.
  • If you do market something well enough to create demand then some other free sites will make free derivatives, and it is hard to keep having new things to write worth paying for in many markets. Eventually you exhaust the market or get burned out or stop resonating with it. Even free websites have churn. Paid websites have to bring in new members to offset old members leaving.
  • In most markets worth being in there are going to be plenty of free sites in the vertical which dominate the broader conversation. Thus you likely need to publish a significant amount of information for free which leads into an eventual sale. But knowing where to put the free line & how to move it over time isn’t easy. Over the past year or two I blogged far less than I should have if I was going to keep running our site as a paid membership site.
  • And the last big issue is that a paywall is basically counter to all of the above business models the mainstream media is trying. You need deeper content, better content, content that is not off topic, etc. Many of the easy wins for ad-funded media become easy losses for paid membership sites. And just like it is hard for newspapers to wean themselves off of print ad revenues, it can be hard to undo many of the quick-win ad revenue boosters if one wants to change their business model drastically. Regaining your soul takes time, and often, death.

“It’s only after we’ve lost everything that we’re free to do anything.” ― Chuck Palahniuk, Fight Club


Search marketing in China: the rise of so.com

Baidu, the leading Chinese search engine, is the third most popular search engine in the world, despite being mostly concentrated in and around China. That speaks clearly to the immense size and power of the Chinese market.

An estimated 507 million Chinese use search engines. This is an enormous marketplace for companies who want to grow overseas and engage with new prospective customers.

Although Google dominates much of the search engine traffic in North America and Europe, in China it is one of the least popular search engines.

Instead, Baidu, and its rising competitor Qihoo 360, control the landscape. Those interested in doing business in China will need to make sure they understand these search engines if they want to compete.

How is the Chinese market changing? – So.com

The market in China is quickly changing and evolving. Baidu has long dominated the search engine sphere, and they still control an estimated 54% of the search engine market share. Over the past few years, however, there has been a fast rising competitor that is seizing an increasing percentage of the search volume.

baidu serp

Qihoo 360, a security software company, launched its search engine so.com in 2012, and by 2015 it controlled an estimated 30% of the Chinese search market.

Its popularity has likely been influenced by the growth of mobile. By Q3 2014, mobile devices were the leading source of searches and revenue for Chinese search engine marketing, and Qihoo 360 has been responsible for building the most popular app store in China.

How is search engine marketing different in the APAC region than in the US?

Brands that want to expand overseas into the APAC region need to be familiar with the local ranking factors and how to conduct SEO for the popular search engines, particularly Baidu and so.com, as optimizing for one will improve your rankings on both.

Tips for SEO in China:

Do not try to get a website ranked by using automatic translators or just students of the language. Using a native speaker will provide you with an infinitely superior site, as you will be able to avoid major grammatical errors, have the content flow more naturally, select more relevant keywords and use vocabulary that resonates better with the local audience. Your site will fit better overall into the framework of the Chinese digital ecosystem. Translation issues can hurt your reputation and cause you to rank lower on the SERPs.

When setting up a website, you want to try to get a .cn domain. If that is not possible, then seek a .com or .net. Your website should also be hosted in China and you should secure an ICP license from the Chinese Ministry of Industry and Information Technology. Avoid having multiple domains or subdomains.

It is imperative that you know the list of blacklisted words that cannot be posted online. Inclusion of these words can cause your site to be de-indexed or even taken down. Remember that your website cannot criticize the government in any way.

As you build the website, keep your title tags under 35 characters in Simplified Chinese and your meta descriptions below 78 characters in Simplified Chinese.
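As a rough illustration (the company name and copy below are invented for this example), a page head respecting those limits might look like this:

    <head>
      <meta charset="utf-8">
      <!-- Title kept under 35 Simplified Chinese characters -->
      <title>北京网页设计公司 - 专业建站服务</title>
      <!-- Meta description kept under 78 Simplified Chinese characters -->
      <meta name="description" content="我们是北京的网页设计公司，提供专业的建站、优化和维护服务，帮助您的企业在中国市场取得成功。">
    </head>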

Website speed is highly valued. Regularly test your site to make sure it loads quickly. Inbound links are also viewed as valuable for search rankings, so finding opportunities to build a strong backlink profile can be very helpful.

All of the links you create should be in plain HTML. In general, avoiding JavaScript is preferred, because content in that format is sometimes not indexed.
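As a minimal sketch (URLs hypothetical), the difference looks like this; the first link is a plain anchor any crawler can follow, while the second only resolves through JavaScript and so may never be indexed:

    <!-- Preferred: a plain HTML link that crawlers can follow -->
    <a href="https://www.example.cn/products/">产品目录</a>

    <!-- Risky: the destination only exists in JavaScript -->
    <a href="#" onclick="location.href='/products/'; return false;">产品目录</a>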

The Chinese search engines value fresh content. So regularly publishing on your page will help boost your reputation and success. You should submit your blog posts to the Baidu News Feed, which will help you attract new readers to your material.

For businesses interested in expanding into Asia, understanding how the local search engine market is evolving and changing can be critical to creating sites that rank well on the local search engines.

For businesses expanding globally outside of the US, make sure you also optimize for the premium search engines of other key regions, such as Naver (South Korea) and Yandex (Russia)!


Google Search Console: a complete overview

The Search Console (or Google Webmaster Tools as it used to be known) is a completely free and indispensably useful service offered by Google to all webmasters.

Although you certainly don’t have to be signed up to Search Console in order to be crawled and indexed by Google, it can definitely help with optimising your site and its content for search.

Search Console Dashboard

Search Console is where you can monitor your site’s performance, identify issues, submit content for crawling, remove content you don’t want indexed, view the search queries that brought visitors to your site, monitor backlinks… there’s lots of good stuff here.

Perhaps most importantly though, Search Console is where Google will communicate with you should anything go wrong (crawl errors, manual penalties, an increase in 404 pages, malware detected, etc.).

If you don’t have a Search Console account, then you should get one now. You may find that you won’t actually need some of the other fancier, more expensive tools that essentially do the same thing.

To get started, all you need is a Google sign-in (which you probably already have if you regularly use Google or Gmail) and then to visit Search Console.

Then follow this complete guide which will take you through every tool and feature, as clearly and concisely as possible.

Please note: we published a guide to the old Webmaster Tools service, written by Simon Heseltine, back in 2014. This is an updated, rewritten version that reflects the changes and updates to Search Console since, but much of the credit should go to Simon for laying the original groundwork.

Add a property

If you haven’t already, you will have to add your website to Search Console.

Just click on the big red Add a Property button, then add your URL to the pop-up box.

add property in search console

Verification

Before Search Console can access your site, you have to prove to Google that you’re an authorized webmaster. You don’t have to be in charge, but you do need permission from whoever is.

There are five methods of verification for Search Console. There’s no real preference as to which method you use, although Google does give prominence to its ‘recommended method’…

1) The HTML file upload: Google provides you with an HTML verification file that you need to upload to the root directory of your site. Once you’ve done that, you just click on the provided URL, hit the verify button and you’ll have full access to Search Console data for the site.

verify your site in search console

There are also four alternative methods if the above doesn’t suit…

alternate methods of uploading to Search Console

2) HTML tag: this provides you with a meta tag that needs to be inserted in the <head> section of your homepage, before the first <body> section.

If you make any further updates to the HTML of your homepage, make sure the tag is still in place, otherwise your verification will be revoked. If this does happen, you’ll just have to go through the process again.
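The tag itself is a single line. A sketch of how it sits in your homepage’s <head> (the content value below is a made-up placeholder; Google generates a unique token for your site):

    <head>
      <title>Your Site Name</title>
      <!-- Verification token issued by Search Console (placeholder value) -->
      <meta name="google-site-verification" content="AbC123exampleTokenXYZ" />
    </head>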

3) Domain Name Provider: here you’re presented with a drop down list of domain registrars or name providers, then Google will give you a step-by-step guide for inserting a TXT record to your DNS configuration.

4) Google Analytics: assuming you’re using Google Analytics and your Google account is the same one you’re using for Search Console, then you can verify the site this way, as long as the GA code is in the <head> section of your home page (and remains there), and you have ‘edit’ permission.

5) Google Tag Manager: this option allows you to use your own Google Tag Manager account to verify your site, providing you’re using the ‘container snippet’ and you have ‘manage’ permission.

Now that you’re verified, you’ll be able to see your site on the Home screen (as well as any other sites you’re a webmaster for). Here you can access the site, add another property and see how many unread messages you’ve received from Google.

Search Console Home

If you click on your site, you will be taken to its own unique Dashboard.

For the purposes of the following walk-throughs, I’ll be using my own website Methods Unsound, which means you can see all the things I need to fix and optimise in my own project.

Dashboard

Here’s where you can access all of your site’s data, adjust your settings and see how many unread messages you have.

Search Console Dashboard

The left-hand Dashboard Menu is where you can navigate to all the reports and tools at your disposal.

The three visualisations presented on the Dashboard itself (Crawl Errors, Search Analytics, and Sitemaps) are quick glimpses at your general site health and crawlability. These act as short-cuts to reports found in the left-hand menu, so we’ll cover these as we walk-through the tools.

Also note that Google may communicate a message directly on the dashboard, if it’s deemed important enough to be pulled out of your Messages. As you can see I have errors on my AMP pages that need fixing, but we’ll look at this when we get to the Dashboard Menu section further down.

First let’s take a look at settings…

Settings

Clicking on the gear icon in the top right corner will give you access to a variety of simple tools, preferences and admin features.

search console preferences

Search Console Preferences

This is simply where you can set your email preferences. Google promises not to spam you with incessant emails, so it’s best to opt in.

Search Console Preferences - email

Site Settings

Here’s where you can set your preferred domain and crawl rate.

site settings

  • Preferred domain lets you set which version of your site you’d like indexed and whether your site shows up in search results with the www prefix or without it. Links may point to your site using http://www.example.com or http://example.com, but choosing a preference here sets how the URL is displayed in search. Google states that: “If you don’t specify a preferred domain, we may treat the www and non-www versions of the domain as separate references to separate pages”, thus cannibalising your search visibility.
  • Crawl rate lets you slow down the rate at which Googlebot crawls your site. You only need to do this if you’re having server issues and crawling is definitely responsible for slowing down your server. Google has pretty sophisticated algorithms to make sure your site isn’t hit by Googlebot too often, so this is a rare occurrence.

Change of Address

This is where you tell Google if you’ve migrated your entire site to a new domain.

Search Console Change of Address

Once your new site is live and you’ve permanently 301 redirected the content from your old site to the new one, you can add the new site to Search Console (following the Add a Property instructions from earlier). You can then check the 301 redirects work properly, check all your verification methods are still intact on both old and new sites, then submit your change of address.

This will help Google index your new site more quickly than if you just left Googlebot to detect all your 301 redirects of its own accord.

Google Analytics Property

If you want to see Search Console data in Google Analytics, you can use this tool to associate a site with your GA account and link it directly with your reports.

Search Console Google Analytics Property

If you don’t have Google Analytics, there’s a link at the bottom of the page to set up a new account.

Users & Property Owners

Here you can see all the authorized users of the Search Console account, and their level of access.

Search Console Users and Property Owners

You can add new users here and set their permission level.

  • Anyone listed as an Owner will have permission to access every report and tool in Search Console.
  • Full permission users can do everything except add users, link a GA account, and inform Google of a change of address.
  • Those with Restricted permission have the same restrictions as Full permission users plus they only have limited viewing capabilities on data such as crawl errors and malware infections. Also they cannot submit sitemaps, URLs, reconsideration requests or request URL removals.

Verification Details

This lets you see all the users of your Search Console account, their personal email addresses and how they were verified (including all unsuccessful attempts).

Search Console verification details

You can unverify individuals here (providing you’re the owner).

Associates

Another Google platform, such as a Google+ or AdWords account, can be associated (or connected) with your website through Search Console. If you allow this association request, it will grant them capabilities specific to the platform they are associating with you.

Here’s an example direct from Google: “Associating a mobile app with a website tells Google Search to show search result links that point to the app rather than the website when appropriate.”

If you add an associate, they won’t be able to see any data in Search Console, but they can do things like publish apps or extensions to the Chrome Web Store on behalf of your site.

Search Console associates

Dashboard Menu

Here’s where you’ll find all your reports and tools available in the Search Console.

Search Console Dashboard menu

Let’s look at each option one-by-one.

Messages

Here’s where Google communicates with webmasters.

Search Console All Messages

Again, you won’t get spammed here as Google promises not to bombard you with more than a couple of messages a month. You do need to pay attention when you receive one, though, as this is where you’ll be informed if your site’s health is compromised.

This can be anything from a rise in 404 pages, to issues with crawling your site, or even more serious problems like your site being infected with malware.

Search Appearance

If you click on the ? icon to the right of ‘Search Appearance’, a handy pop-up will appear: the Search Appearance Overview, which breaks down and explains each element of the search engine results page (SERP).

Search appearance Dashboard

By clicking on each individual element, an extra box of information will appear telling you how to optimise that element to influence click-through, and where to find extra optimisation guidance within Search Console.

Search Console Dashboard explainer

Structured Data

Structured data is a way for a webmaster to add information to their site that informs Google about the context of any given webpage and how it should appear in search results.

For example, you can add star ratings, calorie counts, images or customer ratings to your webpage’s structured data and these may appear in the snippets of search results.

captain america civil war review rich snippet
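A review snippet like the one above is typically powered by Schema.org markup. As a hedged sketch (the rating value and reviewer are invented), the JSON-LD form of review markup looks like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Review",
      "itemReviewed": {
        "@type": "Movie",
        "name": "Captain America: Civil War"
      },
      "reviewRating": {
        "@type": "Rating",
        "ratingValue": "4",
        "bestRating": "5"
      },
      "author": {
        "@type": "Person",
        "name": "Example Reviewer"
      }
    }
    </script>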

The Structured Data section in Search Console contains information about all the structured data elements Google has located on your site, whether from Schema markup or other microformats.

structured data in search console

It will also show you any errors it has found while crawling your structured data. If you click on the individual ‘Data Types’ it will show you exactly which URLs contain that particular markup and when it was detected.

If you click one of the URLs listed, you can see a further breakdown of the data, as well as a tool to show exactly how it looks in live search results. Just click on ‘Test Live Data’ and it will fetch and validate the URL using Google’s Structured Data Testing Tool.

Search Console Structured Data test

Data Highlighter

Data Highlighter is an alternative to adding structured data to your HTML. As the explainer video below says, it’s a point and click tool where you can upload any webpage then highlight various elements to tell Google how you want that page to appear in search results.

There’s no need to implement any code on the website itself and you can set the Data Highlighter so it tags similar pages for you automatically.

To begin, click on the big red ‘Start Highlighting’ button…

Search Console Data Highlighter

Then enter the URL you wish to markup…

Search Console Data Highlighter upload

Then start highlighting and tagging…

structured data highlighter

After you hit publish, Google will take your added structured data into account once it has recrawled your site. You can also remove any structured data by clicking ‘Unpublish’ on the same page if you change your mind.

HTML Improvements

This is where Search Console will recommend any improvements to your meta descriptions and title tags, as well as informing you of any non-indexable content.

Search Console HTML Improvements

This is a very handy, easy-to-use feature that gives you optimisation recommendations that you can action right away.

For instance, if I click on the ‘Short meta descriptions’ link, I’ll be able to see the 14 URLs and their respective meta descriptions. I can then go into each one of these pages in my own CMS and add lengthier, more pertinent text.

Search Console HTML Improvements meta descriptions

Title tags and meta descriptions should be unique for each page and fall within certain character lengths, so for the purposes of both user experience and keeping Google informed about your site, this is a worthwhile report.
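As a sketch (the copy is invented, and the exact limits vary as Google tweaks its display), a well-formed head gives each page its own title of roughly 60 characters or fewer and a meta description of roughly 150–160 characters:

    <head>
      <title>Google Search Console: a complete overview | Example Blog</title>
      <meta name="description" content="A walkthrough of every report and tool in Google Search Console, from verification and sitemaps to crawl errors, written for beginners.">
    </head>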

Sitelinks

Sitelinks are the subcategories that appear under the main URL when you search for a brand or a publisher.

sitelinks example

Sadly you can’t specify to Google which categories you want highlighted here, but if you’re popular enough and your site’s architecture is solid enough then these will occur organically.

However in the Sitelinks section of Search Console, you can tell Google to remove a webpage that you DON’T wish to be included as a sitelink in your search results.

Search Console Sitelinks

Accelerated Mobile Pages

This is a brand new tool, as Google’s AMP programme has only been available since earlier this year. AMP is a way for webmasters to serve fast-loading, stripped down webpages specifically to mobile users. Site speed and mobile friendliness are considered ranking signals so this is an important feature, although some SEOs are slow to adopt it.
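For orientation, here is a heavily abbreviated sketch of an AMP document’s required skeleton; the mandatory <style amp-boilerplate> block is omitted for brevity and the URLs are placeholders, so consult the AMP project documentation for the full required markup:

    <!doctype html>
    <html amp lang="en">
      <head>
        <meta charset="utf-8">
        <!-- The AMP runtime, loaded from the AMP CDN -->
        <script async src="https://cdn.ampproject.org/v0.js"></script>
        <!-- Points back to the regular (non-AMP) version of the page -->
        <link rel="canonical" href="https://www.example.com/article.html">
        <meta name="viewport" content="width=device-width,minimum-scale=1">
        <!-- The required <style amp-boilerplate> block goes here -->
      </head>
      <body>Hello, AMP.</body>
    </html>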

As you can see from the report below, we’ve just started introducing AMP to our webpages and making a bit of a hash of it…

Search Console Accelerated Mobile Pages report

Accelerated Mobile Pages lets you see all the pages on your site with AMP implemented and which ones have errors. If you click on the error, you can see a list of your URLs with errors. Then by clicking on the URL, you will be recommended a fix by Google.

Search Console Accelerated Mobile Pages fix

Clearly we have some custom JavaScript issues on our site that need addressing. If you click on the ‘Open Page’ button, you can see exactly how your AMP content appears on mobile.

Search Traffic

Search Analytics

Search Analytics tells you how much traffic you get from search, revealing clicks and impressions delivered on SERPs. It will also work out your click-through rate (CTR) and reveal your average organic position for each page.

And here’s the *really* good stuff… you can also see the queries that searchers are using in order to be served your site’s content.

Search Console Search Analytics

The data for this is collected differently from Google Analytics, so don’t expect it to tally; however, what this feature is really useful for is seeing which keywords and phrases are driving traffic to your site, as well as your individual traffic-generating pages.

You can toggle between a variety of options, filters and date-ranges. I highly recommend looking at Impressions and CTR, to see which pages are generating high visibility but low click-through rate. Perhaps all these pages need is a tweak of a meta-description or some structured data?

Links to Your Site

Here’s where you can see the domains that link to your site and its content the most, as well as your most linked webpages.

Search Console Links to Your Site

This isn’t an exhaustive list, but a good indicator of where your content is appreciated enough to be linked. Clicking on the URLs on the right-hand side will show where each is being linked to individually.

Internal Links

Here is where you can see how often each page on your site has been internally linked. Clicking on each ‘Target page’ will show a list of URLs where the internal link occurs.

Search Console Internal Links

There is a limit to how many ‘Target pages’ Search Console will show you, but if you have a small number of pages you can reverse the sort order and see which target pages have zero internal links. You can then go into your site and give these pages an internal link, or redirect them to somewhere else if they’re old legacy pages.

Manual Actions

This is where Google will inform you if it has administered a manual action to your site or specific webpage.

GWT Manual Actions

Google will offer any recommendations for you to act upon here, and will give you the chance to resubmit your site for reconsideration after you’ve fixed any problems.

Here’s a guide to what Google will most likely give you a manual penalty for and how you can avoid it.

International Targeting

Here you can target an audience based on language and country.

Search Console International Targeting

  • Country: If you have a neutral top-level domain (.com or .org), geotargeting helps Google determine how your site appears in search results, particularly for geographic queries. Just pick your chosen country from the drop-down menu. If you don’t want your site associated with any country, select ‘Unlisted’.
  • Language: If you manage a website for users speaking a different language, you need to make sure that search results display the correct version of your pages. To do this, insert hreflang tags in your site’s HTML, as this is what Google uses to match a user’s language preference to the right version of your pages (see the sketch after this list). Alternatively, you can use sitemaps to submit language and regional alternatives for your pages.
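A hedged sketch of hreflang annotations for an English/Spanish page pair (domains hypothetical); note that each language version should list all alternates, including itself:

    <link rel="alternate" hreflang="en" href="https://www.example.com/page.html" />
    <link rel="alternate" hreflang="es" href="https://www.example.com/es/page.html" />
    <!-- x-default names the fallback for users matching neither language -->
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/page.html" />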

Mobile usability

As mobile has overtaken desktop for searches this year, obviously your site has to be mobile-friendly, otherwise you’re providing a poor user experience to potentially half your visitors.

This report tells you of any issues your site has with mobile usability. And you’ll really want to be seeing the following message, as Google explicitly states you’ll otherwise be demoted.

Search Console Mobile Usability

Possible errors that will be highlighted by Search Console here include:

  • Flash usage: mobile browsers do not render Flash-based content, so don’t use it.
  • Viewport not configured: visitors to your site use a variety of devices with differing screen sizes, so your pages should specify a viewport using the meta viewport tag (a sketch follows this list).
  • Fixed-width viewport: viewports fixed to a pixel-size width will flag up errors. Responsive design should help solve this.
  • Content not sized to viewport: if a user has to scroll horizontally to see words and images, this will come up as an error.
  • Small font size: if your font size is too small to be legible and requires mobile users to ‘pinch to zoom’ this will need to be changed.
  • Touch elements too close: tappable buttons that are too close together can be a nightmare for mobile visitors trying to navigate your site.
  • Interstitial usage: Google will penalise you if you’re using a full-screen interstitial pop-up to advertise an app when a user visits your mobile site.

Google Index

Index Status

This lets you know how many pages of your website are currently included in Google’s index.

Search Console Index Status

You can quickly see any worrying trends from the last year (for instance that little dip in May 2015), as well as any pages that have been blocked by robots or removed.

Content Keywords

Here you can see the most common keywords found by the Googlebots as they last crawled your site.

Search Console Content Keywords

If you click on each keyword, you’ll be able to see the other synonyms found for that keyword, as well as the number of occurrences.

As Simon Heseltine suggests, look out for unexpected, unrelated keywords showing up as it’s an indication your site may have been hacked and hidden keywords have been injected into your pages.

Blocked resources

This section lets you know of any images, CSS, JavaScript or other resources on your site that are blocked to Googlebot.

Search Console Blocked Resources

These are listed by hostname, then by specific page, with steps you can follow to diagnose and resolve each issue.

Remove URLs

This is where you can essentially make your content disappear from Google.

remove urls search console

This only acts as a temporary fix, but by the time you’ve done this and either deleted your offending webpage or 301 redirected it elsewhere, there theoretically should no longer be a record of it.

Just enter the URL then select whether you want it removed from the search results and the cache, just from the cache or if you want an entire directory removed.

Be warned: this request can take between two and 12 hours to be processed.
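If you want a page kept out of the index permanently rather than temporarily, the usual companion to a removal request is a noindex directive on the page itself (or a 404/410 response). A sketch of the meta tag form:

    <!-- Ask search engines not to index this page -->
    <meta name="robots" content="noindex">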

Crawl

Crawl Errors

This report shows all the errors that Google has found when crawling your site over the last 90 days.

Search Console Crawl Errors

Site errors: the top half of the screen shows three tabs, where if you click on each you can see any past problems with your DNS, your server connectivity or whether a crawl had to be postponed. (Google will postpone a crawl rather than risk crawling URLs you don’t want indexed).

URL errors: the bottom half of the screen shows URL errors for desktop, smartphone and feature phone (a phone that can access the internet, but doesn’t have the advanced features of a smartphone).

You’ll likely see reports for the following on all three device types:

  • Server error: Google can’t access your site because the server is too slow to respond, or because your site is blocking Google.
  • Soft 404: this occurs when your server returns a real page for a URL that doesn’t actually exist on your site. You should replace these pages with a 404 (Not Found) or 410 (Gone) return code.
  • Not found: these are all your 404 pages that occur when a Googlebot attempts to visit a page that doesn’t exist (because you deleted it or renamed it without redirecting the old URL, etc.). Generally 404 pages are fine and won’t harm your rankings, so only pay attention to the ones related to high-ranking content.

Crawl Stats

This section shows the progress of Googlebots crawling your site in the last 90 days.

Search Console Crawl Stats

You can see how fast your pages are being crawled, kilobytes downloaded per day and average time spent downloading pages on your site.

Spikes are perfectly normal, and there’s not very much you can do about them. But if you see a sustained drop in any of these charts then it might be worth investigating to see what’s dragging it down.

Fetch as Google

Here you can check how any page on your website is seen by Google once it’s been crawled.

You can also submit these webpages for indexing. You may find this is a quicker way to be crawled and indexed than if you were to let Google find the page automatically.

Search Console Fetch as Google

  • When you ‘Fetch’ a page, Google will simulate a crawl and you can quickly check any network connectivity problems or security issues with your site.
  • ‘Fetch and Render’ does the same as the above, but it also lets you check how the page itself looks on mobile or desktop, including all resources on the page (such as images and scripts) and will let you know if any of these are blocked to Googlebots.

Remember the crawler is meant to see the same page as the visitor would, so this is a good way to get a direct on-page comparison.

If the page is successfully fetched and rendered, you can submit it to the index. You are allowed 500 webpage fetches per week, but you can only submit a webpage and have Google crawl ALL the pages linked within it 10 times per month.

robots.txt Editor

A robots.txt file, placed within the root of your site, is where you can specify pages you don’t want crawled by search engines. Typically this is used when you don’t want your server overwhelmed by Googlebot, particularly if you want it to ignore script or style files, or if you want certain images not to appear in Google Image Search.
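As a sketch with hypothetical paths, a simple robots.txt might block a scripts directory for all crawlers and keep one image out of Google Image Search:

    # Applies to every crawler
    User-agent: *
    Disallow: /scripts/

    # Applies only to Google's image crawler
    User-agent: Googlebot-Image
    Disallow: /images/private-photo.jpg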

Here is where you can edit your robots.txt and check for errors. The bottom of the page reveals your errors and warnings.

robots.txt editor search console

Sitemaps

Sitemaps are hosted on the server of your website and they basically inform search engines of every page of your site, including any new ones added. It’s a good way to let Google better crawl and understand your website.
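A minimal sitemap, sketched here with placeholder URLs, follows the sitemaps.org XML protocol:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-05-09</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/about/</loc>
      </url>
    </urlset>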

Here’s where you can access all of the information about any sitemaps either submitted manually or found by Search Console. The blue bar represents pages or images submitted, the red bar represents actual pages and images indexed.

sitemaps search console

You can test a sitemap by clicking the ‘Add/Test sitemap’ button, and if it’s valid you can then add it to Search Console.

URL Parameters

As Simon Heseltine has previously commented, this section isn’t used much anymore since the introduction of canonical tags.

However you should use URL Parameters if, for instance, you need to tell Google to distinguish between pages targeted to different countries. These preferences can encourage Google to crawl a preferred version of your URL or prevent Google from crawling duplicate content on your site.

URL parameters
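For reference, the canonical tag that has largely superseded this tool is a single line in the <head> of each duplicate or parameterised page, pointing at the version you want indexed (URL hypothetical):

    <link rel="canonical" href="https://www.example.com/dresses/green-dresses">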

Security Issues

Although any security issues will be communicated with you in the Messages section and on the Dashboard screen, here’s where you can check on problems in more detail.

Search Console Security Issues

There’s also plenty of accessible information here about how to fix your site if it’s been hacked or been infected with malware.

Other Resources

Here’s where you can access all the tools provided by Google outside of Search Console, including the Structured Data Testing Tool and Markup Helper, which we went into in greater detail in earlier sections.

Search Console Other Resources

Other helpful resources here include the Google My Business Center, which you can use to improve your business’s local search visibility, and the PageSpeed Insights tool, which will tell you exactly how well your site is performing on mobile and desktop in terms of loading time, and how to fix any issues.
