The (Hollow) Soul of Technology

The Daily Obituary

As far as being an investable business goes, news is horrible.

And it is getting worse by the day.

Look at these top performers.

The above chart looks ugly, but in reality it puts an optimistic spin on things…

  • it has survivorship bias
  • the Tribune Company has already gone through bankruptcy
  • the broader stock market is up huge over the past decade after many rounds of quantitative easing and zero (or even negative) interest rate policy
  • the debt carrying costs of the news companies are also artificially low due to the central banking bond market manipulation
  • the Tribune Company recently got a pop on a buy out offer

Selling The Story

Almost all the solutions to the problems faced by the mainstream media are incomplete and ultimately will fail.

That doesn’t stop the market from selling magic push button solutions. The worse the fundamentals get, the more incentive (need) there is to sell the dream.

Video

Video will save us.

No it won’t.

Video is expensive to do well, and almost nobody operating at any sort of scale on YouTube has an enviable profit margin. Even the individuals held up as examples of success are being squeezed out, and Google is pushing to make the site more like TV. As they get buy-in from big players they’ll further squeeze out the indie players – just like general web search.

Even if TV shifts to the web, along with chunks of the associated ad budget, most of the profits will be kept by Google & ad tech management rather than flowing to publishers.

Some of the recent acquisitions are more about having more scale on an alternative platform or driving offline commerce rather than hoping for online ad revenue growth.

Expand Internationally

The New York Times is cutting back on its operations in Paris.

Spread Across Topics

What impact does it have on Marketwatch’s brand if you go there for stock information and they advise you on weight loss tips?

And, once again, when everyone starts doing that it is no longer a competitive advantage.

There have also been cases where newspapers like The New York Times acquired About.com only to later sell it for a loss. And now even About.com is unbundling itself.

Native Ads

The more companies who do them & the more places they are seen, the lower the rates go, the less novel they will seem, and the greater the likelihood a high-spending advertiser decides to publish it on their own site & then drive the audience directly to their site.

When it is rare or unique it stands out and is special, justifying the extra incremental cost. But when it is a scaled process it is no longer unique enough to justify the vastly higher cost.

Further, as it gets more pervasive it will lead to questions of editorial integrity.

Get Into Affiliate Marketing

It won’t scale across all the big publishers. It only works well at scale in select verticals and as more entities test it they’ll fill up the search results and end up competing for a smaller slice of attention. Further, each new affiliate means every other affiliate’s cookie lasts for a shorter duration.

It is unlikely news companies will be able to create commercially oriented review content at scale while having the depth of Wirecutter.

“We move as much product as a place 10 times bigger than us in terms of audience,” Lam said in an interview. “That’s because people trust us. We earn that trust by having such deeply-researched articles.”

Further, as it gets more pervasive it will lead to questions of editorial integrity.

Charging People to Comment

It won’t work, as it undermines the social proof of value the site would otherwise have from having many comments on it.

Meal Delivery Kits

Absurd. And a sign of extreme desperation.

Trust Tech Monopolies

Here is Doug Edwards on Larry Page:

He wondered how Google could become like a better version of the RIAA – not just a mediator of digital music licensing – but a marketplace for fair distribution of all forms of digitized content. I left that meeting with a sense that Larry was thinking far more deeply about the future than I was, and I was convinced he would play a large role in shaping it.

If we just give Google or Facebook greater control, they will save us.

No they won’t.

You are probably better off selling meal kits.

As time passes, Google and Facebook keep getting a larger share of the pie, growing their rake faster than the pie is growing.

Here is the RIAA’s Cary Sherman on Google & Facebook:

Just look at Silicon Valley. They’ve done an extraordinary job, and their market cap is worth gazillions of dollars. Look at the creative industries — not just the music industry, but all of them. All of them have suffered.

Over time media sites are becoming more reliant on platforms for distribution, with visitors having fleeting interest: “bounce rates on media sites having gone from 20% of visitors in the early 2000s to well over 70% of visitors today.”

Accelerated Mobile Pages and Instant Articles?

These are not solutions. They are only a further acceleration of the problem.

How will giving greater control to monopolies that are displacing you (while investing in AI) lead to a more sustainable future for copyright holders? If they host your content and you are no longer even a destination, what is your point of differentiation?

If someone else hosts your content & you are dependent on them for distribution, you are competing against yourself on a platform that can arbitrarily shift the terms on you whenever it feels like it.

“The cracks are beginning to show, the dependence on platforms has meant they are losing their core identity,” said Rafat Ali. “If you are just a brand in the feed, as opposed to a brand that users come to, that will catch up to you sometime.”

Do you think you gain leverage over time as they become more dominant in your vertical? Not likely. Look at how Google’s redesigned image search shunted traffic away from photographers. Google’s remote rater guidelines even mentioned giving lower ratings to images with watermarks on them. So if you protect your works you are punished & if you don’t, good luck negotiating with a monopoly. You’ll probably need the EU to see any remedy there.

When something is an embarrassment to Google & can harm their PR, fixing it becomes a priority; otherwise most of the costs of rights management fall on the creative industry & Google will go out of their way to add cost to that process. Facebook is, of course, playing the same game with video freebooting.

Algorithms are not neutral and platforms change what they promote to suit their own needs.

As the platforms aim to expand into new verticals they create new opportunities, but those opportunities are temporary.

Whatever happened to Zynga?

Even Buzzfeed, the current example of success on Facebook, missed their revenue target badly, even as they became more dependent on the Facebook feed.

“One more implication of aggregation-based monopolies is that once competitors die the aggregators become monopsonies — i.e. the only buyer for modularized suppliers. And this, by extension, turns the virtuous cycle on its head: instead of more consumers leading to more suppliers, a dominant hold over suppliers means that consumers can never leave, rendering a superior user experience less important than a monopoly that looks an awful lot like the ones our antitrust laws were designed to eliminate.” – Ben Thompson

Long after benefit stops passing to the creative person the platform still gets to re-use the work. The Supreme Court only recently refused to hear the ebook scanning case & Google is already running stories about using romance novels to train their AI. How long until Google places their own AI driven news rewrites in front of users?

Who then will fund journalism?

Dumb it Down

Remember how Panda was going to fix crap content for the web? eHow has removed literally millions of articles from their site & still has not recovered in Google. Demand Media’s bolt-on articles published on newspaper sites still rank great in Google, but that will at some point get saturated and stop being a growth opportunity, shifting from growth to zero sum to a negative sum market, particularly as Google keeps growing their knowledge scraper graph.

Now maybe if you dumb it down with celebrity garbage you get quick clicks from other channels and longterm SEO traffic doesn’t matter as much.

But if everyone is pumping the same crap into the feed it is hard to stand out. When everyone starts doing it the strategy is no longer a competitive advantage. Further, a business that is algorithmically optimized for short-term clicks is also optimizing for its own longterm irrelevancy.

Yahoo’s journalists used to joke amongst themselves about the extensive variety of Kind bars provided, but now the snacks aren’t being replenished. Instead, employees frequently remind each other that there is little reason to bother creating quality work within Yahoo’s vast eco-system of middle-brow content. “You are competing against Kim Kardashian’s ass,” goes a common refrain.

Yahoo’s billion-person-a-month home page is run by an algorithm, with a spare editorial staff, that pulls in the best-performing content from across the site. Yahoo engineers generally believed that these big names should have been able to support themselves, garner their own large audiences, and shouldn’t have relied on placement on the home page to achieve large audiences. As a result, they were expected to sink or swim on their own.

“Yahoo is reverting to its natural form,” a former staffer told me, “a crap home page for the Midwest.”

That is why Yahoo! ultimately had to shut down almost all their verticals. They were optimized algorithmically for short term wins rather than building things with longterm resonance.

Death by bean counter.

The above also has an incredibly damaging knock-on effect on society.

People miss the key news. “What articles got the most views, and thus ‘clicks.’ Put bluntly, it was never the articles on my catching Bernanke pulling system liquidity into the maw of the collapse in 2008, while he maintained to Congress he had done the opposite.” – Karl Denninger

The other issue is PR is outright displacing journalism. As bad as that is at creating general disinformation, it gets worse when people presume diversity of coverage means a diversity of thought process, a diversity of work, and a diversity of sources. Even people inside the current presidential administration state how horrible this trend is on society:

“All these newspapers used to have foreign bureaus,” he said. “Now they don’t. They call us to explain to them what’s happening in Moscow and Cairo. Most of the outlets are reporting on world events from Washington. The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns. That’s a sea change. They literally know nothing.” … “We created an echo chamber,” he told the magazine. “They [the seemingly independent experts] were saying things that validated what we had given them to say.”

That is basically the government complaining to the press about it being “too easy” to manipulate the press.

Adding Echo to the Echo

Much of what “seems” like an algorithm on the tech platforms is actually a bunch of lowly paid humans pretending to be an algorithm.

This goes back to the problem of the limited diversity in original sources and rise of thin “take” pieces. Stories with an inconvenient truth can get suppressed, but “newsworthy” stories with multiple sources covering them may all use the same biased source.

After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm. … A topic was often blacklisted if it didn’t have at least three traditional news sources covering it

As algorithms take over more aspects of our lives and eat more of the media ecosystem, the sources they feed upon will consistently lose quality until some sort of major reset happens.

The strategy to keep sacrificing the long term to hit the short term numbers can seem popular. And then, suddenly, death.

You can say the soul is gone
And the feeling is just not there
Not like it was so long ago.
– Neil Young, Stringman

Micropayments & Paywalls

It is getting cheap enough that just about anyone can run a paid membership site, but it is quite hard to create something worth paying for on a recurring basis.

There are a few big issues with paywalls:

  • If you have something unique and don’t market it aggressively then nobody will know about it. And, in fact, in some businesses your paying customers may have no interest in sharing your content because they view it as one of their competitive advantages. This was one of the big reasons I ultimately had to shut down our membership site.
  • If you do market something well enough to create demand then some other free sites will make free derivatives, and it is hard to keep having new things to write worth paying for in many markets. Eventually you exhaust the market or get burned out or stop resonating with it. Even free websites have churn. Paid websites have to bring in new members to offset old members leaving.
  • In most markets worth being in there is going to be plenty of free sites in the vertical which dominate the broader conversation. Thus you likely need to publish a significant amount of information for free which leads into an eventual sale. But knowing where to put the free line & how to move it over time isn’t easy. Over the past year or two I blogged far less than I should have if I was going to keep running our site as a paid membership site.
  • And the last big issue is that a paywall basically runs counter to all the other business models above that the mainstream media is trying. You need deeper content, better content, content that is not off topic, etc. Many of the easy wins for ad funded media become easy losses for paid membership sites. And just like it is hard for newspapers to wean themselves off of print ad revenues, it can be hard to undo many of the quick win ad revenue boosters if one wants to change their business model drastically. Regaining your soul takes time, and often, death.

“It’s only after we’ve lost everything that we’re free to do anything.” ― Chuck Palahniuk, Fight Club

from SEO Book http://www.seobook.com/newspaperhow

Search marketing in China: the rise of so.com

Baidu, the leading Chinese search engine, is the third most popular search engine in the world, despite being mostly concentrated in and around China. That speaks clearly to the immense size and power of the Chinese market.

An estimated 507 million Chinese use search engines. This is an enormous marketplace for companies who want to grow overseas and engage with new prospective customers.

Although Google dominates much of the search engine traffic in North America and Europe, in China it is one of the least popular search engines.

Instead, Baidu, and its rising competitor Qihoo 360, control the landscape. Those interested in doing business in China will need to make sure they understand these search engines if they want to compete.

How is the Chinese market changing? – So.com

The market in China is quickly changing and evolving. Baidu has long dominated the search engine sphere, and they still control an estimated 54% of the search engine market share. Over the past few years, however, there has been a fast rising competitor that is seizing an increasing percentage of the search volume.

baidu serp

Qihoo 360, a security software company, developed the search engine so.com. It was only launched in 2012, but by 2015 it controlled an estimated 30% of the Chinese search market.

Its popularity has likely been influenced by the growth of mobile. By Q3 in 2014, mobile devices were the leading source of searches and revenue for Chinese search engine marketing, and Qihoo 360 has been responsible for building the most popular app store in China.

How is search engine marketing different in the APAC region than in the US?

Brands that want to expand overseas into the APAC region need to be familiar with the local ranking factors and how to conduct SEO for the popular search engines, particularly Baidu and so.com, as optimizing for one will allow you to improve your rankings on both.

Tips for SEO in China:

Do not try to get a website ranked by using automatic translators or just students of the language. Using a native speaker will provide you with an infinitely superior site, as you will be able to avoid major grammatical errors, have the content flow more naturally, select more relevant keywords and use vocabulary that resonates better with the local audience. Your site will fit better overall into the framework of the Chinese digital ecosystem. Translation issues can hurt your reputation and cause you to rank lower on the SERPs.

When setting up a website, you want to try and get a .CN domain. If that is not possible, then seek a .COM or .NET. Your website should also be hosted in China and you should secure an ICP license from the Chinese Ministry of Industry and Information Technology. Avoid having multiple domains or subdomains.

It is imperative that you know the list of blacklisted words that cannot be posted online. Inclusion of these words can cause your site to be de-indexed and even taken down. Remember that your website cannot criticize the government in any way.

As you build the website, keep your title tags under 35 characters in Simplified Chinese and your meta descriptions below 78 characters in Simplified Chinese.
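A simple script can keep those limits in check across a large site. Below is a minimal sketch in Python, assuming you have already extracted each page’s title and meta description; the function name and sample data are purely illustrative:

```python
# Minimal sketch: flag title tags and meta descriptions that exceed the
# recommended lengths for Simplified Chinese pages (35 and 78 characters).
TITLE_LIMIT = 35
META_DESCRIPTION_LIMIT = 78

def check_lengths(pages):
    """pages is a list of (url, title, meta_description) tuples."""
    problems = []
    for url, title, description in pages:
        if len(title) > TITLE_LIMIT:
            problems.append((url, "title too long", len(title)))
        if len(description) > META_DESCRIPTION_LIMIT:
            problems.append((url, "meta description too long", len(description)))
    return problems

if __name__ == "__main__":
    # Hypothetical example pages; replace with real crawl data.
    sample = [("https://example.cn/", "示例标题" * 12, "示例描述" * 25)]
    for issue in check_lengths(sample):
        print(issue)
```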

Website speed is highly valued. Regularly test your site to make sure it loads quickly. Inbound links are also viewed as valuable for search rankings, so finding opportunities to build a strong backlink profile can be very helpful.

All of the links you create should be in plain HTML. In general, avoiding Javascript is preferred, because sometimes content in that format is not indexed.

The Chinese search engines value fresh content. So regularly publishing on your page will help boost your reputation and success. You should submit your blog posts to the Baidu News Feed, which will help you attract new readers to your material.

For businesses interested in expanding into Asia, understanding how the local search engine market is evolving and changing can be critical to creating sites that rank well on the local search engines.

For businesses expanding globally outside of the US, make sure you also optimize for the premium search engines in key regions, such as Naver (South Korea) and Yandex (Russia)!

from Search Engine Watch https://searchenginewatch.com/2016/05/09/search-marketing-in-china-the-rise-of-so-com/

Google Search Console: a complete overview

The Search Console (or Google Webmaster Tools as it used to be known) is a completely free and indispensably useful service offered by Google to all webmasters.

Although you certainly don’t have to be signed up to Search Console in order to be crawled and indexed by Google, it can definitely help with optimising your site and its content for search.

Search Console Dashboard

Search Console is where you can monitor your site’s performance, identify issues, submit content for crawling, remove content you don’t want indexed, view the search queries that brought visitors to your site, monitor backlinks… there’s lots of good stuff here.

Perhaps most importantly though, Search Console is where Google will communicate with you should anything go wrong (crawling errors, manual penalties, increase in 404 pages, malware detected, etc.)

If you don’t have a Search Console account, then you should get one now. You may find that you won’t actually need some of the other fancier, more expensive tools that essentially do the same thing.

To get started, all you need is a Google sign-in (which you probably already have if you regularly use Google or Gmail); then visit Search Console.

Then follow this complete guide which will take you through every tool and feature, as clearly and concisely as possible.

Please note: we published a guide to the old Webmaster Tools service, written by Simon Heseltine, back in 2014. This is an updated, rewritten version that reflects the changes and updates to Search Console since, but much of the credit should go to Simon for laying the original groundwork.

Add a property

If you haven’t already, you will have to add your website to Search Console.

Just click on the big red Add a Property button, then add your URL to the pop-up box.

add property in search console

Verification

Before Search Console can access your site, you have to prove to Google that you’re an authorized webmaster. You don’t have to be in charge, but you do need permission from whoever is.

There are five methods of verification for Search Console. There’s no real preference as to which method you use, although Google does give prominence to its ‘recommended method’…

1) The HTML file upload: Google provides you with an HTML verification file that you need to upload to the root directory of your site. Once you’ve done that, you just click on the provided URL, hit the verify button and you’ll have full access to Search Console data for the site.

verify your site in search console

There are also four alternative methods if the above doesn’t suit…

alternate methods of uploading to Search Console

2) HTML tag: this provides you with a meta tag that needs to be inserted in the <head> section of your homepage, before the first <body> section.

If you make any further updates to the HTML of your homepage, make sure the tag is still in place, otherwise your verification will be revoked. If this does happen, you’ll just have to go through the process again. (A quick way to check the tag is still present is sketched after this list of methods.)

3) Domain Name Provider: here you’re presented with a drop down list of domain registrars or name providers, then Google will give you a step-by-step guide for inserting a TXT record to your DNS configuration.

4) Google Analytics: assuming you’re using Google Analytics and your Google account is the same one you’re using for Search Console, then you can verify the site this way, as long as the GA code is in the <head> section of your home page (and remains there), and you have ‘edit’ permission.

5) Google Tag Manager: this option allows you to use your own Google Tag Manager account to verify your site, providing you’re using the ‘container snippet’ and you have ‘manage’ permission.
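If you use the HTML tag method (2 above), it’s worth periodically confirming the tag hasn’t been stripped out by a template change. Here is a minimal sketch, assuming the standard google-site-verification meta tag name; the URL is a placeholder:

```python
# Minimal sketch: confirm the Search Console verification meta tag is still
# present in the <head> of a page. The URL below is a placeholder.
from urllib.request import urlopen
from html.parser import HTMLParser

class VerificationTagFinder(HTMLParser):
    """Collects the content of any google-site-verification meta tag."""
    def __init__(self):
        super().__init__()
        self.token = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "google-site-verification":
            self.token = attrs.get("content")

def find_verification_token(url):
    html = urlopen(url).read().decode("utf-8", errors="replace")
    finder = VerificationTagFinder()
    finder.feed(html)
    return finder.token  # None means the tag is missing

if __name__ == "__main__":
    print(find_verification_token("https://example.com/"))
```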

Now that you’re verified, you’ll be able to see your site on the Home screen (as well as any other sites you’re a webmaster for). Here you can access the site, add another property and see how many unread messages you’ve received from Google.

Search Console Home

If you click on your site, you will be taken to its own unique Dashboard.

For the purposes of the following walk-throughs, I’ll be using my own website Methods Unsound, which means you can see all the things I need to fix and optimise in my own project.

Dashboard

Here’s where you can access all of your site’s data, adjust your settings and see how many unread messages you have.

Search Console Dashboard

The left-hand Dashboard Menu is where you can navigate to all the reports and tools at your disposal.

The three visualisations presented on the Dashboard itself (Crawl Errors, Search Analytics, and Sitemaps) are quick glimpses at your general site health and crawlability. These act as short-cuts to reports found in the left-hand menu, so we’ll cover them as we walk through the tools.

Also note that Google may communicate a message directly on the dashboard, if it’s deemed important enough to be pulled out of your Messages. As you can see I have errors on my AMP pages that need fixing, but we’ll look at this when we get to the Dashboard Menu section further down.

First let’s take a look at settings…

Settings

Clicking on the gear icon in the top right corner will give you access to a variety of simple tools, preferences and admin features.

search console preferences

Search Console Preferences

This is simply where you can set your email preferences. Google promises not to spam you with incessant emails so it’s best to opt-in.

Search Console Preferences - email

Site Settings

Here’s where you can set your preferred domain and crawl rate.

site settings

  • Preferred domain lets you set which version of your site you’d like indexed and whether your site shows up in search results with the www prefix or without it. Links may point to your site using http://www.example.com or http://example.com, but choosing a preference here will set how the URL is displayed in search. Google states that: “If you don’t specify a preferred domain, we may treat the www and non-www versions of the domain as separate references to separate pages”, thus cannibalising your search visibility.
  • Crawl rate lets you slow down the rate at which Googlebot crawls your site. You only need to do this if you’re having server issues and crawling is definitely responsible for slowing down the speed of your server. Google has pretty sophisticated algorithms to make sure your site isn’t hit by Googlebots too often, so this is a rare occurrence.

Change of Address

This is where you tell Google if you’ve migrated your entire site to a new domain.

Search Console Change of Address

Once your new site is live and you’ve permanently 301 redirected the content from your old site to the new one, you can add the new site to Search Console (following the Add a Property instructions from earlier). You can then check the 301 redirects work properly, check all your verification methods are still intact on both old and new sites, then submit your change of address.

This will help Google index your new site more quickly than if you just left the Googlebots to detect all your 301 redirects of their own accord.

Google Analytics Property

If you want to see Search Console data in Google Analytics, you can use this tool to associate a site with your GA account and link it directly with your reports.

Search Console Google Analytics Property

If you don’t have Google Analytics, there’s a link at the bottom of the page to set up a new account.

Users & Property Owners

Here you can see all the authorized users of the Search Console account, and their level of access.

Search Console Users and Property Owners

You can add new users here and set their permission level.

  • Anyone listed as an Owner will have permission to access every report and tool in Search Console.
  • Full permission users can do everything except add users, link a GA account, and inform Google of a change of address.
  • Those with Restricted permission have the same restrictions as Full permission users plus they only have limited viewing capabilities on data such as crawl errors and malware infections. Also they cannot submit sitemaps, URLs, reconsideration requests or request URL removals.

Verification Details

This lets you see all the users of your Search Console account, their personal email addresses and how they were verified (including all unsuccessful attempts).

Search Console verification details

You can unverify individuals here (providing you’re the owner).

Associates

Another Google platform, such as a G+ or AdWords account, can be associated (or connected) with your website through Search Console. If you allow this association request, it will grant them capabilities specific to the platform they are associating with you.

Here’s an example direct from Google: “Associating a mobile app with a website tells Google Search to show search result links that point to the app rather than the website when appropriate.”

If you add an associate, they won’t be able to see any data in Search Console, but they can do things like publish apps or extensions to the Chrome Web Store on behalf of your site.

Search Console associates

Dashboard Menu

Here’s where you’ll find all the reports and tools available in Search Console.

Search Console Dashboard menu

Let’s look at each option one-by-one.

Messages

Here’s where Google communicates with webmasters.

Search Console All Messages

Again, you won’t get spammed here as Google promises not to bombard you with more than a couple of messages a month. You do need to pay attention when you receive one, though, as this is where you’ll be informed if your site’s health is compromised.

This can be anything from a rise in 404 pages, to issues with crawling your site, or even more serious problems like your site being infected with malware.

Search Appearance

If you click on the ? icon to the right of ‘Search Appearance’ a handy pop-up will appear. Search Appearance Overview breaks down and explains each element of the search engine results page (SERP).

Search appearance Dashboard

By clicking on each individual element, an extra box of information will appear telling you how to optimise that element to influence click-through, and where to find extra optimisation guidance within Search Console.

Search Console Dashboard explainer

Structured Data

Structured data is a way for a webmaster to add information to their site that informs Google about the context of any given webpage and how it should appear in search results.

For example, you can add star ratings, calorie counts, images or customer ratings to your webpage’s structured data and these may appear in the snippets of search results.

captain america civil war review rich snippet
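For example, a review with a star rating is typically expressed as schema.org markup embedded in the page as JSON-LD. Here’s a minimal sketch; the film, reviewer and rating values are made up:

```python
# Minimal sketch: build a schema.org Review snippet as JSON-LD, the kind of
# structured data that can surface star ratings in search results.
# The film title, reviewer and rating values are made-up examples.
import json

review = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {"@type": "Movie", "name": "Example Film"},
    "author": {"@type": "Person", "name": "Example Reviewer"},
    "reviewRating": {"@type": "Rating", "ratingValue": "4", "bestRating": "5"},
}

# Wrap the JSON-LD in a script tag ready to drop into the page's HTML.
script_tag = '<script type="application/ld+json">\n%s\n</script>' % json.dumps(review, indent=2)
print(script_tag)
```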

The Structured Data section in Search Console contains information about all the structured data elements Google has located on your site, whether from Schema markup or other microformats.

structured data in search console

It will also show you any errors it has found while crawling your structured data. If you click on the individual ‘Data Types’ it will show you exactly which URLs contain that particular markup and when it was detected.

If you click one of the URLs listed, you can see a further breakdown of the data, as well as a tool to show exactly how it looks in live search results. Just click on ‘Test Live Data’ and it will fetch and validate the URL using Google’s Structured Data Testing Tool.

Search Console Structured Data test

Data Highlighter

Data Highlighter is an alternative to adding structured data to your HTML. As Google’s explainer video says, it’s a point-and-click tool where you can load any webpage and then highlight various elements to tell Google how you want that page to appear in search results.

There’s no need to implement any code on the website itself and you can set the Data Highlighter so it tags similar pages for you automatically.

To begin, click on the big red ‘Start Highlighting’ button…

Search Console Data Highlighter

Then enter the URL you wish to markup…

Search Console Data Highlighter upload

Then start highlighting and tagging…

structured data highlighter

After you hit publish, Google will take your added structured data into account once it has recrawled your site. You can also remove any structured data by clicking ‘Unpublish’ on the same page if you change your mind.

HTML Improvements

This is where Search Console will recommend any improvements to your meta descriptions and title tags, as well as informing you of any non-indexable content.

Search Console HTML Improvements

This is a very handy, easy-to-use feature that gives you optimisation recommendations that you can action right away.

For instance, if I click on the ‘Short meta descriptions’ link, I’ll be able to see the 14 URLs and their respective meta descriptions. I can then go into each one of these pages in my own CMS and add lengthier, more pertinent text.

Search Console HTML Improvements meta descriptions

Title tags and meta descriptions should be unique for each page and fall within certain character lengths, so for the purposes of both user experience and keeping Google informed about your site, this is a worthwhile report.
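If you want to catch these issues before Google flags them, a small audit script can surface duplicate or overly short descriptions across your pages. A minimal sketch follows; the length threshold is an assumption rather than a Google-published figure:

```python
# Minimal sketch: flag duplicate and overly short meta descriptions across a
# site, roughly mirroring what the HTML Improvements report surfaces.
# The minimum length below is an assumed threshold, not an official figure.
from collections import defaultdict

MIN_DESCRIPTION_LENGTH = 70

def audit_descriptions(pages):
    """pages maps URL -> meta description string."""
    by_text = defaultdict(list)
    too_short = []
    for url, description in pages.items():
        text = description.strip()
        by_text[text].append(url)
        if len(text) < MIN_DESCRIPTION_LENGTH:
            too_short.append(url)
    duplicates = {text: urls for text, urls in by_text.items() if len(urls) > 1}
    return too_short, duplicates

if __name__ == "__main__":
    # Hypothetical pages; replace with real crawl data.
    sample = {
        "https://example.com/a": "A short description.",
        "https://example.com/b": "A short description.",
    }
    print(audit_descriptions(sample))
```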

Sitelinks

Sitelinks are the subcategories that appear under the main URL when you search for a brand or a publisher.

sitelinks example

Sadly you can’t specify to Google which categories you want highlighted here, but if you’re popular enough and your site’s architecture is solid enough then these will occur organically.

However in the Sitelinks section of Search Console, you can tell Google to remove a webpage that you DON’T wish to be included as a sitelink in your search results.

Search Console Sitelinks

Accelerated Mobile Pages

This is a brand new tool, as Google’s AMP programme has only been available since earlier this year. AMP is a way for webmasters to serve fast-loading, stripped down webpages specifically to mobile users. Site speed and mobile friendliness are considered ranking signals so this is an important feature, although some SEOs are slow to adopt it.

As you can see from the report below, we’ve just started introducing AMP to our webpages and making a bit of a hash of it…

Search Console Accelerated Mobile Pages report

Accelerated Mobile Pages lets you see all the pages on your site with AMP implemented and which ones have errors. If you click on the error, you can see a list of your URLs with errors. Then by clicking on the URL, you will be recommended a fix by Google.

Search Console Accelerated Mobile Pages fix

Clearly we have some custom JavaScript issues on our site that need addressing. If you click on the ‘Open Page’ button, you can see exactly how your AMP content appears on mobile.

Search Traffic

Search Analytics

Search Analytics tells you how much traffic you get from search, revealing clicks and impressions delivered on SERPs. It will also work out your click-through rate (CTR) and reveal your average organic position for each page.

And here’s the *really* good stuff… you can also see the queries that searchers are using in order to be served your site’s content.

Search Console Search Analytics

The data for this is collected differently from Google Analytics, so don’t expect it to tally. However, what this feature is really useful for is seeing which keywords and phrases are driving traffic to your site, as well as your individual traffic-generating pages.

You can toggle between a variety of options, filters and date-ranges. I highly recommend looking at Impressions and CTR, to see which pages are generating high visibility but low click-through rate. Perhaps all these pages need is a tweak of a meta-description or some structured data?

Links to Your Site

Here’s where you can see the domains that link to your site and its content the most, as well as your most linked webpages.

Search Console Links to Your Site

This isn’t an exhaustive list, but a good indicator of where your content is appreciated enough to be linked. Clicking on the URLs on the right-hand side will show where they’re being linked to individually.

Internal Links

Here is where you can see how often each page on your site has been internally linked. Clicking on each ‘Target page’ will show a list of URLs where the internal link occurs.

Search Console Internal Links

There is a limit to how many ‘Target pages’ Search Console will show you, but if you have a small number of pages you can reverse the sort order and see which target pages have zero internal links. You can then go into your site and give these pages an internal link, or redirect them to somewhere else if they’re old legacy pages.

Manual Actions

This is where Google will inform you if it has administered a manual action to your site or specific webpage.

GWT Manual Actions

Google will offer any recommendations for you to act upon here, and will give you the chance to resubmit your site for reconsideration after you’ve fixed any problems.

Here’s a guide to what Google will most likely give you a manual penalty for and how you can avoid it.

International Targeting

Here you can target an audience based on language and country.

Search Console International Targeting

  • Country: If you have a neutral top-level domain (.com or .org), geotargeting helps Google determine how your site appears in search results, particularly for geographic queries. Just pick your chosen country from the drop-down menu. If you don’t want your site associated with any country, select ‘Unlisted’.
  • Language: If you manage a website for users speaking a different language, you need to make sure that search results display the correct version of your pages. To do this, insert hreflang tags in your site’s HTML, as this is what Google uses to match a user’s language preference to the right version of your pages. Or alternatively you can use sitemaps to submit language and regional alternatives for your pages.
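For the language targeting described above, hreflang markup in the page’s <head> looks roughly like the output of this minimal sketch; the URLs and language codes are placeholder examples:

```python
# Minimal sketch: emit hreflang <link> elements for alternate language
# versions of a page. URLs and language codes are placeholder examples.
ALTERNATES = {
    "en": "https://example.com/page/",
    "fr": "https://example.com/fr/page/",
    "de": "https://example.com/de/page/",
}

def hreflang_links(alternates, default_lang="en"):
    lines = [
        '<link rel="alternate" hreflang="%s" href="%s" />' % (lang, url)
        for lang, url in alternates.items()
    ]
    # x-default covers users whose language doesn't match any listed version.
    lines.append('<link rel="alternate" hreflang="x-default" href="%s" />'
                 % alternates[default_lang])
    return "\n".join(lines)

if __name__ == "__main__":
    print(hreflang_links(ALTERNATES))
```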

Mobile usability

As mobile has overtaken desktop for searches this year, obviously your site has to be mobile-friendly, otherwise you’re providing a poor user experience to potentially half your visitors.

This report tells you of any issues your site has with mobile usability. And you’ll really want to be seeing the following message, as Google explicitly states you’ll otherwise be demoted.

Search Console Mobile Usability

Possible errors that will be highlighted by Search Console here include:

  • Flash usage: mobile browsers do not render Flash-based content, so don’t use it.
  • Viewport not configured: visitors to your site use a variety of devices with differing screen sizes, so your pages should specify a viewport using the meta viewport tag (see the sketch after this list).
  • Fixed-width viewport: viewports fixed to a pixel-size width will flag up errors. Responsive design should help solve this.
  • Content not sized to viewport: if a user has to scroll horizontally to see words and images, this will come up as an error.
  • Small font size: if your font size is too small to be legible and requires mobile users to ‘pinch to zoom’ this will need to be changed.
  • Touch elements too close: tappable buttons that are too close together can be a nightmare for mobile visitors trying to navigate your site.
  • Interstitial usage: Google will penalise you if you’re using a full-screen interstitial pop-up to advertise an app when a user visits your mobile site.
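For the ‘Viewport not configured’ error in particular, the fix is usually a single meta tag. Here is a minimal sketch that scans raw HTML for one; the sample markup is illustrative:

```python
# Minimal sketch: scan raw HTML for a meta viewport declaration, the usual
# fix for the "viewport not configured" error. The sample HTML is illustrative.
import re

VIEWPORT_RE = re.compile(r'<meta[^>]+name=["\']viewport["\'][^>]*>', re.IGNORECASE)

def has_viewport(html):
    return bool(VIEWPORT_RE.search(html))

sample = ('<head><meta name="viewport" '
          'content="width=device-width, initial-scale=1"></head>')
print(has_viewport(sample))  # True
```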

Google Index

Index Status

This lets you know how many pages of your website are currently included in Google’s index.

Search Console Index Status

You can quickly see any worrying trends from the last year (for instance that little dip in May 2015), as well as any pages that have been blocked by robots or removed.

Content Keywords

Here you can see the most common keywords found by the Googlebots as they last crawled your site.

Search Console Content Keywords

If you click on each keyword, you’ll be able to see the other synonyms found for that keyword, as well as the number of occurrences.

As Simon Heseltine suggests, look out for unexpected, unrelated keywords showing up as it’s an indication your site may have been hacked and hidden keywords have been injected into your pages.

Blocked resources

This section lets you know of any images, CSS, JavaScript or other resources on your site that are blocked to Googlebots.

Search Console Blocked Resources

These are listed by hostname and then by specific page, with steps you can follow to diagnose and resolve the issues.

Remove URLs

This is where you can essentially make your content disappear from Google.

remove urls search console

This only acts as a temporary fix, but by the time you’ve done this and either deleted your offending webpage or 301 redirected it elsewhere, there theoretically should no longer be a record of it.

Just enter the URL then select whether you want it removed from the search results and the cache, just from the cache or if you want an entire directory removed.

Be warned: this request can take between two to 12 hours to be processed.

Crawl

Crawl Errors

This report shows all the errors that Google has found when crawling your site over the last 90 days.

Search Console Crawl Errors

Site errors: the top half of the screen shows three tabs, where if you click on each you can see any past problems with your DNS, your server connectivity or whether a crawl had to be postponed. (Google will postpone a crawl rather than risk crawling URLs you don’t want indexed).

URL errors: the bottom half of the screen shows URL errors for desktop, smartphone and feature phone (a phone that can access the internet, but doesn’t have the advanced features of a smartphone).

You’ll likely see reports for the following on all three device types:

  • Server error: Google can’t access your site because the server is too slow to respond, or because your site is blocking Google.
  • Soft 404: this occurs when your server returns a real page for a URL that doesn’t actually exist on your site. You should replace these pages with a 404 (Not Found) or 410 (Gone) return code (see the status-check sketch after this list).
  • Not found: these are all your 404 pages that occur when a Googlebot attempts to visit a page that doesn’t exist (because you deleted it or renamed it without redirecting the old URL, etc.) Generally 404 pages are fine and won’t harm your rankings, so only pay attention to the ones related to high-ranking content.
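For the soft 404 case, the quickest sanity check is to confirm what status code your server actually returns for a URL that shouldn’t exist. A minimal sketch (the URL is a placeholder):

```python
# Minimal sketch: report the HTTP status code a URL actually returns, which
# helps spot soft 404s (a "not found" page served with a 200 status).
# The URL below is a placeholder.
from urllib.request import urlopen
from urllib.error import HTTPError

def status_code(url):
    try:
        return urlopen(url).getcode()
    except HTTPError as err:
        return err.code  # e.g. 404 or 410

if __name__ == "__main__":
    print(status_code("https://example.com/this-page-should-not-exist"))
```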

Crawl Stats

This section shows the progress of Googlebots crawling your site in the last 90 days.

Search Console Crawl Stats

You can see how fast your pages are being crawled, kilobytes downloaded per day and average time spent downloading pages on your site.

Spikes are perfectly normal, and there’s not very much you can do about them. But if you see a sustained drop in any of these charts then it might be worth investigating to see what’s dragging it down.

Fetch as Google

Here you can check how any page on your website is seen by Google once it’s been crawled.

You can also submit these webpages for indexing. You may find this is a quicker way to be crawled and indexed than if you were to let Google find the page automatically.

Search Console Fetch as Google

  • When you ‘Fetch’ a page, Google will simulate a crawl and you can quickly check any network connectivity problems or security issues with your site.
  • ‘Fetch and Render’ does the same as the above, but it also lets you check how the page itself looks on mobile or desktop, including all resources on the page (such as images and scripts) and will let you know if any of these are blocked to Googlebots.

Remember the crawler is meant to see the same page as the visitor would, so this is a good way to get a direct on-page comparison.

If the page is successfully fetched and rendered, you can submit it to the index. You are allowed 500 webpage fetches per week, but you can only submit a webpage and have Google crawl ALL the pages linked within it 10 times per month.

robots.txt Editor

A robots.txt file, placed within the root of your site, is where you can specify pages you don’t want crawled by search engines. Typically this is used because you don’t want your server overwhelmed by Googlebots, particularly if you want them to ignore script or style files, or if you want certain images not to appear in Google Image Search.

Here is where you can edit your robots.txt and check for errors. The bottom of the page reveals your errors and warnings.

robots.txt editor search console
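For reference, a robots.txt file is just plain text made up of User-agent and Disallow lines (plus an optional Sitemap line). Here’s a minimal sketch that writes one; the disallowed paths and sitemap URL are placeholder examples, not recommendations:

```python
# Minimal sketch: write a simple robots.txt to the current directory (it
# belongs in the site root). Paths and sitemap URL are placeholder examples.
ROBOTS_TXT = """\
User-agent: *
Disallow: /scripts/
Disallow: /private-images/

Sitemap: https://example.com/sitemap.xml
"""

with open("robots.txt", "w", encoding="utf-8") as fh:
    fh.write(ROBOTS_TXT)
```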

Sitemaps

Sitemaps are hosted on the server of your website and they basically inform search engines of every page of your site, including any new ones added. It’s a good way to let Google better crawl and understand your website.

Here’s where you can access all of the information about any sitemaps either submitted manually or found by Search Console. The blue bar represents pages or images submitted, the red bar represents actual pages and images indexed.

sitemaps search console

You can test a sitemap by clicking the ‘Add/Test sitemap’ button, and if it’s valid you can then add it to Search Console.
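If you need to create a sitemap by hand, the format itself is simple XML following the sitemaps.org protocol. Here’s a minimal sketch that generates one; the URLs and dates are placeholder examples:

```python
# Minimal sketch: generate a bare-bones XML sitemap following the
# sitemaps.org protocol. URLs and dates are placeholder examples.
from xml.sax.saxutils import escape

PAGES = [
    ("https://example.com/", "2016-05-01"),
    ("https://example.com/about/", "2016-04-20"),
]

def build_sitemap(pages):
    entries = []
    for loc, lastmod in pages:
        entries.append(
            "  <url>\n"
            "    <loc>%s</loc>\n"
            "    <lastmod>%s</lastmod>\n"
            "  </url>" % (escape(loc), lastmod)
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + "\n".join(entries)
        + "\n</urlset>"
    )

if __name__ == "__main__":
    print(build_sitemap(PAGES))
```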

URL Parameters

As Simon Heseltine has previously commented, this section isn’t used much anymore since the introduction of canonical tags.

However you should use URL Parameters if, for instance, you need to tell Google to distinguish between pages targeted to different countries. These preferences can encourage Google to crawl a preferred version of your URL or prevent Google from crawling duplicate content on your site.

URL parameters

Security Issues

Although any security issues will be communicated with you in the Messages section and on the Dashboard screen, here’s where you can check on problems in more detail.

Search Console Security Issues

There’s also plenty of accessible information here about how to fix your site if it’s been hacked or been infected with malware.

Other Resources

Here’s where you can access all the tools provided by Google outside of Search Console, including the Structured Data Testing Tool and Markup Helper, which we covered in greater detail in earlier sections.

Search Console Other Resources

Other helpful resources here include the Google My Business Center, which you can use to improve your business’s local search visibility, and the PageSpeed Insights tool, which will tell you exactly how well your site is performing on mobile and desktop in terms of loading time, and how to fix any issues.

from Search Engine Watch https://searchenginewatch.com/2016/05/09/google-search-console-a-complete-overview/

Local SEO: Key challenges and tips from #ClickZChat

In previous ClickZChat sessions we’ve largely covered content and platforms, but seeing as it’s a Twitter event held by both ClickZ AND Search Engine Watch, it seemed only right that we spend some time looking at search in more depth.

This week we took to Twitter for an hour to ask our followers for their local SEO challenges and solutions. Here’s everything we learned:

Question 1: What are the biggest challenges you face when optimizing for local search?

  • SPAM

Several users (including me) mentioned spam being a much bigger issue for local ranking, with maps being particularly open to abuse, and Search Engines slower to act on this than in other cases:

  • Citations

Many people also felt that citations were a hassle for a variety of reasons.

  • Resources

Indeed, the issue of keeping up to date was seen as a major challenge. Data is often fragmented and many organisations with several locations do not have the time or resources to roll out best practice – or even standard practice – to all location listings, with local stores and outlets being left to fend for themselves:

This issue is compounded when you consider the lack of SEO expertise on site. In many cases it simply isn’t considered an issue.

With that said, it was also felt that this state of affairs meant there were big opportunities for those businesses that are getting it right, with small changes making a big difference.

Q2: What are the absolute essentials for a decent local SEO presence?

This is where those quick wins we mentioned really come into their own. As our own Graham Charlton mentioned, not enough businesses are taking time to claim their Google Business listings:

Of course, once you do start claiming listings, you need to have a consistent data structure in mind. Google will focus on listings that are formatted correctly in multiple locations:

Once you have your listings in order, there’s also a big case to be made for (you guessed it) content. While there’s no doubt that technical optimisation plays a huge part, it is worth remembering that with so many local searches taking place on mobile devices, user intent is the primary motivator.

  • Reviews

This of course brings us into the realm of reviews, a hugely important component for local business. Even if you lack the resources to optimise your listings properly, this can still make you stand out to a certain extent:

Finally our very own Andrew Warren-Payne mentioned this useful list from Moz, very helpful if you want to get organised:

Q3: What one local SEO tip has proven the most successful for you?

We had a rush of great suggestions to this question, so I’ve pulled them into a quick reference list of ‘Golden rules for Local SEO’ for you:

1: Build on your past success
Leverage existing product content. Marketing reaches across the aisle to customer success. #SEO improves – via @colincrook

‬2: Be as focused as you can on the needs of the local customer
We created separate web pages for each of the locations, with unique content & optimised them with local keywords – @anshikamails

3: Good local SEO takes time. Make time to maintain it
Build citations. An oldie but a goldie. https://www.brightlocal.com/2013/09/11/top-50-local-citation-sites/ – @Lexx2099

4: But just doing what you can will help

In some areas, just the basics of listings and data are enough if your competitors aren’t up to speed. #ClickZChat – @gcharlton

5: Remember that Google services are linked together. Focus on the bigger picture
Google business page creation and posting in G+ page. – @shaileshk

Be sure to publish FROM Google+ TO OTHER platforms.. #ClickZChat – @steveplunkett

6: Get your data in order

Site 1st with NAP for all locations, Category, Description, Social, reviews, schema & repeat in citations #ClickZChat – @rajnijjer

7: And make sure you never stop learning

SEO changes so fast that it’s hard for anything to be easy! :p Best advice: stay aware & current on industry trends! – @hilph

8: Remember why people are searching in the first place

And of course, we can always rely on Search Engine Watch’s editor to chime in with some useful advice…

That’s it for this week. A huge thank you as always to everyone who took part. We’ll be holding another chat this Wednesday at 12 noon Eastern Time.

For more on Local SEO, check out Graham Charlton’s handy list of 30 quick and easy SEO tips for small businesses.

from Search Engine Watch https://searchenginewatch.com/2016/05/09/local-seo-key-challenges-and-tips-from-clickzchat/

Should you republish old content?

Lots of content you write is timeless. One year from now this article about republishing old content will still be as valid as it is today. Still, if you don’t share it or talk about it, very few people will notice it. A way to make sure your content won’t be forgotten is to republish it. But what’s a smart strategy for that? You don’t want to annoy your audience with old news. In this post, I’ll talk you through different ways to republish your old content.

Republish old content

Why would you republish old content?

A lot of content is valid for a longer period of time, and your audience changes and grows. Things you wrote a year ago probably won’t be read by your new audience. So it’s a waste of quality copy if you publish it only once.

Moreover, sometimes an article or blog post isn’t picked up properly the first time. Maybe your timing was off: an article published in summer perhaps got little attention because of a very hot day. If you share posts on Facebook, you’ll most definitely notice that some posts are shared and liked much more than others. The reason some posts are picked up by a large audience, while others aren’t, isn’t necessarily related to the quality of your post. Republishing can be a way to give your content a second chance to reach your audience.

Make sure your content is up to date!

The most important advice on republishing old content is that you should never republish anything that isn’t up to date. Nobody wants to read something that is out of date or no longer applicable. So before republishing, you should do some reviewing and possibly some re-writing!

Republishing content with minor changes

If you want to republish an article in which you’ve made minor changes, we would advise you to change the last modified date. That way, people are able to see when the article or post was altered last. It’s instantly clear that the information is still up to date.

We would advise you to hide the comments on a post you republish. It just looks weird if you push out an article with comments that were made a year earlier.

When updated, push out your article using social media like Facebook and Twitter. Or write about the article in your newsletter. You can mention that you wrote this post some time ago and that the information is still very useful. You can also choose to treat the republished post as a normal post and do the things you normally do to draw attention to new content.

Republishing old content with major changes

Sometimes you’ll need to make major adjustments to articles. Things can change entirely, making your old article rather useless. Or, your opinion or advice on how to handle certain things might change.

If you make big changes to an article, rewriting the entire text, we would advise you to publish it as if it were new content. You’ll then change the date of the article. Changing the date will enable you to keep all of the links from other websites to your original post (which is great for SEO of course).

Conclusion

Republishing can be a great way to get extra attention for those great articles you wrote some time ago. Make sure to keep those articles up to date, though. And don’t go overboard! You shouldn’t republish your articles every other week. If people notice you’re publishing the same blog post again and again, they’ll definitely get annoyed!

Read more: ‘10 tips for an awesome and SEO-friendly blog post’ »

from Yoast • The Art & Science of Website Optimization https://yoast.com/republish-old-content/

Seven most interesting search marketing news stories of the week

Welcome to our weekly round-up of all the latest news and research from around the world of search marketing and beyond.

This week we have a bountiful collection of news, a heaving trove of stats and a swollen haul of insight from the last seven days.

These adjectives will make more sense in about one headline’s time.

Google AI is improving its conversational skills with… romance novels

Yep.

my fair viking

According to Buzzfeed – YES that’s where we get our intel from – for the past few months, Google has been feeding text from romance novels into its AI engine because they’ve determined that “parsing the text of romance novels could be a great way of enhancing the company’s technology with some of the personality and conversational skills it lacks.”

Buzzfeed also reports that the plan seems to be working. “Google’s research team recently got the AI to write sentences that resemble those in the books.”

So expect your next Google search for a well reviewed local restaurant to include at least 12 synonyms for ‘throbbing’.

Google will launch redesigned AdWords on May 24th

And you can watch the launch live, if that’s the sort of thing you like doing with your time.

AdWords is being redesigned for the mobile-first market and will fall into line aesthetically with Google’s recently launched 360 suite.

Here’s a sneak peek:

Redesigned AdWords

You can get an early demo during the Google Performance Summit livestream on May 24th at 9:00am PT/12:00pm ET, which you can sign-up for here.

Seats are being booked up fast though, so hurry.

Jk, it’s on the internet. You just need to nudge the cat off the sofa.

Half of SEOs are either unaware of AMP or only have a “passing awareness”

As Rebecca Sentance reported this week, SEOs have been slow to implement Google’s accelerated mobile pages (AMP) in the two months since its official launch, despite the promise that AMP is an important ranking signal.

A survey, carried out by SEO PowerSuite, looked at awareness and uptake of AMP among 385 SEO professionals in North America and Europe. Of the respondents surveyed, less than a quarter (23%) had implemented AMP for their mobile sites.

Although general awareness of Accelerated Mobile Pages was high – 75% of the SEO professionals surveyed were aware of AMP – 21% said they were only aware of it “in passing.”

A column graph showing awareness of AMP among the SEOs surveyed: 21% were aware of AMP “in passing”, 35% had done some research into AMP, 18% had done a lot of research, and 25% were not aware of AMP.

Of those SEOs who hadn’t yet begun to implement AMP on their mobile sites, only 29% said they would do so in the next six months, and 5% of respondents said they had no intention of supporting AMP on their mobile sites whatsoever.

180% increase in websites being hacked in 2015

This week we reported on Google’s fight against webspam in 2015, revealing the following info:

  • An algorithmic update helped reduce the amount of webspam in search results, impacting 5% of queries.
  • Google sent more than 4.3 million messages to webmasters notifying them of manual actions it had imposed on sites affected by spam.
  • Google saw a 33% increase in the number of sites that went through a spam clean-up “towards a successful reconsideration process.”

Most worrying of all was the massive 180% increase in hacking from 2014. If you haven’t already, it’s time to seriously think about the security of your website.

Google is moving all blogspot domain blogs to HTTPS

This week Google introduced an HTTPS version for every blogspot domain blog, meaning that visitors can access any blogspot domain blog over an encrypted channel.


Google has also removed the HTTPS Availability setting and all blogs will automatically have an HTTPS version enabled. So you don’t have to do a thing, and you may even get a little traffic boost, since serving over HTTPS is a ranking signal.

Google has taken action against sneaky mobile redirects

To tackle the trend of websites redirecting mobile users to spammy, unrelated domains, Google has begun issuing manual penalties against sites that sneakily redirect users in this way.


If your site has been affected, Google offers help on getting rid of these redirects to clean up your site and hopefully avoid further action.
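If you’d rather check for yourself before a penalty arrives, you can compare where the same URL ends up when requested with a desktop versus a mobile user agent. Here’s a minimal sketch, assuming Python with the requests package; the URL and user-agent strings are only illustrative.

```python
import requests

# Illustrative user-agent strings - any current desktop and mobile UA will do.
DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
MOBILE_UA = "Mozilla/5.0 (iPhone; CPU iPhone OS 9_3 like Mac OS X) AppleWebKit/601.1"

def final_url(url, user_agent):
    """Follow redirects and return the URL the visitor actually lands on."""
    response = requests.get(url, headers={"User-Agent": user_agent},
                            allow_redirects=True, timeout=10)
    return response.url

if __name__ == "__main__":
    page = "https://example.com/some-article"  # placeholder
    desktop = final_url(page, DESKTOP_UA)
    mobile = final_url(page, MOBILE_UA)
    if desktop != mobile:
        print(f"Possible sneaky redirect: desktop -> {desktop}, mobile -> {mobile}")
    else:
        print("Desktop and mobile visitors land on the same URL.")
```

Note that this only catches server-side redirects; redirects injected through JavaScript or rogue third-party scripts won’t show up in a plain HTTP request, so a check from a real mobile device is still worthwhile.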

Moz has introduced a new, free to use, keyword research tool

The new Keyword Explorer, launched by Moz this week, can help take you all the way through the keyword research process. It has a variety of useful metrics, including an estimate of the relative CTR of organic results, and it surfaces results from almost all of the popular sources used by SEOs.

And best of all, you can run 2 free searches per day without logging in, another five with a free community account, and if you’re a Pro subscriber you already have full access.

(Screenshot: Moz Keyword Explorer)

from Search Engine Watch https://searchenginewatch.com/2016/05/06/seven-most-interesting-search-marketing-news-stories-of-the-week/
via Auto Feed

Positioning your shop in the online market

Successful positioning adds value to your business and gives you a head start on the competition. Positioning is the art of distinguishing your business from others in the minds of your customers. You can make your webshop stand out with high product quality, great service, low prices or dedicated care for the environment. But it’s equally important to communicate this distinctive factor to your target group. Your position lives in their minds. In this post I’ll help you construct your desired position for your webshop.


The fifth P

Every marketing expert in the world knows the name Philip Kotler. And even if you don’t know that name, you must have heard of the four P’s: product, price, place and promotion. These were the core of every marketing strategy when I studied marketing decades ago. Since then, many have added their own extra P’s, like people and purpose. Philip Kotler himself mentions another P as well: Positioning.

Definition of positioning

Kotler defines positioning as:

“the act of designing the company’s offering and image to occupy a distinctive place in the mind of the target market. The end result of positioning is the successful creation of a customer-focused value proposition, a cogent reason why the target market should buy the product.”
(Philip Kotler: Marketing Management, 2003)

This is closely related to finding your niche market. In my post about finding your shop’s niche, I explained how a product and target audience can be considered shop shapers. You can build an entire shop just based on the right product and the right market. Since positioning is about finding your spot in the mind of the target market, it’s clear that emotions play a part as well.

Questions to ask yourself

If you want to position your shop, it might help to ask yourself some questions:

  • Who is my ideal customer? Not in terms of budget, but in terms of values.
  • What are my personal values and how do these relate to my products or company?
  • What do I consider the core competences of my company and how can I make these visible?
  • What brands do I like and how would people associate my company with these brands?
  • What are the current trends in my market and what can my products contribute to those?

These questions aren’t that simple to answer. It’s quite heavy stuff, come to think of it, especially since it’s almost all about emotions. But thinking about these topics can help you find your shop’s position.

Construct your shop’s position

There is a simple way to construct your position. First define the following variables:

  1. Company name
  2. Product
  3. Target market
  4. Needs of your target market
  5. Distinctiveness of your company

That might require some research, and perhaps you haven’t thought about a number of these variables. But when you have defined them, your brand position will be something like this:

[Company] supplies [product] to [target market], looking for [needs]. [Company] distinguishes itself from competitors by [distinctiveness].

Some examples

This is quite a strict format; you should of course adapt it to fit you as a person or your company. Let’s look at some possible examples for well-known companies.

Coca-Cola

Cola is popular worldwide and is liked by people of all age groups, while Diet Coke targets the niche segment of people who are more health conscious. Coca-Cola uses a competitive positioning strategy to stay way ahead of its competitors in the non-alcoholic beverages market.
(Source: Marketing91.com)

Patagonia

Build the best product, cause no unnecessary harm, use business to inspire and implement solutions to the environmental crisis for males, females, and children, at any age that love the outdoors. Patagonia calls out other companies with “environmental initiatives” to beat theirs.
(Source: Adventures in Branding)

Body Shop

The Body Shop expects its customers to view its products as beauty products with great quality, from a trustworthy brand. The fact that its products have a compelling natural, ethical and environmental story is an added advantage, and how it differentiates its brand from other big mainstream brands and retailers, instead of ethical or charity purchases to customers.
(Source: Natural Cosmetics Lovers)

Note that these aren’t the brand positions these companies set up in their mission statement or marketing plans. These are the positions that others imagine these companies have or had. These examples are simply here to illustrate to you what your position could be.

So what to do?

Find the elements that your desired clients would look for in a product or company. And find the areas where you want to, and are able to, distinguish yourself from your competition. Kotler refers to these as ‘points-of-parity’ and ‘points-of-differentiation’. That sums it up quite nicely, I think.

Positioning is the first thing to do, and creating buzz should be the second. I strongly agree with that. Tell the world about your brand position! Use your blog, use social media, even use your site design to express your values and position your (company and) products in an online market with competition from all over the world.

Make sure your buzz is related to your products. Animal testing and the environment could be topics for your blog, if you want to position your company as conscious. Write about promotions and other sales if your desired position is to be the cheapest online perfume outlet ever. Positioning is about distinctiveness and relevance.

Over to you

What about your shop? Do you have a hard time constructing your shop’s position? Or do you manage to occupy a “distinctive place in the mind of the target market”? Share your experience in the comments below!

Read more: ‘Find your shop’s niche’ »

from Yoast • The Art & Science of Website Optimization https://yoast.com/positioning-your-shop-online/
via KCG Auto Feed

Do bounce rates affect a site’s search engine ranking?

The bounce rate debate continues…

Bounce rates and how they affect a website’s ranking on Google has been discussed, dissected, and dismembered over and over again.

As fully transcribed on this site, a conversation between Rand Fishkin, CEO of Moz, and Andrey Lipattsev, Google’s search quality senior strategist, led to a surprising discussion on click and bounce rates affecting search rankings.

Rand stated that he has recently been running a few experimental tests with various crowds of 500 to a couple thousand people.

Everyone participating was prompted to take out their cellphones, laptops, and digital what-have-yous and perform a specific search. Once the search listing appeared, he had everyone in the crowd click one of the listings at the bottom of the results page and then click away from that site. He then monitored the results over the next few days.

Rand found a whole bunch of inconsistencies. In a little more than half of the experiments, the ranking did change on the search engine results page (SERP), and in a little less than half of the experiments, the rankings did not change.

This raises the question:

Do bounce rates affect a site’s search engine ranking? If so, how much?

Lipattsev believes that for each individual search query in the experiment, the generated interest regarding those specific searches impacts the rankings change rather than just the clicks and bounces.

He said that if a certain topic is gaining a substantial amount of searches and an increase in social media mentions, Google would pay more attention to that rather than a site getting more clicks.

Lipattsev says that it is certainly doable to determine exactly what causes a large rankings jump for an individual listing, but Internet-wide, it is much more difficult.

All this being said, what actually is a bounce rate?

The bounce rate is the percentage of visitors to a particular site who navigate or “bounce” away after only viewing that individual webpage.
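In other words, it’s a simple ratio: single-page sessions divided by total sessions. A tiny worked example, with made-up numbers:

```python
# Bounce rate: the share of sessions that viewed only a single page.
def bounce_rate(single_page_sessions, total_sessions):
    """Return the bounce rate as a percentage of all sessions."""
    if total_sessions == 0:
        return 0.0
    return 100.0 * single_page_sessions / total_sessions

# Hypothetical figures: 420 of 1,000 sessions left after viewing one page.
print(bounce_rate(420, 1000))  # -> 42.0, i.e. a 42% bounce rate
```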

Usually, the term ‘bounce rate’ has a negative connotation associated with it. People think that if a visitor only visits one page and then leaves, it’s bad for business. Their logic isn’t that flawed, either. After all, a high bounce rate would indicate that a site does not have the high-quality, relevant content Google wants out of its top ranked sites.

A great Search Engine Journal article shows nine negative reasons why your website could potentially have a high bounce rate, including poor web design, incorrect keyword selection, improper links, and just bad content. It’s true that these high bounce rates can reflect poorly on a website… sometimes.

So, what gives?

Having a high bounce rate on something like a ‘contact us’ page can actually be a good thing. That’s more of a call-to-action page: the goal is for the user to find the contact information and then actually contact the business. The visitor got what they came for and then left. Extra navigation around the website doesn’t really mean anything in this case.

Of course, if your site is more content-driven or offers a product or service, then your goal should be to have a higher click-through rate (CTR) and more traffic to each page.


But what about Google?

Does Google know your bounce rate and are they using it to affect rankings? This Search Engine Roundtable article provides the short answer (which is “no”).

Many organizations don’t use Google Analytics, so Google has no way of tracking their bounce rate information. And even with the analytics that they can trace, it’s difficult to determine what they actually mean because every situation is different.

There are many factors that go into determining how long a visitor stays on a particular webpage. If a visitor remains on a site for over 20 minutes, they could be so engaged with your site’s content that they can’t even imagine leaving your wonderful webpage… or… it could mean they fell asleep at the screen because your website was so boring. It’s too difficult to tell.

If you are operating one of those websites that should have a lower bounce rate, these tips on lowering that number should be able to help. Some highlights include making sure each of your pages loads quickly, offers user-friendly navigation, avoids cluttered advertisements, and features quality content!
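The ‘loads quickly’ part is the easiest one to start measuring yourself. As a rough first pass, here’s a minimal sketch that times the HTML response for a handful of pages, assuming Python with the requests package; the URLs are placeholders. It doesn’t capture images, scripts or rendering, so treat it as a smoke test rather than a full page-speed audit.

```python
import requests

# Placeholder URLs - swap in your own pages.
PAGES = [
    "https://example.com/",
    "https://example.com/blog/",
    "https://example.com/contact/",
]

for url in PAGES:
    response = requests.get(url, timeout=15)
    # response.elapsed measures the time until the server's response arrived.
    print(f"{url}: HTTP {response.status_code} in {response.elapsed.total_seconds():.2f}s")
```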

If bounce rates don’t affect Google’s rankings as much as you thought, you might wonder how significant other ranking factors are. Well, Google recently revealed that magical information. They narrowed it down to the three top ranking factors used by Google to drive search results:

  • Links: strong links and link votes play a major role in search rankings.
  • Content: having quality content is more important than ever.
  • RankBrain: Google’s AI ranking system.

It’s no shock that links and content matter, but RankBrain is still relatively new. It’s Google’s new algorithm to help determine search results (after factoring in links and content). RankBrain interprets longer, more complex searches and relates them to shorter, more familiar ones, while preserving the meaning of the original query, thus refining the results.

Google’s newest AI technology – and whatever other secret technologies they are working on – may resolve the never-ending debate over bounce rates, but it’s certainly going to be a difficult process.

More research is to come, and Andrey acknowledges that click and bounce data is “gameable,” which is the challenge in turning it into a strong, measurable signal; Google still has a long way to go.

“If we solve it, good for us,” Andrey said, “but we’re not there yet.”

There is no one-size-fits-all answer when it comes to SEO and all its intricacies. The greatest answer to any SEO question is always “it depends.”

from Search Engine Watch https://searchenginewatch.com/2016/05/04/do-bounce-rates-affect-a-sites-search-engine-ranking/
via Auto Feed

How Google fights webspam and what you need to learn from this

Google has this week revealed its annual report on how it has policed the internet over the last 12 months. Or at least how it policed the vast chunk of the internet it allows on its results pages.

Although it’s self-congratulatory stuff, and as much as you can rightfully argue with some of Google’s recent penalties, you do need to understand what Google is punishing in terms of ‘bad quality’ internet experiences so you can avoid the same mistakes.

It’s important to remember that Google for some people IS the internet, or at least the ‘front door’ to it (sorry Reddit), but it’s equally important to remember that Google is still a product; one that needs to make money to survive and (theoretically) provide the best possible experience for its users, or else it is off to DuckDuckGo they… uh… go.

So therefore Google has to ensure the results it serves on its SERPs (search engine results pages) are of the highest quality possible. Algorithms are built and manual reviews by actual human beings are carried out to ensure crappy websites with stolen/thin/manipulative/harmful content stay hidden.

Here’s how Google is currently kicking ass and taking names… and how you can avoid landing in its crosshairs.


How Google fought webspam

According to Google, an algorithmic update helped reduce the amount of webspam in search results, impacting 5% of queries.

The remaining spam was tackled manually. Google sent more than 4.3 million messages to webmasters notifying them of manual actions it had imposed on sites affected by spam.

Following this, Google saw a 33% increase in the number of sites that went through a spam clean-up “towards a successful reconsideration process.” It’s unclear whether the remaining sites are still in the process of appealing, or have been booted off the face of the internet.

Who watches the watchmen?

More than 400,000 spam reports were manually submitted by Google users around the world. Google acted on 65% of them, and considered 80% of those acted upon to be spam.

Hacking

There was a huge 180% increase in websites being hacked in 2015, compared to the previous year. Hacking can take on a number of guises, whether it’s website spam or malware, but the result will be the same: you’ll be placed ‘in quarantine’ and your site will be flagged or removed.

Google has a number of official guidelines on how to help avoid being hacked. These include:

  • Strengthen your account security with lengthy, hard-to-guess passwords, and don’t reuse those passwords across platforms (a small password-generation sketch follows this list).
  • Keep your site’s software updated, including its CMS and various plug-ins.
  • Research how your hosting provider handles security issues and check its policy when it comes to cleaning up hacked sites. Will it offer live support if your site is compromised?
  • Use tools to stay informed of potential hacked content on your site. Signing up to Search Console is a must, as it’s Google’s way of communicating any site issues with you.
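On the password point, you don’t have to invent long passwords yourself; a password manager will do it, and so will a few lines of Python. A minimal sketch using the standard-library secrets module (the length and character set here are arbitrary choices):

```python
import secrets
import string

def generate_password(length=24):
    """Generate a long random password from letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

print(generate_password())  # store the result in a password manager, not a sticky note
```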


Thin, low quality content

Google saw an increase in the number of sites with thin, low quality content, a substantial amount likely to be provided by scraper sites.

Unfortunately there is very little you can do if your site is being scraped, as Google has discontinued its reporting tool and believes this problem to be your own fault. You just have to be confident that your own site’s authority, architecture and remaining content are enough to ensure it ranks higher than a scraper site.

If you have been served a manual penalty for ‘thin content with little or no added value’ there are things you can do to rectify it, which can mostly be boiled down to ‘stop making crappy content, duh’.

1) Start by checking your site for the following (a rough word-count sketch for spotting thin pages follows these steps):

  • Auto-generated content: automatically generated content that reads like it was written by a piece of software because it probably was.
  • Thin content pages with affiliate links: affiliate links in quality articles are fine, but pages that only contain descriptions or reviews copied directly from the original retailer, with no added original content, are bad. As a rule, affiliate content should form only a small part of your site.
  • Scraped content: if you’re a site that automatically scrapes and republishes entire articles from other websites without permission then you should just flick the off-switch right away.
  • Doorway pages: these are pages which can appear multiple times in a particular query’s search results but ultimately lead users to the same destination. The purpose of doorway pages is purely to manipulate rankings.

2) Chuck them all in the bin.

3) If after all that you’re 100% sure your site somehow offers value, then you can resubmit to Google for reconsideration.
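If you want a rough, automated first pass at spotting thin pages before that manual review, counting the visible words on each page is a crude but workable heuristic; a very low count just flags a page for a closer look, it doesn’t prove the content is thin. A minimal sketch, assuming Python with the requests package; the URLs and threshold are placeholders.

```python
import re
import requests

THIN_THRESHOLD = 250  # words; an arbitrary cut-off for "worth a manual look"

def visible_word_count(url):
    """Crudely count the words left after stripping scripts, styles and tags."""
    html = requests.get(url, timeout=10).text
    html = re.sub(r"(?is)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"(?s)<[^>]+>", " ", html)
    return len(text.split())

if __name__ == "__main__":
    for page in ["https://example.com/page-1", "https://example.com/page-2"]:
        words = visible_word_count(page)
        if words < THIN_THRESHOLD:
            print(f"Possibly thin: {page} ({words} words)")
```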

For more information on Google’s fight against webspam, read its official blog-post.

And finally, I’ll leave you with this terrifying vision of things to come…

(Image: robots and people)

from Search Engine Watch https://searchenginewatch.com/2016/05/04/how-google-fights-webspam-and-what-you-need-to-learn-from-this/
via Auto Feed

The reviews are on sale! And more news

Do you need to improve the SEO of your website? Do you want our Yoast SEO experts to thoroughly analyze the SEO of your website? You should definitely order a website review now! Our Gold SEO reviews are on sale until May 18 and will cost only $599 instead of $799.

Sale on our SEO reviews

Change in types of SEO reviews

We’ve decided to simplify our assortment of SEO reviews a bit. Up until now you could choose between four types of reviews: Silver, Gold, Diamond and Platinum. As of today, we’ll offer two types of reviews. You can choose our Gold SEO review in which we give lots of practical advice. Or, you can choose our Platinum SEO review, which is a full audit of your website. The Gold SEO review is on sale and costs only $599 (instead of $799). Our Platinum SEO review costs $2999.

Upcoming: Yoast Consulting project

As of next month, we’ll offer a new type of review. At Yoast, we regularly get questions from people who need more guidance in SEO than our reviews can give them. Also, our SEO team likes to carry out more in-depth SEO projects. They love to really dive into a website and give high quality and personal advice. That’s why, as of next month, we’ll start offering Yoast Consulting projects.

In a Yoast Consulting project, we’ll look at every aspect of your website with our complete SEO team! This team consists of Joost, Michiel, Annelieke, Judith, Jaro, Michelle, Patrick and Meike. You’ll receive a complete analysis and many practical tips. We’ll start with an intake meeting by Skype (or you can come by our office in the Netherlands). Later we’ll also have a Skype follow-up meeting, to make sure you’re completely satisfied. A Yoast Consulting project will cost $10,000. We’ll only do one Yoast Consulting project a month, as it will take much of our time. If you’re interested in purchasing a Yoast Consulting project, make sure to contact us.

Read more: ‘What our website reviews can do for you’ »

from Yoast • The Art & Science of Website Optimization https://yoast.com/reviews-sale-news/
via KCG Auto Feed