The short version:
The long version: Inside the invisible government: war, propaganda, Clinton & Trump
Or, if you prefer video:
from SEO Book http://www.seobook.com/im-her
via KCG Auto Feed
Remember the whole shtick about good, legitimate, high-quality content being created for readers without concern for search engines – even as though search engines do not exist?
Whatever happened to that?
We quickly shifted from the above “ideology” to this:
The red triangle/exclamation point icon was arrived at after the Chrome team commissioned research around the world to figure out which symbols alarmed users the most.
Google is explicitly spreading the message that they are doing testing on how to create maximum fear to try to manipulate & coerce the ecosystem to suit their needs & wants.
At the same time, the Google AMP project is being used as the foundation of effective phishing campaigns.
Scare users off of using HTTP sites AND host phishing campaigns.
Killer job Google.
Someone deserves a raise & some stock options. Unfortunately that person is in the PR team, not the product team.
I’d like to tell you that I was preparing the launch of https://amp.secured.mobile.seobook.com but awareness of past ecosystem shifts makes me unwilling to make that move.
I see it as arbitrary hoop jumping not worth the pain.
If you are an undifferentiated publisher without much in the way of original thought, then jumping through the hoops makes sense. But if you deeply care about a topic and put a lot of effort into knowing it well, there’s no reason to do the arbitrary hoop jumping.
Remember how mobilegeddon was going to be the biggest thing ever? Well I never updated our site layout here & we still outrank a company which raised & spent 10s of millions of dollars for core industry terms like [seo tools].
Though it is also worth noting that after factoring in increased ad load with small screen sizes & the scrape graph featured answer stuff, a #1 ranking no longer gets it done, as we are well below the fold on mobile.
In the above example I am not complaining about ranking #5 and wishing I ranked #2, but rather stating that ranking #1 organically has little to no actual value when it is a couple screens down the page.
Google indicated their interstitial penalty might apply to pop ups that appear on scroll, yet Google welcomes itself to installing a toxic enhanced version of the Diggbar at the top of AMP pages, which persistently eats 15% of the screen & can’t be dismissed. An attempt to dismiss the bar leads the person back to Google to click on another listing other than your site.
As bad as I may have made mobile search results appear earlier, I was perhaps being a little too kind. Google doesn’t even have mass adoption of AMP yet & they already have 4 AdWords ads in their mobile search results, AND when you scroll down the page they are testing an ugly “back to top” button which outright blocks a user’s view of the organic search results.
What happens when Google suggests what people should read next as an overlay on your content & sells that as an ad unit where if you’re lucky you get a tiny taste of the revenues?
Is it worth doing anything that makes your desktop website worse in an attempt to try to rank a little higher on mobile devices?
Given the small screen size of phones & the heavy ad load, the answer is no.
I realize that optimizing a site design for mobile or desktop is not mutually exclusive. But it is an issue we will revisit later on in this post.
Many people new to SEO likely don’t remember the importance of using Google Checkout integration to lower AdWords ad pricing.
You either supported Google Checkout & got about a 10% CTR lift (& thus 10% reduction in click cost) or you failed to adopt it and got priced out of the market on the margin difference.
And if you chose to adopt it, the bad news was you were then spending yet again to undo it when the service was no longer worth running for Google.
How about when Google first started hyping HTTPS & publishers using AdSense saw their ad revenue crash because the ads were no longer anywhere near as relevant?
Not like Google cared much, as it is their goal to shift as much of the ad spend as they can onto Google.com & YouTube.
It is not an accident that Google funds an ad blocker which allows ads to stream through on Google.com while leaving ads blocked across the rest of the web.
Android Pay might be worth integrating. But then it also might go away.
It could be like Google’s authorship. Hugely important & yet utterly trivial.
Faces help people trust the content.
Then they are distracting visual clutter that needs to be expunged.
Then they once again re-appear but ONLY on the Google Home Service ad units.
They were once again good for users!!!
Neat how that works.
Or it could be like Google Reader. A free service which defunded all competing products & then was shut down because it didn’t have a legitimate business model, having been built explicitly to prevent competition. With the death of Google Reader, many blogs also slid into irrelevancy.
Their FeedBurner acquisition was icing on the cake.
Techdirt is known for generally being pro-Google & they recently summed up FeedBurner nicely:
Thanks, Google, For Fucking Over A Bunch Of Media Websites – Mike Masnick
Ultimately Google is a horrible business partner.
And they are an even worse one if there is no formal contract.
When Google routinely acts so anti-competitive & abusive it is no surprise that some of the “standards” they propose go nowhere.
You can only get screwed so many times before you adopt a spirit of ambivalence to the avarice.
Google is the type of “partner” that conducts security opposition research on their leading distribution partner, while conveniently ignoring nearly a billion OTHER Android phones with existing security issues that Google can’t be bothered with patching.
Deliberately screwing direct business partners is far worse than coding algorithms which belligerently penalize some competing services all the while ignoring that the payday loan shop funded by Google leverages doorway pages.
BackChannel recently published an article foaming at the mouth promoting the excitement of Google’s AI:
“This 2016-to-2017 Transition is going to move us from systems that are explicitly taught to ones that implicitly learn.” … the engineers might make up a rule to test against—for instance, that “usual” might mean a place within a 10-minute drive that you visited three times in the last six months. “It almost doesn’t matter what it is — just make up some rule,” says Huffman. “The machine learning starts after that.”
The part of the article I found most interesting was the following bit:
After three years, Google had a sufficient supply of phonemes that it could begin doing things like voice dictation. So it discontinued the [phone information] service.
Google launches “free” services with an ulterior data motive & then when it suits their needs, they’ll shut it off and leave users in the cold.
As Google keeps advancing their AI, what do you think happens to your AMP content they are hosting? How much do they squeeze down on your payout percentage on those pages? How long until the AI is used to recap / rewrite content? What ad revenue do you get when Google offers voice answers pulled from your content but sends you no visitor?
A recent Wall Street Journal article highlighting the fast ad revenue growth at Google & Facebook also mentioned how the broader online advertising ecosystem was doing:
Facebook and Google together garnered 68% of spending on U.S. online advertising in the second quarter—accounting for all the growth, Mr. Wieser said. When excluding those two companies, revenue generated by other players in the U.S. digital ad market shrank 5%
The issue is NOT that online advertising has stalled, but rather that Google & Facebook have choked off their partners from tasting any of the revenue growth. This problem will only get worse as mobile grows to a larger share of total online advertising:
By 2018, nearly three-quarters of Google’s net ad revenues worldwide will come from mobile internet ad placements. – eMarketer
Media companies keep trusting these platforms with greater influence over their business & these platforms keep screwing those same businesses repeatedly.
You pay to get likes, but that is no longer enough as edgerank declines. Thanks for adopting Instant Articles, but users would rather see live videos & read posts from their friends. You are welcome to pay once again to advertise to the following you already built. The bigger your audience, the more we will charge you! Oh, and your direct competitors can use people liking your business as an ad targeting group.
Worse yet, Facebook & Google are even partnering on core Internet infrastructure.
Any hope of AMP turning the corner on the revenue front is a “no go”:
“We want to drive the ecosystem forward, but obviously these things don’t happen overnight,” Mr. Gingras said. “The objective of AMP is to have it drive more revenue for publishers than non-AMP pages. We’re not there yet”.
Publishers who are critical of AMP were reluctant to speak publicly about their frustrations, or to remove their AMP content. One executive said he would not comment on the record for fear that Google might “turn some knob that hurts the company.”
Look at that.
Leadership through fear once again.
At least they are consistent.
As more publishers adopt AMP, each publisher in the program will get a smaller share of the overall pie.
Just look at Google’s quarterly results for their current partners. They keep showing Google growing their ad clicks at 20% to 40% while partners oscillate between -15% and +5% quarter after quarter, year after year.
In the past quarter Google grew their ad clicks 42% YoY by pushing a bunch of YouTube auto play video ads, faster search growth in third world markets with cheaper ad prices, driving a bunch of lower quality mobile search ad clicks (with 3 then 4 ads on mobile) & increasing the percent of ad clicks on “own brand” terms (while sending the FTC after anyone who agrees to not cross bid on competitor’s brands).
The lower quality video ads & mobile ads in turn drove their average CPC on their sites down 13% YoY.
The partner network is relatively squeezed out on mobile, which makes it shocking to see the partner CPC off more than core Google, with a 14% YoY decline.
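To make that squeeze concrete, here is a rough back-of-the-envelope calculation (my own illustration, assuming revenue is roughly clicks times CPC & that the reported YoY figures combine multiplicatively; the flat partner click growth is a placeholder for the -15% to +5% oscillation mentioned above):

```python
# Rough back-of-envelope: revenue ~ clicks * CPC, reported YoY changes combined multiplicatively.
google_click_growth = 0.42    # +42% YoY ad clicks on Google owned & operated properties
google_cpc_change = -0.13     # -13% YoY average CPC on Google sites

partner_click_growth = 0.0    # placeholder: partners oscillate between roughly -15% and +5%
partner_cpc_change = -0.14    # -14% YoY partner network CPC

def revenue_growth(click_growth, cpc_change):
    """YoY revenue change implied by click growth & CPC change."""
    return (1 + click_growth) * (1 + cpc_change) - 1

print(f"Google sites revenue growth:    {revenue_growth(google_click_growth, google_cpc_change):.1%}")
print(f"Partner network revenue growth: {revenue_growth(partner_click_growth, partner_cpc_change):.1%}")
```

Even with falling CPCs, Google’s own click growth more than compensates, while the partner network has neither the click growth nor the CPC to show for it.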
What ends up happening is eventually the media outlets get sufficiently defunded to where they are sold for a song to a tech company or an executive at a tech company. Alibaba buying SCMP is akin to Jeff Bezos buying The Washington Post.
The Wall Street Journal recently laid off reporters. The New York Times announced they were cutting back local cultural & crime coverage.
If news organizations of that caliber can’t get the numbers to work then the system has failed.
The Tribune Company, already through bankruptcy & perhaps the dumbest of the lot, plans to publish thousands of AI assisted auto-play videos in their articles every day. That will guarantee their user experience on their owned & operated sites is worse than just about anywhere else their content gets distributed to, which in turn means they are not only competing against themselves but they are making their own site absolutely redundant & a chore to use.
That the Denver Guardian (an utterly fake paper running fearmongering false stories) goes viral is just icing on the cake.
These tech companies are literally reshaping society & are sucking the life out of the economy, destroying adjacent markets & bulldozing regulatory concerns, all while offloading costs onto everyone else around them.
An FTC report recommended suing Google for their anti-competitive practices, but no suit was brought. The US Copyright Office Register was relieved of her job after she went against Google’s views on set top boxes.
And in spite of the growing importance of tech, media coverage of the industry is a trainwreck:
This is what it’s like to be a technology reporter in 2016. Freebies are everywhere, but real access is scant. Powerful companies like Facebook and Google are major distributors of journalistic work, meaning newsrooms increasingly rely on tech giants to reach readers, a relationship that’s awkward at best and potentially disastrous at worst.
Being a conduit breeds exclusives. Challenging the grand narrative gets one blackballed.
Google announced they are releasing a mobile first search index:
Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we’re going to continue to build a great search experience for all users, whether they come from mobile or desktop devices.
There are some forms of content that simply don’t work well on a 350 pixel wide screen, unless they use a pinch to zoom format. But using that format is seen as not being mobile friendly.
Imagine you have an auto part database which lists alternate part numbers, price, stock status, nearest store with part in stock, time to delivery, etc. … it is exceptionally hard to get that information to look good on a mobile device. And good luck if you want to add sorting features on such a table.
There is a valid point behind the theory that using the desktop version of a page to rank mobile results is flawed, since users might click through to something which is only available on the desktop version of a site. BUT, at the same time, a publisher may need to simplify the mobile site & hide data to improve usability on small screens & then only allow certain data to become visible through user interactions. Not showing those automotive part databases to desktop users would ultimately make desktop search results worse for users by leaving huge gaps in the search results. And a search engine choosing to not index the desktop version of a site because there is a mobile version is equally short sighted. Desktop users would no longer be able to find & compare information from those automotive parts databases.
Once again money drives search “relevancy” signals.
Since Google will soon make 3/4 of their ad revenues on mobile, that should be the primary view of the web for everyone else, & alternate versions of sites which are not mobile friendly should be disappeared from the search index if a crappier lite mobile-friendly version of the page is available.
Amazon converts well on mobile in part because people already trust Amazon & already have an account registered with them. Most other merchants won’t be able to convert anywhere near as well on mobile as they do on desktop, so if you have to choose between a mobile friendly version that leaves differentiated aspects hidden or a desktop friendly version that is differentiated & establishes a relationship with the consumer, the deeper & more engaging desktop version is the way to go.
The heavy ad load on mobile search results only further combines with the low conversion rates on mobile to make building a relationship on desktop that much more important.
Even TripAdvisor is struggling to monetize mobile traffic, monetizing it at only about 30% to 33% of the rate they monetize desktop & tablet traffic. Google already owns most of the profits from that market.
Webmasters are better off NOT going mobile friendly than going mobile friendly in a way that compromises the ability of their desktop site.
Mobile-first: with ONLY a desktop site you’ll still be in the results & be findable. Recall how mobilegeddon didn’t send anyone to oblivion?— Gary Illyes (@methode) November 6, 2016
I am not the only one suggesting an over-simplified mobile design that carries over to a desktop site is a losing proposition. Consider Nielsen Norman Group’s take:
in the current world of responsive design, we’ve seen a trend towards insufficient information density and simplifying sites so that they work well on small screens but suboptimally on big screens.
Publishers are getting squeezed to subsidize the primary web ad networks. But the narrative is that as cross-device tracking improves some of those benefits will eventually spill back out into the partner network.
I am rather skeptical of that theory.
Facebook already makes 84% of their ad revenue from mobile devices where they have great user data.
They are paying to bring new types of content onto their platform, but they are only just now beginning to get around to test pricing their Audience Network traffic based on quality.
Priorities are based on business goals and objectives.
When these ad networks are strong & growing quickly they may be able to take a stand, but when growth slows the stock prices crumble, data security becomes less important during downsizing when morale is shattered & talent flees. Further, creating alternative revenue streams becomes vital “to save the company” even if it means selling user data to dangerous dictators.
The other big risk of such tracking is how data can be used by other parties.
Data is being used in all sorts of crazy ways the central ad networks are utterly unaware of. These crazy policies are not limited to other countries. Buying dog food with your credit card can lead to pet licensing fees. Even cheerful “wellness” programs may come with surprises.
from SEO Book http://www.seobook.com/securing-fear-and-mobile-monetization
via KCG Auto Feed
On Friday Google’s Gary Illyes announced Penguin 4.0 was now live.
Key points highlighted in their post are:
Things not mentioned in the post
Since the update was announced, the search results have become more stable.
No signs of major SERP movement yesterday – the two days since Penguin started rolling out have been quieter than most of September.— Dr. Pete Meyers (@dr_pete) September 24, 2016
They still may be testing out fine tuning the filters a bit…
Fyi they’re still split testing at least 3 different sets of results. I assume they’re trying to determine how tight to set the filters.— SEOwner (@tehseowner) September 24, 2016
…but what exists now is likely to be what sticks for an extended period of time.
Now that Penguin is baked into Google’s core ranking algorithms, no more Penguin updates will be announced. Panda updates stopped being announced last year. Instead we now get unnamed “quality” updates.
Earlier in the month many SEOs saw significant volatility in the search results, beginning ahead of Labor Day weekend with a local search update. The algorithm update observations were dismissed as normal fluctuations in spite of the search results being more volatile than they have been in over 4 years.
There are many reasons for search engineers to want to roll out algorithm updates (or at least test new algorithms) before a long holiday weekend:
Just about any of the algorithm volatility tools showed far more significant shift earlier in this month than over the past few days.
One issue with looking at any of the indexes is the rank shifts tend to be far more dramatic as you move away from the top 3 or 4 search results, so the algorithm volatility scores are much higher than the actual shifts in search traffic (the least volatile rankings are also the ones with the most usage data & ranking signals associated with them, so the top results for those terms tend to be quite stable outside of verticals like news).
You can use AWR’s flux tracker to see how volatility is higher across the top 20 or top 50 results than it is across the top 10 results.
I shut down our membership site in April & spend most of my time reading books & news to figure out what’s next after search, but a couple legacy clients I am winding down working with still have me tracking a few keywords & one of the terms saw a lot of smaller sites (in terms of brand awareness) repeatedly slide and recover over the past month.
Notice how a number of sites would spike down on the same day & then back up. And then the pattern would repeat.
As a comparison, here is that chart over the past 3 months.
Notice the big ranking moves which became common over the past month were not common the 2 months prior.
There is a weird sect of alleged SEOs which believes Google is omniscient, algorithmic false positives are largely a myth, AND negative SEO was never a real thing.
As it turns out, negative SEO was real, which likely played a part in Google taking years to roll out this Penguin update AND changing how they process Penguin from a sitewide negative factor to something more granular.
Part of the reason many people think there was no Penguin update, or responded to the update with “that’s it?”, is that few sites which were hit in the past recovered, relative to the number of sites which had ranked well until recently & just got clipped by this algorithm update.
When Google updates algorithms or refreshes data it does not mean sites which were previously penalized will immediately rank again.
Some penalties (absent direct Google investment or nasty public relations blowback for Google) require a set amount of time to pass before recovery is even possible.
Google has no incentive to allow a broad-based set of penalty recoveries on the same day they announce a new “better than ever” spam fighting algorithm.
They’ll let some time pass before the penalized sites can recover.
Further, many of the sites which were hit years ago & remain penalized have been so defunded for so long that they’ve accumulated other penalties due to things like tightening anchor text filters, poor user experience metrics, ad heavy layouts, link rot & neglect.
So here are some of the obvious algorithmic holes left by the new Penguin approach…
The trite advice is to make quality content, focus on the user, and build a strong brand.
But you can do all of those well enough that you change the political landscape yet still lose money.
“Mother Jones published groundbreaking story on prisons that contributed to change in govt policy. Cost $350k & generated $5k in ad revenue”— SEA☔☔LE SEO (@searchsleuth998) August 22, 2016
Google & Facebook are in a cold war, competing to see who can kill the open web faster, using each other as justification for their own predation.
And that is without getting hit by a penalty.
It is getting harder to win in search, period.
And it is getting almost impossible to win in search by focusing on search as an isolated channel.
I never understood mentality behind Penguin “recovery” people. The spam links ranked you, why do you expect to recover once they’re removed?— SEOwner (@tehseowner) September 25, 2016
Efforts and investments in chasing the algorithms in isolation are getting less viable by the day.
Obviously removing them may get you out of algorithm, but then you’ll only have enough power to rank where you started before spam links.— SEOwner (@tehseowner) September 25, 2016
Anyone operating at scale chasing SEO with automation is likely to step into a trap.
When it happens, that player better have some serious savings or some non-Google revenues, because even with “instant” algorithm updates you can go months or years on reduced revenues waiting for an update.
And if the bulk of your marketing spend while penalized is spent on undoing past marketing spend (rather than building awareness in other channels outside of search) you can almost guarantee that business is dead.
“If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits.” – Matt Cutts
from SEO Book http://www.seobook.com/penguin-40-update
via KCG Auto Feed
Kes Consulting Group has started a new venture on Amelia Island. Amelia Web Works is now open to serve the web development, design, & SEO needs of the Fernandina area community. We plan on keeping KCG open to our clients across the country, but Amelia Web Works will be deeply involved in the Amelia Island community, as that is where we plan on growing roots for the long term. We’re looking forward to many more years of success.
Google recently made it much harder to receive accurate keyword data from the AdWords keyword tool.
They have not only grouped similar terms, but they have also broadened the data out to absurdly wide ranges like 10,000 to 100,000 searches a month. Only active AdWords advertisers receive (somewhat?) decent keyword data. And even with that, there are limitations. Try to view too many terms and you get:
“You’ve reached the maximum number of page views for this day. This page now shows ranges for search volumes. For a more detailed view, check back in 24 hours.”
Jennifer Slegg shared a quote from an AdWords advertiser who spoke with a representative:
“I have just spoken to a customer service manger from the Australia support help desk. They have advised me that there must be continuous activity in your google ad-words campaign (clicks and campaigns running) for a minimum of 3-4 months continuous in order to gain focused keyword results. If you are seeing a range 10-100 or 100-1k or 1k -10k its likely your adwords account does not have an active campaign or has not had continuous campaigns or clicks.”
So you not only need to be an advertiser, but you need to stay active for a quarter-year to a third of a year to get decent data.
Part of the sales pitch of AdWords/PPC was that you can see performance data right away, whereas SEO investments can take months or years to back out.
But with Google outright hiding keyword data even from active advertisers, it is probably easier and more productive for those advertisers to start elsewhere.
There are many other keyword data providers (Wordtracker, SEMrush, Wordze, Spyfu, KeywordSpy, Keyword Discovery, Moz, Compete.com, SimilarWeb, Xedant, Ubersuggest, KeywordTool.io, etc.) And there are newer entrants like the Keyword Keg Firefox extension & the brilliantly named KeywordShitter.
In light of Google’s push to help make the web more closed-off & further tilt the web away from the interests of searchers toward the interests of big advertisers, we decided to do the opposite & recently upgraded our keyword tool to add the following features…
Our keyword tool lists estimated search volumes, bid prices, cross links to SERPs, etc. Using it does require free account registration, but it is a one-time registration and the tool is free. And we don’t collect phone numbers, hard sell over the phone, etc. We even shut down our paid members area, so you are not likely to receive any marketing messages from us anytime soon.
Export is lightning quick AND, more importantly, we have a panda in our logo!
Here is what the web interface looks like:
And here is a screenshot of data in Excel with the various keyword match types:
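For those who want something similar without leaving a spreadsheet, here is a loose sketch of generating the classic AdWords match-type variants for a keyword list & exporting them to CSV (the column layout is my own example, not necessarily what our export produces):

```python
import csv

def match_type_variants(keyword):
    """Return the classic AdWords match-type formats for a keyword."""
    return {
        "broad": keyword,
        "modified broad": " ".join("+" + word for word in keyword.split()),
        "phrase": f'"{keyword}"',
        "exact": f"[{keyword}]",
    }

keywords = ["seo tools", "keyword research"]  # hypothetical keyword list
with open("keywords.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["keyword", "broad", "modified broad", "phrase", "exact"])
    for kw in keywords:
        v = match_type_variants(kw)
        writer.writerow([kw, v["broad"], v["modified broad"], v["phrase"], v["exact"]])
```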
If the tool looks like it is getting decent usage, we may upgrade it further to refresh the data more frequently, consider adding more languages, add a few more reference links to related niche sites in the footer cross-reference section, and maybe add a few other features.
“Every market has some rules and boundaries that restrict freedom of choice. A market looks free only because we so unconditionally accept its underlying restrictions that we fail to see them.” — Ha-Joon Chang
from SEO Book http://www.seobook.com/keyword-tool-upgrades
via KCG Auto Feed
Brand new research out today reveals that, since Google AdWords removed its right-hand side ads and brought in an occasional fourth paid ad position for ‘highly commercial’ search terms, this fourth ad appears for nearly one-quarter of all search topics.
It’s been an interesting time for search marketers, with a lot of early research indicating various different trends and anomalies for the new look SERP.
The major worry is that paid search advertising will become more competitive and that organic results are getting pushed further and further down the page.
Although one of the surprising developments is that having your ad appear in position 4 may lead to as high a CTR as position 1.
Today’s research however highlights the need for paid search teams to align their strategies with customer intent.
As well as the headline stat, BrightEdge has discovered the following important takeaways you need to be aware of:
That customer intent is everything, and that the ‘micro-moments’ that you will have heard Google recommending you pay attention to, should be right at the top of your search strategy.
As Chris Lake mentions in his post on how to optimise for near me search, Google says micro-moments are the “critical touch points within today’s consumer journey, and when added together, they ultimately determine how that journey ends.”
Or to put it simply, the
These all have three things in common – immediacy, context and intent.
So going back to the BrightEdge research, Google is creating a pay-to-play battleground where the only winners will be the marketers who align their paid search efforts with customer intent.
According to Google, examples of commercial queries include topics such as “hotels in New York City” and “car insurance”. Other examples are “CRM software” and “energy management systems”. Also note that research from Sirius Decisions indicates that 67% of the B2B buyer’s journey is now done online.
That’s not to say there’s no room for organic search marketing for commercial terms…
The key to search marketing is supporting organic efforts with paid advertising, and filling the gaps when on-page SEO and content marketing isn’t enough.
However you must understand that with Google becoming ever savvier about quality, it’s vital you’re creating content that’s trustworthy and relevant.
But as the research points out, “searches with commercial intent on average display a higher number of ads at the top of the page than other searches, click-through-rates are lower for organic search results as compared to those with fewer top-of-the-page ads.”
So again, it’s now much harder for organic results to gain any love on SERPs for commercial search terms.
The key is knowing which commercial terms have organic search results above the fold, so organic and paid search teams can work together in targeting these terms to boost ROI for both paid and organic efforts.
You should also research which pages are currently ranking for these terms and create further webpages that help bolster this presence, by mapping content to exactly what searchers are looking for.
And then for search topics where there are fewer ads displayed, organic search teams should take the lead in creating content that delivers on all points, from relevancy, to quality to user experience, in order to attract and retain customers.
For more on the research, check the full report from BrightEdge.
from Search Engine Watch https://searchenginewatch.com/2016/05/12/google-ad-4-pack-now-shown-for-23-of-all-online-search-topics/
via Auto Feed
When you start an online shop, keyword strategy might seem less important. You are selling products, right? So the product names are your keywords. While that might be true in a few cases, in most cases you need to focus on the keywords that describe the problem you/your products are solving for your customer. Selling sun protection? The problems you’re solving are among others sunburn and skin cancer, so these are your keywords as well. In this post, I’ll give you a practical approach on how to perform keyword research for your webshop.
After you’ve defined your position and found your niche, you must have a pretty good idea of the main keywords for your website. By putting some real effort into positioning your website, you were unconsciously thinking about what we like to call ‘long tail keywords’. You were thinking about how you could refine your product, or better, your product description to match a certain target market.
Let’s first explain the concept of keyword research. Keyword research can be defined as:
The activity you undertake in order to come up with an extensive list of keywords you would like to rank for.
Your keyword strategy follows that definition, as it consists of all the decisions you make based on that keyword research. This is where your search marketing starts: what do you do, explained in the language of your target market. It will help you come up with an extensive list of (long tail) keywords you’d like to rank for.
Keyword research consists of three steps:
Where we say keywords, we also mean keyphrases (more than one keyword in a search query, as in ‘search engine optimization’).
We have gone into this before. The mission of your online shop consists of the ideas you have about your website and your company. For the moment, let’s not focus on whether that mission statement will prove to be genius enough to sell to people. This also largely depends on the market you are in.
Some markets are highly competitive, with large companies dominating the search results. For instance, an online shop with illustrations for children would even have to compete with online giants like Disney. Did you know Disney is an online publisher as well? Blogs like babble.com attract thousands of readers per day. I can assure you that these companies have a bit more budget for marketing (and SEO) than a starting shop like yours might have. Competing in these markets is hard, ranking in these markets is hard. All the more reason to make a good decision on niches and positioning. And keywords, obviously.
Please note that if you decide on a specific long tail keyword, that doesn’t mean you can forget about the competitive keywords altogether. These need to be mentioned, or better need to have a role in your website as well. You can’t optimize for ‘low-cal chocolate cupcake’ without focusing on ‘cupcake’ and ‘chocolate cupcake’ as well.
Try to focus on what benefits you bring to the customer, not on what you are selling from your own point of view. Normally, that will give you keywords they are most likely to use in their Google searches as well. Or, as we say in our Content SEO eBook:
What will these people be looking for? What kind of search terms could they be using while looking for your amazing service or product? Ask yourself these questions and write down as many answers as you possibly can.
Keep your unique selling point, or your customer’s unique buying reasons in mind when drafting your list of keywords. Make sure these keywords fit your website.
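As a quick sketch of what drafting that list can look like in practice (the seed terms & modifiers below are hypothetical examples, not recommendations), you can combine your core product terms with the problems & benefits customers search for to generate long tail candidates:

```python
from itertools import product

# Hypothetical seeds: what you sell, and the problems/benefits customers actually search for.
products = ["sunscreen", "sun protection"]
modifiers = ["for sensitive skin", "for babies", "prevent sunburn", "spf 50 waterproof"]

long_tail = [f"{p} {m}" for p, m in product(products, modifiers)]
for phrase in long_tail:
    print(phrase)  # feed these into a keyword tool to check volume & fit with your site
```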
If you need any help with finding the right keywords, please go read this post by Marieke about keyword tools you can use.
Now that you have found your main keywords, you need to make sure these are represented in the right way on your website. A commonly used way is to create a landing page per keyword.
A landing page is a page where your visitors “land” (arrive) from other sources, such as search engines or social media. So basically it’s a page that’s optimized to evoke a certain reaction from the visitor, such as buying a product or subscribing to a newsletter.
We did an article on this subject, which will help you understand how to set up a proper landing page.
You should create collections of pages that work together in Google. Creating one page containing all the keywords you came up with won’t get you visitors. There’s just too much competition on most keywords. You should create multiple landing pages and embed them in a structure that tells Google how those pages relate to each other. Joost wrote an excellent post on how to achieve that by creating cornerstone content and internal linking.
By the way, feel free to perform an exit-intent survey to ask your visitors what keyword they used to find your site, and if they found sufficient information about the topic. Oxfam uses a form like that:
This will give you great insight into what keywords you still have to improve for.
Want your shop’s pages to rank? Then you should carry out keyword research for your webshop! You can do keyword research in three steps:
To finish things off, create nice collections of pages and give these pages a logical place within your site structure.
So go and determine a keyword strategy for your webshop!
Or pose your question in the comments below.
from Yoast • The Art & Science of Website Optimization https://yoast.com/keyword-research-for-your-webshop/
via KCG Auto Feed
If you are new to SEO it is hard to appreciate how easy SEO was say 6 to 8 years ago.
Almost everything worked quickly, cheaply, and predictably.
Go back a few years earlier and you could rank a site without even looking at it. 😀
Links, links, links.
Back then sharing SEO information acted like a meritocracy. If you had something fantastic to share & it worked great you were rewarded. Sure you gave away some of your competitive advantage by sharing it publicly, but you would get links and mentions and recommendations.
These days most of the best minds in SEO don’t blog often. And some of the authors who frequently publish literally everywhere are a series of ghostwriters.
Further, most of the sharing has shifted to channels like Twitter, where the half-life of the share is maybe a couple hours.
Yet if you share something which causes search engineers to change their relevancy algorithms in response the half-life of that algorithm shift can last years or maybe even decades.
These days breaking in can be much harder. I see some sites that are 3 or 4 months old, with over 1,000 high quality links & clearly deep into 6 figures of investment, which appear to be getting about 80 organic search visitors a month.
From a short enough timeframe it appears nothing works, even if you are using a system which has worked, should work, and is currently working on other existing & trusted projects.
Time delays have an amazing impact on our perceptions and how our reward circuitry is wired.
Most of the types of people who have the confidence and knowledge to invest deep into 6 figures on a brand new project aren’t creating “how to” SEO information and giving it away for free. Doing so would only harm their earnings and lower their competitive advantage.
Most of the info created about SEO today is derivative (people who write about SEO but don’t practice it) or people overstating the risks and claiming x and y and z don’t work, can’t work, and will never work.
And then from there you get the derivative amplifications of don’t, can’t, won’t.
And then there are people who read an old blog post about how things were x years ago and write as though everything is still the same.
If you are using lagging knowledge from derivative “experts” to drive strategy you are most likely going to lose money.
With all the misinformation, how do you find out what works?
You can pay for good advice. But most people don’t want to do that, they’d rather lose. 😉
The other option is to do your own testing. Then when you find out somewhere where conventional wisdom is wrong, invest aggressively.
“To invent you have to experiment, and if you know in advance that it’s going to work, it’s not an experiment. Most large organizations embrace the idea of invention, but are not willing to suffer the string of failed experiments necessary to get there. Outsized returns often come from betting against conventional wisdom, and conventional wisdom is usually right.” – Jeff Bezos
That doesn’t mean you should try to go against consensus view everywhere, but wherever you are investing the most it makes sense to invest in something that is either hard for others to do or something others wouldn’t consider doing. That is how you stand out & differentiate.
But to do your own testing you need to have a number of sites. If you have one site that means everything to you and you get wildly experimental then the first time one of those tests goes astray you’re hosed.
And, even if you do nothing wrong, if you don’t build up a stash of savings you can still get screwed by a false positive. Even having a connection in Google may not be enough to overcome a false positive.
Cutts said, “Oh yeah, I think you’re ensnared in this update. I see a couple weird things. But sit tight, and in a month or two we’ll re-index you and everything will be fine.” Then like an idiot, I made some changes but just waited and waited. I didn’t want to bother him because he’s kind of a famous person to me and I didn’t want to waste his time. At the time Google paid someone to answer his email. Crazy, right? He just got thousands and thousands of messages a day.
I kept waiting. For a year and a half, I waited. The revenues kept trickling down. It was this long terrible process, losing half overnight but then also roughly 3% a month for a year and a half after. It got to the point where we couldn’t pay our bills. That’s when I reached out again to Matt Cutts, “Things never got better.” He was like, “What, really? I’m sorry.” He looked into it and was like, “Oh yeah, it never reversed. It should have. You were accidentally put in the bad pile.”
“How did you go bankrupt?”
“Two ways. Gradually, then suddenly.”
― Ernest Hemingway, The Sun Also Rises
A lot of SEMrush charts look like the following:
What happened there?
Well, obviously that site stopped ranking.
You can’t be certain why without doing some investigation. And even then you can never be 100% certain, because you are dealing with a black box.
That said, there are constant shifts in the algorithms across regions and across time.
Paraphrasing quite a bit here, but in this video Search Quality Senior Strategist at Google Andrey Lipattsev suggested…
He also explained the hole Google has in their Arabic index, with spam being much more effective there due to there being little useful content to index and rank & Google modeling their ranking algorithms largely on publishing strategies in the western world. Fixing many of these holes is also less of a priority because they view evolving with mobile friendly, AMP, etc. as being a higher priority. They algorithmically ignore many localized issues & try to clean up some aspects of that manually. But whoever is winning via the spam stuff at the moment might not only lose to an algorithm update or manual clean up; once Google has something great to rank there, it will eventually win, displacing some of the older spam on a near permanent basis. The new entrant raises the barrier to entry for the lower-quality stuff that was winning via sketchy means.
Over time the relevancy algorithms shift. As new ingredients get added to the algorithms & old ingredients get used in new ways, it doesn’t mean that a site which once ranked will keep on ranking.
In fact, sites which don’t get a constant stream of effort & investment are more likely to slide than have their rankings sustained.
The above SEMrush chart is for a site which uses the following as their header graphic:
When there is literally no competition and the algorithms are weak, something like that can rank.
But if Google looks at how well people respond to what is in the result set, a site as ugly as that is going nowhere fast.
Further, a site like that would struggle to get any quality inbound links or shares.
If nobody reads it then nobody will share it.
The content on the page could be Pulitzer prize level writing and few would take it seriously.
With that design, death is certain in many markets.
The above ugly header design with no taste and a really dumb condescending image is one way to lose. But there are also many other ways.
Excessive keyword repetition like the footer with the phrase repeated 100 times.
Excessive focus on monetization to where most visitors quickly bounce back to the search results to click on a different listing.
Ignoring the growing impact of mobile.
Blowing out the content footprint with pagination and tons of lower quality backfill content.
Stale content full of outdated information and broken links.
A lack of investment in new content creation AND promotion.
Aggressive link anchor text combined with low quality links.
The harder & more expensive Google makes it to enter the search channel the greater incentive there is to spend elsewhere.
Why is Facebook doing so well? In part because Google did the search equivalent to what Yahoo! did with their web portal. The rich diversity in the tail was sacrificed to send users down well worn paths. If Google doesn’t want to rank smaller sites, their associated algorithmic biases mean Facebook and Amazon.com rank better, thus perhaps it makes more sense to play on those platforms & get Google traffic as a free throw-in.
Of course aggregate stats are useless and what really matters is what works for your business. Some may find Snapchat, Instagram, Pinterest or even long forgotten StumbleUpon as solid traffic drivers. Other sites might do well with an email newsletter and exposure on Twitter.
Each bit of exposure (anywhere) leads to further awareness. Which can in turn bleed into aggregate search performance.
People can’t explicitly look for you in a differentiated way unless they are already aware you exist.
Some amount of remarketing can make sense because it helps elevate the perceived status of the site, so long as it is not overdone. However if you are selling a product the customer already bought, or you are marketing to marketers, there is a good chance such investments will be money wasted while you alienate past customers.
Years ago people complained about an SEO site being far too aggressive with ad retargeting. And while surfing today I saw that same site running retargeting ads to where you can’t scroll down the page enough to have their ad disappear before seeing their ad once again.
If you don’t have awareness in channels other than search it is easy to get hit by an algorithm update if you rank in competitive markets, particularly if you managed to do so via some means which is the equivalent of, erm, stuffing the ballot box.
And if you get hit and then immediately run off to do disavows and link removals, and then only market your business in ways that are passively driven & tied to SEO, you’ll likely stay penalized for a long, long time.
While waiting for an update, you may find you are Waiting for Godot.
from SEO Book http://www.seobook.com/reinventing-seo
via KCG Auto Feed
Does organic click-through rate (CTR) data impact page rankings on Google? This has been a huge topic of speculation for years within the search industry.
Why is there such a debate? Well, often people get hung up on details and semantics (are we talking about a direct or indirect ranking factor?), Google patents (which may or may not even be in use), and competing theories (everyone’s got an opinion based off something they heard or read). To make matters more confusing, Google is less than forthcoming about the secrets of their algorithm.
But if CTR truly does impact Google’s organic search rankings, shouldn’t we be able to measure it? Yes!
In this post, I’ll share some intriguing data on the relationship between Google CTR and rankings. I’ll also share four tips for making sure your Google click-through rates on the organic SERPs are where they need to be.
To be clear: my goal with this post is to provide just a brief background and some actionable insights about the topic of organic click-through rates on Google. We won’t dissect every tweet or quote ever made by anyone at Google, dive deep into patents, or refute all the SEO theories about whether CTR is or isn’t a ranking factor. I’m sharing my own theory based on what I’ve seen, and my recommendations on how to act on it.
Eric Enge of Stone Temple Consulting recently published a post with a headline stating that CTR isn’t a ranking factor. He clarifies within that post that Google doesn’t use CTR as a direct ranking factor.
What’s the difference between a direct and indirect ranking factor? Well, I suggest you watch Rand Fishkin’s awesome video on this very topic.
Basically, we know certain things directly impact rankings (I got a link from a reputable website, hooray!), but there are many other things that don’t have a direct impact, but nevertheless do impact ranking (some big-time influencer tweeted about my company and now tons of people are searching for us and checking out our site, awesome!).
It’s essentially the same issue as last touch attribution, which assigns all the credit to the last interaction. But in reality, multiple channels (PPC, organic, social, email, affiliates, etc.) can play important roles in the path to conversion.
The same is true with ranking. Many factors influence ranking.
So here’s my response: Direct, indirect, who cares? CTR might not be a “direct core ranking signal,” but if it impacts rank (and I believe it does), then it matters. Further, even if it doesn’t impact rank, you should still care!
But don’t take my word for it that Google has the technology. Check out these slides from Google engineer Paul Haahr, who spoke at SMX:
Also, AJ Kohn put together a good post about Google click-through rate as a ranking signal last year. He included a couple eye-opening quotes that I’ll share here because they are important. The first from Edmond Lau, a former Google engineer:
“It’s pretty clear that any reasonable search engine would use click data on their own results to feed back into ranking to improve the quality of search results. Infrequently clicked results should drop toward the bottom because they’re less relevant, and frequently clicked results bubble toward the top. Building a feedback loop is a fairly obvious step forward in quality for both search and recommendations systems, and a smart search engine would incorporate the data.”
The second from Marissa Mayer in 2007 talking about how Google used CTR as a way to determine when to display a OneBox:
“We hold them to a very high click-through rate expectation and if they don’t meet that click-through rate, the OneBox gets turned off on that particular query. We have an automated system that looks at click-through rates per OneBox presentation per query. So it might be that news is performing really well on Bush today but it’s not performing very well on another term, it ultimately gets turned off due to lack of click-through rates. We are authorizing it in a way that’s scalable and does a pretty good job enforcing relevance.”
Also, check out this amazing excerpt from an FTC document that was obtained by the WSJ:
“In addition, click data (the website links on which a user actually clicks) is important for evaluating the quality of the search results page. As Google’s former chief of search quality Udi Manber testified:
‘The ranking itself is affected by the click data. If we discover that, for a particular query, hypothetically, 80 percent of people click on Result No. 2 and only 10 percent click on Result No. 1, after a while we figure out, well, probably Result 2 is the one people want. So we’ll switch it.’
Testimony from Sergey Brin and Eric Schmidt confirms that click data is important for many purposes, including, most importantly, providing ‘feedback’ on whether Google’s search algorithms are offering its users high quality results.”
If you have great positions in the SERPs, that’s awesome. But even high rankings don’t guarantee visits to your site.
What really matters is how many people are clicking on your listing (and not bouncing back immediately). You want to attract more visitors who are likely to stick around and then convert.
In 2009, the head of Google’s webspam team at the time, Matt Cutts, was asked about the importance of maximizing your organic CTR. Here’s a key quote that says it all:
“It doesn’t really matter how often you show up. It matters how often you get clicked on and then how often you … convert those to whatever you really want (sales, purchases, subscriptions)… Do spend some time looking at your title, your URL, and your snippet that Google generates, and see if you can find ways to improve that and make it better for users because then they’re more likely to click. You’ll get more visitors, you’ll get better return on your investment.”
In another video, he talked about the importance of titles, especially on your important web pages: “you want to make something that people will actually click on when they see it in the search results – something that lets them know you’re gonna have the answer they’re looking for.”
Bottom line: Google cares a lot about overall user engagement with the results they show in the SERPs. So if Google is testing your page for relevancy to a particular keyword search, and you want that test to go your way, you better have a great CTR (and great content and great task completion rates). Otherwise, you’ll fail the quality test and someone else will get chosen.
Rand Fishkin conducted one of the most popular tests of the influence of CTR on Google’s search results. He asked people to do a specific search and click on the link to his blog (which was in 7th position). This impacted the rankings for a short period of time, moving the post up to 1st position.
But these are all temporary changes. The rankings don’t persist because the inflated CTRs aren’t natural.
It’s like how you can’t increase your AdWords Quality Scores simply by clicking on your own ads a few times. This is the oldest trick in the book and it doesn’t work. (Sorry.)
The results of another experiment appeared on Search Engine Land last August and concluded that CTR isn’t a ranking factor. But this test had a pretty significant flaw – it relied on bots artificially inflating CTRs and search volume (and this test was only for a single two-word keyword: “negative SEO”). So essentially, this test was the organic search equivalent of click fraud.
I’ve seen a lot of people saying Google will never use CTR in organic rankings because “it’s too easy to game” or “too easy to fake.” I disagree. Google AdWords has been fighting click fraud for 15 years and they can easily apply these learnings to organic search. There are plenty of ways to detect unnatural clicking. What did I just say about old tricks?
Before we look at the data, a final “disclaimer.” I don’t know if what this data reveals is due to RankBrain, or another machine-learning-based ranking signal that’s already part of the core Google algorithm. Regardless, there’s something here – and I can most certainly say with confidence that CTR is impacting rank.
Google has said that RankBrain is being tested on long-tail terms, which makes sense. Google wants to start testing its machine-learning system with searches they have little to no data on – and 99% of pages have zero external links pointing to them.
How is Google able to tell which pages should rank in these cases?
By examining engagement and relevance. CTR is one of the best indicators of both.
High-volume head terms, as far as we know, aren’t being exposed to RankBrain right now. So by observing the differences between the organic search CTRs of long-tail terms versus head terms, we should be able to spot the difference:
So here’s what we did: We looked at 1,000 keywords in the same keyword niche (to isolate external factors like Google shopping and other SERP features that can alter CTR characteristics). The keywords are all from my own website: wordstream.com.
I compared CTR versus rank for one- or two-word search terms, and did the same thing for long-tail keywords (search terms between 4 to 10 words).
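Here is a minimal sketch of that comparison, assuming you have per-query position & CTR data (for example exported from Google Search Console; the field names & the 4-word long-tail cutoff are illustrative):

```python
from collections import defaultdict

# rows: [{"query": str, "position": float, "ctr": float}, ...] -- e.g. exported from Search Console
def avg_ctr_by_position(rows, long_tail_min_words=4):
    """Average CTR per rounded position, split into head vs long-tail queries."""
    buckets = defaultdict(lambda: {"head": [], "long_tail": []})
    for r in rows:
        kind = "long_tail" if len(r["query"].split()) >= long_tail_min_words else "head"
        buckets[round(r["position"])][kind].append(r["ctr"])
    return {pos: {k: (sum(v) / len(v) if v else None) for k, v in kinds.items()}
            for pos, kinds in sorted(buckets.items())}

# Hypothetical usage: print the head vs long-tail CTR curve by position.
# for pos, ctrs in avg_ctr_by_position(rows).items():
#     print(pos, ctrs)
```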
Notice how the long-tail terms get much higher average CTRs for a given position. For example, in this data set, the head term in position 1 got an average CTR of 17.5%, whereas the long-tail term in position 1 had a remarkably high CTR, at an average of 33%.
You’re probably thinking: “Well, that makes sense. You’d expect long-tail terms to have stronger query intent, thus higher CTRs.” That’s true, actually.
But why is it that long-tail keyword terms with high CTRs are so much more likely to be in top positions versus bottom-of-page organic positions? That’s a little weird, right?
OK, let’s do an analysis of paid search queries in the same niche. We use organic search to come up with paid search keyword ideas and vice versa, so we’re looking at the same keywords in many cases.
Long-tail terms in this same vertical get higher CTRs than head terms. However, the difference between long-tail and head term CTR is very small in positions 1–2, and becomes huge as you go out to lower positions.
So in summary, something unusual is happening:
Why are the same keywords behaving so differently in organic versus paid?
The difference (we think) is that pages with higher organic click-through rates are getting a search ranking boost.
CTR and ranking are codependent variables. There’s obviously a relationship between the two, but which is causing what? In order to get to the bottom of this “chicken versus egg” situation, we’re going to have to do a bit more analysis.
The following graph takes the difference between an observed organic search CTR and the expected CTR, to figure out if your page is beating — or being beaten by — the expected average CTR for a given organic position.
By only looking at the extent by which a keyword beats or is beaten by the predicted CTR, you are essentially isolating the natural relationship between CTR and ranking in order to get a better picture of what’s going on.
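Here is a minimal sketch of that “observed minus expected” calculation (the expected-CTR curve below is a made-up placeholder; in practice you would fit it from your own ranking data):

```python
# Placeholder expected organic CTR by position -- fit this curve from your own data.
EXPECTED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

def ctr_delta(observed_ctr, position):
    """How far a page beats (positive) or trails (negative) the expected CTR for its position."""
    return observed_ctr - EXPECTED_CTR.get(round(position), 0.02)

# Hypothetical usage: a page in position 4 with a 12% CTR beats expectations by 5 points.
print(f"{ctr_delta(0.12, 4):+.1%}")
```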
We found that, on average, if you beat the expected CTR, then you’re far more likely to rank in more prominent positions. Failing to beat the expected CTR makes it more likely you’ll appear in positions 6–10.
So, based on our example of long-tail search terms for this niche, if a page:
And so on.
Here’s a greatly simplified rule of thumb:
The more your pages beat the expected organic CTR for a given position, the more likely you are to appear in prominent organic positions.
If your pages fall below the expected organic Google search CTR, then you’ll find your pages in lower organic positions on the SERP.
Want to move up by one position in Google’s rankings? Increase your CTR by 3%. Want to move up another spot? Increase your CTR by another 3%.
If you can’t beat the expected click-through rate for a given position, you’re unlikely to appear in positions 1–5.
Essentially, you can think of all of this as though Google is giving bonus points to pages that have high click-through rates. The fact that it looks punitive is just a natural side effect.
If Google gives “high CTR bonus points” to other websites, then your relative performance will decline. It’s not that you got penalized; it’s just that you didn’t get the rewards.
Many “expert” SEOs will tell you not to waste time trying to maximize your CTRs since it’s supposedly “not a direct ranking signal.” “Let’s build more links and make more infographics,” they say.
I couldn’t disagree more. If you want to rank better, you need to get more people to your website. (And getting people to your website is the whole point of ranking anyway!)
AdWords and many other technologies look at user engagement signals to determine page quality and relevance. We’ve already seen evidence that CTR is important to Google.
So how do you raise your Google CTRs – not just for a few days, but in a sustained way? You should focus your efforts in four key areas:
You want to make sure your pages get as many organic search clicks as possible. Doing so means more people are visiting your site, which will send important signals to Google that your page is relevant and awesome.
Our research also shows that above-expected user engagement metrics result in better organic rankings, which results in even more clicks to your site.
Don’t settle for average CTRs. Be a unicorn in a sea of donkeys! Raise your CTRs and engagement rates! Get optimizing now!
This article was originally published on the WordStream blog, reprinted with permission.
from Search Engine Watch https://searchenginewatch.com/2016/05/12/why-you-need-to-raise-organic-ctrs-and-how-to-do-it/
via Auto Feed
Google Ventures Partner Blake Byers joined LendUp’s board of directors with his firm’s investment. The investor said he expects LendUp to make short-term lending reasonable and favorable for the “80 million people banks won’t give credit cards to,” and help reshape what had been “a pretty terrible industry.”
What sort of strategy is helping to drive that industry transformation?
How about doorway pages.
These sorts of doorway pages are still live to this day. Simply look at the footer area of lendup.com/payday-loans
This in spite of Google going out of their way last year to say they were going to kill those sorts of strategies.
March 16, 2015 Google To Launch New Doorway Page Penalty Algorithm
Google does not want to rank doorway pages in their search results. The purpose behind many of these doorway pages is to maximize their search footprint by creating pages both externally on the web or internally on their existing web site, with the goal of ranking multiple pages in the search results, all leading to the same destination.
Today we get journalists acting as conduits for Google’s public relations efforts, writing headlines like: Google: Payday Loans Are Too Harmful to Advertise.
Today those sorts of stories are literally everywhere.
Tomorrow the story will be over.
And when it is.
Precisely zero journalists will have covered the above contrasting behaviors.
As they weren’t in the press release.
Best yet, not only does Google maintain their investment in payday loans via LendUp, but there is also a bubble in the personal loans space, so Google will be able to show effectively the same ads for effectively the same service & by the time the P2P loan bubble pops some of the payday lenders will have followed LendUp’s lead in re-branding their offers as being something else in name.
Meanwhile, off to revolutionize the next industry by claiming everyone else is greedy and scummy and there is a wholesome way to do the same thing leveraging new technology, when in reality the primary difference between the business models is simply a thin veneer of tech utopian PR misinformation.
Don’t expect to see a link to this blog post on TechCrunch.
There you’ll read some hard-hitting cutting edge tech news like:
Banks are so greedy that LendUp can undercut them, help people avoid debt, and still make a profit on its payday loans and credit card.
from SEO Book http://www.seobook.com/google-rethinking-payday-loans
via KCG Auto Feed