Rank Checker Update

Recently rank checker started hanging on some search queries & the button on the SEO Toolbar which launched rank checker stopped working. Both of these issues should now be fixed if you update your Firefox extensions.

If the toolbar button ever stops working, you can enable the Menu bar in Firefox, then open rank checker from the Tools menu.

Years ago we created a new logo for rank checker, and we finally got around to rolling it out today. :)

from SEO Book http://www.seobook.com/rank-checker-update

DMOZ Shut Down

Last August I wrote a blog post about how attention merchants were sucking the value out of online publishing. In it I noted how the Yahoo! Directory disappeared & how even DMOZ saw a sharp drop in traffic & rankings over the past few years.

The concept of a neutral web is dead. In its place is agenda-driven media.

  • Politically charged misinformed snippets.
  • Ads cloaked as content.
  • Public relations propaganda.
  • Mostly correct (but politically insensitive) articles being “fact checked” where a minor detail is disputed to label the entire piece as not credible.

As the tech oligarchs broadly defund publishing, the publishers still need to eat. Aggregate information quality declines to make the numbers work. Companies which see their ad revenues slide 20%, 30% or 40% year after year can’t justify maintaining the labor-intensive yet unmonetized side projects.

There is Wikipedia, but it is not without bias, & beyond the value embedded in that hidden bias most of the remaining value from it flows through to the attention merchant / audience aggregation / content scraper platforms.

Last month DMOZ announced they were closing on March 14th without much fanfare. And on March 17th the directory went offline.

A number of people have pushed to preserve & archive the DMOZ data. Some existing DMOZ editors are planning on launching a new directory under a different name but as of the 17th DMOZ editors put up a copy at dmoztools.net. Jim Boykin scraped DMOZ & uploaded a copy here. A couple other versions of DMOZ have been published at OpenDirectoryProject.org & Freemoz.org.

DMOZ was not without criticism or controversy,

Although site policies suggest that an individual site should be submitted to only one category, as of October 2007, Topix.com, a news aggregation site operated by DMOZ founder Rich Skrenta, has more than 17,000 listings.

Early in the history of DMOZ, its staff gave representatives of selected companies, such as Rolling Stone or CNN, editing access in order to list individual pages from their websites. Links to individual CNN articles were added until 2004, but were entirely removed from the directory in January 2008 due to the content being outdated and not considered worth the effort to maintain.

but by-and-large it added value to the structure of the web.

As search has advanced (algorithmic evolution, economic power, influence over publishers, enhanced bundling of distribution & user tracking) general web directories haven’t been able to keep pace. Ultimately the web is a web of links & pages rather than a web of sites. Many great sites span multiple categories. Every large quality site has some misinformation on it. Every well-known interactive site has some great user contributions & user generated spam on it. Search engines have better signals about what pages are important & which pages have maintained importance over time. As search engines have improved link filtering algorithms & better incorporated user tracking in rankings, broad-based manual web directories had no chance.
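
To make the “web of pages” point concrete, here is a toy sketch of page-level link scoring in the PageRank style: each page earns importance from the pages linking to it, so two pages on the same site can score very differently, which a site-level directory listing can never capture. This is only an illustration of the general idea, not Google’s production algorithm, and the example URLs are made up.

```python
# Toy page-level importance scoring (PageRank-style power iteration).
# Illustrative only: not Google's actual algorithm.

def page_scores(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    scores = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * scores[page] / len(targets)
                for target in targets:
                    new[target] += share
            else:  # dangling page: spread its score evenly
                for p in pages:
                    new[p] += damping * scores[page] / n
        scores = new
    return scores

# A strong guide on a big site outscores a thin page on the same site:
graph = {
    "bigsite.com/guide": ["niche.com/post"],
    "niche.com/post": ["bigsite.com/guide"],
    "bigsite.com/thin-page": ["bigsite.com/guide"],
}
for page, score in sorted(page_scores(graph).items(), key=lambda x: -x[1]):
    print(f"{page}: {score:.3f}")
```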

The web of pages vs web of sites concept can be easily observed in how some of the early successful content platforms have broken down their broad-based content portals into a variety of niche sites.

When links were (roughly) all that mattered, leveraging a website’s link authority meant it was far more profitable for a large entity to keep publishing more content on the one main site. That is how eHow became the core of a multi-billion Dollar company.

Demand Media showed other publishers the way. And if the other existing sites were to stay competitive, they also had to water down content quality to make the numbers back out. The problem was that the resulting glut of content drove ad rates lower. And the decline in ad rates was coupled with a shift away from a links-only view of search relevancy to a model weighting link profiles against user engagement metrics.

Websites with lots of links, lots of thin content & terrible engagement metrics were hit.

Kristen Moore, VP of marketing for Demand Media, explained what drove the most egregious aspects of eHow’s editorial strategy: “There’s some not very bright people out there.”

eHow improved their site design, drastically reduced their ad density, removed millions of articles from their site, and waited. However nothing they did on that domain name was ever going to work. They dug too deep of a hole selling the growth story to pump a multi-billion Dollar valuation. And they generated so much animosity from journalists who felt overworked & underpaid that even when they did rank, journalists would typically prefer to link to anything but them.

The flip side of that story is the newspaper chains, which rushed to partner with Demand Media to build eHow-inspired sections on their sites.

Brands which enjoy the Google brand subsidy are also quite hip to work with Demand Media, which breathes new life into once retired content: “Sometimes Demand will even dust off old content that’s been published but is no longer live and repurpose it for a brand.”

As Facebook & Google grew more dominant in the online ad ecosystem they aggressively moved to suck in publisher content & shift advertiser spend onto their core properties. The rise of time spent on social sites only made it harder for websites to be sought-out destinations. Google also effectively cut off direct distribution by consolidating & de-monetizing the RSS reader space, then shutting down a project they easily could have let run.

As the web got more competitive, deeply specialized bloggers & niche publications were able to steal market share in key verticals by leveraging a differentiated editorial opinion.

Even if they couldn’t necessarily afford to build strong brands via advertising, they were worthy of a follow on some social media channels & perhaps an email subscription. And the best niche editorial remains worthy of a direct visit:

Everything about Techmeme and its lingering success seems to defy the contemporary wisdom of building a popular website. It publishes zero original reporting and is not a social network. It doesn’t have a mobile app or a newsletter or even much of a social presence beyond its Twitter account, which posts dry commodity news with zero flair for clickability.

As a workaround to the Panda hits, sites like eHow are now becoming collections of niche-focused sites (Cuteness.com, Techwalla.com, Sapling.com, Leaf.tv, etc. will join Livestrong.com & eHow.com). It appears to be working so far…

…but they may only be 1 Panda update away from finding out the new model isn’t sustainable either.

About.com has done the same thing (TheSpruce.com, Verywell.com, Lifewire.com, TheBalance.com). Hundreds of millions of Dollars are riding on the hope that as the algorithms keep getting more granular they won’t discover moving the content to niche brands wasn’t enough.

As content moves around, search engines with billions of Dollars in revenue can recalibrate rankings for each page & adjust rankings based on user experience. Did an influential “how to” guide become irrelevant after a software or hardware update? If so, they can see it didn’t solve the user’s problem and rank a more recent document which reflects the current software or hardware. Is a problem easy to solve with a short snippet of content? If so, that can get scraped into the search results.
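
As a thumbnail of that recalibration, here is a toy re-ranking sketch which blends a slow-moving link score with a recent engagement signal, so a stale guide slips below a fresher document. The metric names and weights are hypothetical illustrations, not Google’s actual signals.

```python
# Toy illustration: blend a slow-moving link score with a recent
# engagement signal so a stale guide slips below a fresher document.
# The weights and metric names are hypothetical, not Google's signals.

def blended_score(link_score, engagement, weight=0.5):
    """Inputs normalized to 0..1; weight trades off links vs engagement."""
    return weight * link_score + (1 - weight) * engagement

docs = [
    # (name, link score, recent task-completion rate)
    ("old guide (pre software update)", 0.9, 0.2),   # well linked, now fails users
    ("new guide (post software update)", 0.4, 0.9),  # few links, solves the problem
]
for name, links, engagement in sorted(
        docs, key=lambda d: blended_score(d[1], d[2]), reverse=True):
    print(name, round(blended_score(links, engagement), 2))
```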

Web directories which are built around sites rather than pages have no chance of competing against the billions of Dollars of monthly search ads & the full cycle user tracking search companies like Google & Bing can do with their integrated search engines, ad networks, web browsers & operating systems.

Arguably in most cases the idea of neutral-based publishing no longer works on the modern web. The shill gets exclusive stories. The political polemic gets automatic retweets from those who identify. The content which lacks agenda probably lacks the economics to pay for ads & buy distribution unless people can tell the creator loves what they do so much it influences them enough to repeatedly visit & perhaps pay for access.

from SEO Book http://www.seobook.com/dmoz-shut-down

New gTLDs are Like Used Cars

There may be a couple exceptions which prove the rule, but new TLDs are generally an awful investment for everyone except the registry operator.

Here is the short version: new TLDs are junk.

And the long version…

Diminishing Returns

About a half-decade ago I wrote about how Google devalued domain names from an SEO perspective. Since then a number of leading “category killer” domains have repeatedly been recycled from startup to acquisition to shutdown to PPC park page to “buy now for this once in a lifetime opportunity” in an endless water cycle.

The central web platforms are becoming ad heavy, which in turn decreases the reach of anything which is not an advertisement. For the most valuable concepts / markets / keywords ads eat up the entire interface for the first screen full of results. Key markets like hotels might get a second round of vertical ads to further displace the concept of organic results.

Proprietary, Closed-Ecosystem Roach Motels

The tech monopolies can only make so much money by stuffing ads onto their own platform. To keep increasing their take they need to increase the types, varieties & formats of media they host and control & keep the attention on their platform.

Both Google & Facebook are promoting scams where they feed on desperate publishers, sucking a copy of the publisher’s content into being hosted by the tech monopoly platform du jour & sprinkling a share of the revenues back to the content sources.

They may even pay a bit upfront for new content formats, but then after the market is primed the deal shifts to where (once again) almost nobody other than the tech monopoly platform wins.

The attempt to “own” the web & never let users go is so extreme both companies will make up bogus statistics to promote their proprietary / fake open / actually closed standards.

If you ignore how Google’s AMP double, triple, or quadruple counts visitors in Google Analytics, the visit numbers look appealing.

But the flip side of those fake metrics is actual revenues do not flow.

Facebook has the same sort of issues, frequently needing to restate various metrics while partners fly blind.

These companies are restructuring society & the race to the bottom to try to make the numbers work in an increasingly unstable & parasitic set of platform choices is destroying adjacent markets:

Have you tried Angry Birds lately? It’s a swamp of dark patterns. All extractive logic meant to trick you into another in-app payment. It’s the perfect example of what happens when product managers have to squeeze ever-more-growth out of ever-less-fertile lands to hit their targets year after year. … back to the incentives. It’s not just those infused by venture capital timelines and return requirements, but also the likes of tax incentives favoring capital gains over income. … that’s the truly insidious part of the tech lords solution to everything. This fantasy that they will be greeted as liberators. When the new boss is really a lot like the old boss, except the big stick is replaced with the big algorithm. Depersonalizing all punishment but doling it out just the same. … this new world order is being driven by a tiny cabal of monopolies. So commercial dissent is near impossible. … competition is for the little people. Pitting one individual contractor against another in a race to the bottom. Hoarding all the bargaining power at the top. Disparaging any attempts against those at the bottom to organize with unions or otherwise.

To be a success on the attention platforms you have to push toward the edges. But as you become successful you become a target.

And the dehumanized “algorithm” is not above politics & public relations.

Pewdiepie is the biggest success story on the YouTube platform. When he made a video showing some of the absurd aspects of Fiverr it led to a WSJ investigation which “uncovered” a pattern of anti-Semitism. And yet one of the reporters who worked on that story wrote far more offensive and anti-Semitic tweets. The hypocrisy of the hit job didn’t matter. They still were able to go after Pewdiepie’s ad relationships to cut him off from Disney’s Maker Studios & the premium tier of YouTube ads.

The fact that he is an individual with broad reach means he’ll still be fine economically, but many other publishers would quickly end up in a death spiral from the above sequence.

If it can happen to a leading player in a closed ecosystem then the risk to smaller players is even greater.

In some emerging markets Facebook effectively *is* the Internet.

The Decline of Exact Match Domains

Domains have been so devalued (from an SEO perspective) that some names like PaydayLoans.net sell for about $3,000 at auction.

$3,000 can sound like a lot to someone with no money, but names like that were going for 6 figures at their peak.

Professional domain sellers participate in the domain auctions on sites like NameJet & SnapNames. Big keywords like [payday loans] in core trusted extensions are not missed. So if the 98% decline in price were an anomaly, at least one of them would have bid more in that auction.

Why did exact match domains fall so hard? In part because Google shifted from scoring the web based on links to considering things like brand awareness in rankings. And it is very hard to run a large brand-oriented ad campaign promoting a generically descriptive domain name. Sure there are a few exceptions like Cars.com & Hotels.com, but if you watch much TV you’ll see a lot more ads associated with businesses that are not built on generically descriptive domain names.

Not all domains have fallen quite that hard in price, but the more into the tail you go the less the domain acts as a memorable differentiator. If the barrier to entry increases, then the justification for spending a lot on a domain name as part of a go to market strategy makes less sense.

Brandable Names Also Lose Value

Arguably EMDs have lost more value than brandable domain names, but even brandable names have sharply slid.

If you go back a decade or two tech startups would secure their name (say Snap.com or Monster.com or such) & then try to build a business on it.

But in the current marketplace, with many paths to market, some startups don’t even have a domain name at launch; they begin as iPhone or Android apps.

Now people try to create success on a good enough but cheap domain name & then, as success comes, they buy a better domain name.

Jelly was recently acquired by Pinterest. Rather than buying jelly.com they were still using AskJelly.com for their core site & Jelly.co for their blog.

As long as domain redirects work, there’s no reason to spend heavily on a domain name for a highly speculative new project.

Rather than spending 6 figures on a domain name & then seeing if there is market fit, it is far more common to launch a site on something like getapp.com, joinapp.com, app.io, app.co, businessnameapp.com, etc.

This in turn means that rather than 10,000s of startups all chasing their core .com domain name off the start, people test whatever is good enough & priced close to $10. Then only after they are successful do they try to upgrade to better, more memorable & far more expensive domain names.

Money isn’t spent on the domain names until the project has already shown market fit.

One in a thousand startups spending $1 million adds up to far less than one in three startups spending $100,000.
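
Spelling that arithmetic out, with the illustrative figures from this section rather than actual market data:

```python
# Expected domain spend per startup under each model, using the
# illustrative figures above (not actual market data).
old_model = (1 / 3) * 100_000        # 1 in 3 buys a ~$100k name upfront
new_model = (1 / 1_000) * 1_000_000  # 1 in 1,000 later upgrades to a $1M name
print(f"old model: ${old_model:,.0f} per startup")  # ~$33,333
print(f"new model: ${new_model:,.0f} per startup")  # $1,000
```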

New TLDs Undifferentiated, Risky & Overpriced

No Actual Marketing Being Done

Some of the companies which are registries for new TLDs talk up investing in marketing & differentiation for the new TLDs, but very few of them are doing much on the marketing front.

You may see their banner ads on domainer blogs & they may even pay for placement with some of the registrars, but there isn’t much going on in terms of cultivating a stable ecosystem.

When Google or Facebook try to enter & dominate a new vertical, the end destination may be extractive rent seeking by a monopoly BUT off the start they are at least willing to shoulder some of the risk & cost upfront to try to build awareness.

Where are the domain registries who have built successful new businesses on some of their new TLDs? Where are the subsidies offered to key talent to help drive awareness & promote the new strings?

As far as I know, none of that stuff exists.

In fact, what is prevalent is the exact opposite.

Greed-Based Anti-Marketing

So many of them are short-sighted greed-based plays that they do the exact opposite of building an ecosystem … they hold back any domain which might potentially not be complete garbage so they can juice it for a premium ask price in the 10s of thousands of dollars.

While searching on GoDaddy Auctions for a client project I have seen new TLDs like .link listed for sale for MORE THAN the asking price of similar .org names.

If those prices had any sort of legitimate foundation then the person asking $30,000 for a .link would have bulk bought all the equivalent .net and .org names which are listed for cheaper prices.

But the prices are based on fantasy & almost nobody is dumb enough to pay those sorts of prices.

Anyone dumb enough to pay that would be better off buying their own registry rather than a single name.

The holding back of names is the exact opposite of savvy marketing investment. It means there’s no reason to use the new TLD if you either have to pay through the nose or use a really crappy name nobody will remember.

I didn’t buy more than 15 of Uniregistry’s domains because all names were reserved in the first place and I didn’t feel like buying 2nd tier domains … Domainers were angry when the first 2 Uniregistry’s New gTLDs (.sexy and .tattoo) came out and all remotely good names were reserved despite Frank saying that Uniregistry would not reserve any domains.

Who defeats the race to the bottom aspects of the web by starting off from a “we only sell shit” standpoint?

Nobody.

And that’s why these new TLDs are a zero.

Defaults Have Value

Many online verticals are driven by winner take most monopoly economics. There’s a clear dominant leader in each of these core markets: social, search, short-form video, long-form video, retail, auctions, real estate, job search, classifieds, etc. Some other core markets have consolidated down to 3 or 4 core players who among them own about 50 different brands that attack different parts of the market.

Almost all the category leading businesses which dominate aggregate usage are on .com domains.

Contrast the lack of marketing for new TLDs with all the marketing one sees for the .com domain name.

Local country code domain names & .com are not going anywhere. And both .org and .net are widely used & unlikely to face extreme price increases.

Hosing The Masses…

A decade ago domainers were frustrated Verisign increased the price of .com domains in ~ 5% increments:

Every mom, every pop, every company that holds a domain name had no say in the matter. ICANN basically said to Verisign: “We agree to let you hose the masses if you stop suing us”.

I don’t necessarily mind paying more for domains so much as I mind the money going to a monopolistic regulator which has historically had little regard for the registrants/registrars it should be serving

Those 5% or 10% shifts were considered “hosing the masses.”

Imagine what sort of blowback PIR would get from influential charities if they tried to increase the price of .org domains 30-fold overnight. It would be such a public relations disaster it would never be considered.

Domain registries are not particularly expensive to run. A person who has a number of them can run each of them for less than the cost of a full time employee – say $25,000 to $50,000 per year.

And yet, the very people who complained about Verisign’s benign price increases, monopolistic abuses & rent extraction are now pushing massive price hikes:

.Hosting and .juegos are going up from about $10-$20 retail to about $300. Other domains will also see price increases.

Here’s the thing with new TLD pricing: registry operators can increase prices as much as they want with just six months’ notice.

in its applications, Uniregistry said it planned to enter into a contractual agreement to not increase its prices for five years.

Why would anyone want to build a commercial enterprise (or anything they care about) on such a shoddy foundation?

If a person promises…

  • no hold backs of premium domains, then reserves 10s of thousands of domains
  • no price hikes for 5 years, then hikes prices
  • the eventual price hikes being inline with inflation, then hikes prices 3,000%

That’s 3 strikes and the batter is out.

Doing the Math

The claim that the new TLDs need more revenues to exist is untrue. Running an extension costs maybe $50,000 per year. If a registry operator wanted to build a vibrant & stable ecosystem the first step would be dumping the concept of premium domains to encourage wide usage & adoption.
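
A back-of-the-envelope version of that math (the $10 yearly fee is an assumption for illustration, roughly in line with the entry pricing mentioned earlier, not a quoted wholesale rate):

```python
# Back-of-the-envelope registry economics using the figures above.
annual_cost = 50_000  # rough upper-end yearly cost to run an extension
annual_fee = 10       # assumed per-domain yearly fee (illustrative)

print(f"registrations needed to cover costs: {annual_cost / annual_fee:,.0f}")
# -> 5,000: wide adoption at low prices covers costs more durably than
# hoping a single $30,000 "premium" ask ever finds a buyer.
```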

There are hundreds of these new TLD extensions and almost none of them can be trusted to be a wise investment when compared against similar names in established extensions like .com, .net, .org & ccTLDs like .co.uk or .fr.

There’s no renewal price protection on the new TLDs & no need to take on that risk, especially as prices on the core TLDs have come down sharply.

Domain Pricing Trends

Aggregate stats are somewhat hard to come by as many deals are not reported publicly & many sites which aggregate sales data also list minimum prices.

However, domains have lost value for many reasons:

  • declining SEO-related value due to the search results becoming over-run with ads (Google keeps increasing their ad clicks 20% to 30% year over year)
  • broad market consolidation in key markets like travel, ecommerce, search & social
    • Google & Facebook are eating OVER 100% of online advertising growth – the rest of the industry is shrinking in aggregate
    • are there any major news sites which haven’t struggled to monetize mobile?
    • there is a reason there are few great indy blogs compared to a decade ago
  • rising technical costs in implementing independent websites (responsive design, HTTPS, AMP, etc.) “Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don’t like standardization … it looks like rent seeking behaviors on top of friction” – Gabe Newell
  • harder to break into markets with brand-biased relevancy algorithms (increased chunk size of competition)
  • less value in trying to build a brand on a generic name, which struggles to rank in a landscape of brand-biased algorithms (inability to differentiate while being generically descriptive)
  • decline in PPC park page ad revenues
    • for many years Yahoo! hid the deterioration in their core business by relying heavily on partners for ad click volumes, but after they switched to leveraging Bing search, Microsoft was far more interested in click quality vs click quantity
    • absent the competitive bid from Yahoo!, Google drastically reduced partner payouts
    • most web browsers have replaced web address bars with dual function search boxes, drastically reducing direct navigation traffic

All the above are the mechanics of “why” prices have been dropping, but it is also worth noting many of the leading portfolios have been sold.

If the domain aftermarket is as vibrant as some people claim, there’s no way the Marchex portfolio of 200,000+ domains would have sold for only $28.1 million a couple years ago.

RegistrarStats shows .com registrations have stopped growing & other extensions like .net, .org, .biz & .info are now shrinking.

Both aftermarket domain prices & the pool of registered domains on established gTLDs are dropping.

I know I’ve dropped hundreds & hundreds of domains over the past year. That might be due to my cynical views of the market, but I did hold many names for a decade or more.

As the barrier to entry increases, many of the legacy domains which could have one day been worth developing have lost much of their value.

And the picked over new TLDs are an even worse investment due to the near infinite downside potential of price hikes, registries outright folding, etc.

In the face of this declining value there is a rush of oversupply WITH irrational above-market pricing. And then the registries which spend next to nothing on marketing can’t understand why their great new namespaces went nowhere.

As much as I cringe at .biz & .info, I’d prefer either of them over just about any new TLD.

Any baggage they may carry is less than the risk of going with an unproven new extension without any protections whatsoever.

Losing Faith in the Zimbabwe Dollar

The people who really lose are those who read what these domain registry operators wrote & trusted them.

Uniregistry does not believe that registry fees should rise when the costs of other technology services have uniformly trended downward, simply because a registry operator believes it can extract higher profit from its base of registrants.

How does one justify a 3,000% price hike after stating “Our prices are fixed and only indexed to inflation after 5 years”?

Are they pricing these names in Zimbabwe Dollars? Or did they just change their minds in a way that hurt anyone who trusted them & invested in their ecosystem?

Frank Schilling warned about the dangers of lifting price controls:

The combination of “presumptive renewal” and the “lifting of price controls on registry services” is incredibly dangerous.
Imagine buying a home, taking on a large mortgage, remodeling, moving in, only to be informed 6 months later that your property taxes will go up 10,000% with no better services offered by local government. The government doesn’t care if you can’t pay your tax/mortgage because they don’t really want you to pay your tax… they want you to abandon your home so they can take your property and resell it to a higher payer for more money, pocketing the difference themselves, leaving you with nothing.

This agreement as written leaves the door open to exactly that type of scenario

He didn’t believe the practice to be poor.

Rather he felt he would have been made poorer, unless he was the person doing it:

It would be the mother of all Internet tragedies and a crippling blow to ICANN’s relevance if millions of pioneering registrants were taxed out of their internet homes as a result of the greed of one registry and the benign neglect, apathy or tacit support of its master.

It is a highly nuanced position.

from SEO Book http://www.seobook.com/new-tlds-are-junk

Google & Facebook Squeezing Out Partners

Just Make Great Content…

Remember the whole shtick about good, legitimate, high-quality content being created for readers without concern for search engines – as though search engines did not exist?

Whatever happened to that?

We quickly shifted from the above “ideology” to this:

The red triangle/exclamation point icon was arrived at after the Chrome team commissioned research around the world to figure out which symbols alarmed users the most.

Search Engine Engineering Fear

Google is explicitly spreading the message that they are doing testing on how to create maximum fear to try to manipulate & coerce the ecosystem to suit their needs & wants.

At the same time, the Google AMP project is being used as the foundation of effective phishing campaigns.

Scare users off of using HTTP sites AND host phishing campaigns.

Killer job Google.

Someone deserves a raise & some stock options. Unfortunately that person is in the PR team, not the product team.

Ignore The Eye Candy, It’s Poisoned

I’d like to tell you that I was preparing the launch of https://amp.secured.mobile.seobook.com but awareness of past ecosystem shifts makes me unwilling to make that move.

I see it as arbitrary hoop jumping not worth the pain.

If you are an undifferentiated publisher without much in the way of original thought, then jumping through the hoops makes sense. But if you deeply care about a topic and put a lot of effort into knowing it well, there’s no reason to do the arbitrary hoop jumping.

Remember how mobilegeddon was going to be the biggest thing ever? Well I never updated our site layout here & we still outrank a company which raised & spent 10s of millions of dollars for core industry terms like [seo tools].

Though it is also worth noting that after factoring in increased ad load with small screen sizes & the scrape graph featured answer stuff, a #1 ranking no longer gets it done, as we are well below the fold on mobile.

Below the Fold = Out of Mind

In the above example I am not complaining about ranking #5 and wishing I ranked #2, but rather stating that ranking #1 organically has little to no actual value when it is a couple screens down the page.

Google indicated their interstitial penalty might apply to pop ups that appear on scroll, yet Google welcomes itself to installing a toxic enhanced version of the Diggbar at the top of AMP pages, which persistently eats 15% of the screen & can’t be dismissed. An attempt to dismiss the bar leads the person back to Google to click on another listing other than your site.

As bad as I may have made mobile search results appear earlier, I was perhaps being a little too kind. Google doesn’t even have mass adoption of AMP yet & they already have 4 AdWords ads in their mobile search results, AND when you scroll down the page they are testing an ugly “back to top” button which outright blocks a user’s view of the organic search results.

What happens when Google suggests what people should read next as an overlay on your content & sells that as an ad unit where if you’re lucky you get a tiny taste of the revenues?

Is it worth doing anything that makes your desktop website worse in an attempt to try to rank a little higher on mobile devices?

Given the small screen size of phones & the heavy ad load, the answer is no.

I realize that optimizing a site design for mobile or desktop is not mutually exclusive. But it is an issue we will revisit later on in this post.

Coercion Which Failed

Many people new to SEO likely don’t remember the importance of using Google Checkout integration to lower AdWords ad pricing.

You either supported Google Checkout & got about a 10% CTR lift (& thus 10% reduction in click cost) or you failed to adopt it and got priced out of the market on the margin difference.
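
A simplified sketch of the auction mechanics implied there: AdWords ordered ads roughly by bid multiplied by CTR, so matching a competitor’s ad rank with a ~10% higher CTR takes roughly a 10% lower bid. The numbers below are illustrative, not actual auction data.

```python
# Simplified ad auction: position is ordered roughly by bid * CTR, so a
# ~10% CTR lift (e.g. from the Checkout badge) holds the same position
# at a ~10% lower bid. Illustrative numbers, not real auction data.
rank_to_beat = 1.00 * 0.050   # competitor: $1.00 bid at a 5.0% CTR

def bid_needed(ctr):
    """Minimum bid to match the competitor's ad rank at a given CTR."""
    return rank_to_beat / ctr

print(f"bid needed at 5.0% CTR: ${bid_needed(0.050):.3f}")  # $1.000
print(f"bid needed at 5.5% CTR: ${bid_needed(0.055):.3f}")  # ~$0.909
```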

And if you chose to adopt it, the bad news was you were then spending yet again to undo it when the service was no longer worth running for Google.

How about when Google first started hyping HTTPS & publishers using AdSense saw their ad revenue crash because the ads were no longer anywhere near as relevant?

Oops.

Not like Google cared much, as it is their goal to shift as much of the ad spend as they can onto Google.com & YouTube.

It is not an accident that Google funds an ad blocker which allows ads to stream through on Google.com while leaving ads blocked across the rest of the web.

Android Pay might be worth integrating. But then it also might go away.

It could be like Google’s authorship. Hugely important & yet utterly trivial.
Faces help people trust the content.
Then they are distracting visual clutter that needs expunging.
Then they once again re-appear but ONLY on the Google Home Service ad units.
They were once again good for users!!!

Neat how that works.

Embrace, Extend, Extinguish

Or it could be like Google Reader. A free service which defunded all competing products & then was shut down because it didn’t have a legitimate business model, having been built explicitly to prevent competition. With the death of Google Reader many blogs also slid into irrelevancy.

Their FeedBurner acquisition was icing on the cake.

Techdirt is known for generally being pro-Google & they recently summed up FeedBurner nicely:

Thanks, Google, For Fucking Over A Bunch Of Media Websites – Mike Masnick

Ultimately Google is a horrible business partner.

And they are an even worse one if there is no formal contract.

Dumb Pipes, Dumb Partnerships

They tried their best to force broadband providers to be dumb pipes. At the same time they promoted regulation which will prevent broadband providers from tracking their own users the way that Google does, all the while broadening out Google’s privacy policy to allow personally identifiable web tracking across their network. Once Google knew they would retain an indefinite tracking advantage over broadband providers they were free to rescind their (heavily marketed) free tier of Google Fiber & they halted the Google Fiber build out.

When Google routinely acts so anti-competitive & abusive it is no surprise that some of the “standards” they propose go nowhere.

You can only get screwed so many times before you adopt a spirit of ambivalence to the avarice.

Google is the type of “partner” that conducts security opposition research on their leading distribution partner, while conveniently ignoring nearly a billion OTHER Android phones with existing security issues that Google can’t be bothered with patching.

Deliberately screwing direct business partners is far worse than coding algorithms which belligerently penalize some competing services all the while ignoring that the payday loan shop funded by Google leverages doorway pages.

“User” Friendly

BackChannel recently published an article foaming at the mouth with excitement over Google’s AI:

This 2016-to-2017 Transition is going to move us from systems that are explicitly taught to ones that implicitly learn.” … the engineers might make up a rule to test against—for instance, that “usual” might mean a place within a 10-minute drive that you visited three times in the last six months. “It almost doesn’t matter what it is — just make up some rule,” says Huffman. “The machine learning starts after that.

The part of the article I found most interesting was the following bit:

After three years, Google had a sufficient supply of phonemes that it could begin doing things like voice dictation. So it discontinued the [phone information] service.

Google launches “free” services with an ulterior data motive & then when it suits their needs, they’ll shut it off and leave users in the cold.

As Google keeps advancing their AI, what do you think happens to your AMP content they are hosting? How much do they squeeze down on your payout percentage on those pages? How long until the AI is used to recap / rewrite content? What ad revenue do you get when Google offers voice answers pulled from your content but sends you no visitor?

The Numbers Can’t Work

A recent Wall Street Journal article highlighting the fast ad revenue growth at Google & Facebook also mentioned how the broader online advertising ecosystem was doing:

Facebook and Google together garnered 68% of spending on U.S. online advertising in the second quarter—accounting for all the growth, Mr. Wieser said. When excluding those two companies, revenue generated by other players in the U.S. digital ad market shrank 5%

The issue is NOT that online advertising has stalled, but rather that Google & Facebook have choked off their partners from tasting any of the revenue growth. This problem will only get worse as mobile grows to a larger share of total online advertising:

By 2018, nearly three-quarters of Google’s net ad revenues worldwide will come from mobile internet ad placements. – eMarketer

Media companies keep trusting these platforms with greater influence over their business & these platforms keep screwing those same businesses repeatedly.

You pay to get likes, but that is no longer enough as EdgeRank declines. Thanks for adopting Instant Articles, but users would rather see live videos & read posts from their friends. You are welcome to pay once again to advertise to the following you already built. The bigger your audience, the more we will charge you! Oh, and your direct competitors can use people liking your business as an ad targeting group.

Worse yet, Facebook & Google are even partnering on core Internet infrastructure.

Any hope of AMP turning the corner on the revenue front is a “no go”:

“We want to drive the ecosystem forward, but obviously these things don’t happen overnight,” Mr. Gingras said. “The objective of AMP is to have it drive more revenue for publishers than non-AMP pages. We’re not there yet.”

Publishers who are critical of AMP were reluctant to speak publicly about their frustrations, or to remove their AMP content. One executive said he would not comment on the record for fear that Google might “turn some knob that hurts the company.”

Look at that.

Leadership through fear once again.

At least they are consistent.

As more publishers adopt AMP, each publisher in the program will get a smaller share of the overall pie.

Just look at Google’s quarterly results for their current partners. They keep showing Google growing their ad clicks at 20% to 40% while partners oscillate between -15% and +5% quarter after quarter, year after year.

In the past quarter Google grew their ad clicks 42% YoY by pushing a bunch of YouTube auto play video ads, faster search growth in third world markets with cheaper ad prices, driving a bunch of lower quality mobile search ad clicks (with 3 then 4 ads on mobile) & increasing the percent of ad clicks on “own brand” terms (while sending the FTC after anyone who agrees not to cross bid on competitors’ brands).

The lower quality video ads & mobile ads in turn drove their average CPC on their sites down 13% YoY.

The partner network is relatively squeezed out on mobile, which makes it shocking to see the partner CPC off more than core Google, with a 14% YoY decline.

What ends up happening is eventually the media outlets get sufficiently defunded to where they are sold for a song to a tech company or an executive at a tech company. Alibaba buying SCMP is akin to Jeff Bezos buying The Washington Post.

The Wall Street Journal recently laid off reporters. The New York Times announced they were cutting back local cultural & crime coverage.

If news organizations of that caliber can’t get the numbers to work then the system has failed.

The Guardian is literally incinerating over 5 million pounds per month. ABC is staging fake crime scenes (that’s one way to get an exclusive).

The Tribune Company, already through bankruptcy & perhaps the dumbest of the lot, plans to publish thousands of AI assisted auto-play videos in their articles every day. That will guarantee their user experience on their owned & operated sites is worse than just about anywhere else their content gets distributed to, which in turn means they are not only competing against themselves but they are making their own site absolutely redundant & a chore to use.

That the Denver Guardian (an utterly fake paper running fearmongering false stories) goes viral is just icing on the cake.

These tech companies are literally reshaping society & are sucking the life out of the economy, destroying adjacent markets & bulldozing regulatory concerns, all while offloading costs onto everyone else around them.

An FTC report recommended suing Google for their anti-competitive practices, but no suit was brought. The US Copyright Office Register was relieved of her job after she went against Google’s views on set top boxes.

And in spite of the growing importance of tech, media coverage of the industry is a trainwreck:

This is what it’s like to be a technology reporter in 2016. Freebies are everywhere, but real access is scant. Powerful companies like Facebook and Google are major distributors of journalistic work, meaning newsrooms increasingly rely on tech giants to reach readers, a relationship that’s awkward at best and potentially disastrous at worst.

Being a conduit breeds exclusives. Challenging the grand narrative gets one blackballed.

Mobile Search Index

Google announced they are releasing a mobile first search index:

Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we’re going to continue to build a great search experience for all users, whether they come from mobile or desktop devices.

There are some forms of content that simply don’t work well on a 350 pixel wide screen, unless they use a pinch to zoom format. But using that format is seen as not being mobile friendly.

Imagine you have an auto part database which lists alternate part numbers, price, stock status, nearest store with part in stock, time to delivery, etc. … it is exceptionally hard to get that information to look good on a mobile device. And good luck if you want to add sorting features on such a table.

There is a valid point to the theory that ranking mobile results off the desktop version of a page is flawed, since mobile users might find something which is only available on the desktop version of a site. BUT, at the same time, a publisher may need to simplify the mobile site & hide data to improve usability on small screens & then only allow certain data to become visible through user interactions. Not showing those automotive part databases to desktop users would ultimately make desktop search results worse for users by leaving huge gaps in the search results. And a search engine choosing to not index the desktop version of a site because there is a mobile version is equally short-sighted. Desktop users would no longer be able to find & compare information from those automotive parts databases.

Once again money drives search “relevancy” signals.

Since Google will soon make 3/4 of their ad revenues on mobile, that should be the primary view of the web for everyone else, & alternate versions of sites which are not mobile friendly should be disappeared from the search index if a crappier lite mobile-friendly version of the page is available.

Amazon converts well on mobile in part because people already trust Amazon & already have an account registered with them. Most other merchants won’t be able to convert at anywhere near as high a rate on mobile as they do on desktop, so if you have to choose between a mobile friendly version that leaves differentiated aspects hidden or a desktop friendly version that is differentiated & establishes a relationship with the consumer, the deeper & more engaging desktop version is the way to go.

The heavy ad load on mobile search results combines with the low conversion rates on mobile to make building a relationship on desktop that much more important.

Even TripAdvisor is struggling to monetize mobile traffic, monetizing it at only about 30% to 33% of the rate they monetize desktop & tablet traffic. Google already owns most of the profits from that market.

Webmasters are better off NOT going mobile friendly than going mobile friendly in a way that compromises the ability of their desktop site.

I am not the only one suggesting an over-simplified mobile design that carries over to a desktop site is a losing proposition. Consider Nielsen Norman Group’s take:

in the current world of responsive design, we’ve seen a trend towards insufficient information density and simplifying sites so that they work well on small screens but suboptimally on big screens.

Tracking Users

Publishers are getting squeezed to subsidize the primary web ad networks. But the narrative is that as cross-device tracking improves some of those benefits will eventually spill back out into the partner network.

I am rather skeptical of that theory.

Facebook already makes 84% of their ad revenue from mobile devices where they have great user data.

They are paying to bring new types of content onto their platform, but they are only just now beginning to get around to test pricing their Audience Network traffic based on quality.

Priorities are based on business goals and objectives.

Both Google & Facebook paid fines & faced public backlash for how they track users. Those tracking programs were considered high priority.

When these ad networks are strong & growing quickly they may be able to take a stand, but when growth slows the stock prices crumble, data security becomes less important during downsizing when morale is shattered & talent flees. Further, creating alternative revenue streams becomes vital “to save the company” even if it means selling user data to dangerous dictators.

The other big risk of such tracking is how data can be used by other parties.

Spooks preferred to use the Google cookie to spy on users. And now Google allows personally identifiable web tracking.

Data is being used in all sorts of crazy ways the central ad networks are utterly unaware of. These crazy policies are not limited to other countries. Buying dog food with your credit card can lead to pet licensing fees. Even cheerful “wellness” programs may come with surprises.

from SEO Book http://www.seobook.com/securing-fear-and-mobile-monetization

Penguin 4.0 Update

On Friday Google’s Gary Illyes announced Penguin 4.0 was now live.

Key points highlighted in their post are:

  • Penguin is a part of their core ranking algorithm
  • Penguin is now real-time, rather than something which periodically refreshes
  • Penguin has shifted from being a sitewide negative ranking factor to a more granular factor

Things not mentioned in the post:

  • if it has been tested extensively over the past month
  • if the algorithm is just now rolling out or if it is already done rolling out
  • if the launch of a new version of Penguin rolled into the core ranking algorithm means old sites hit by the older versions of Penguin have recovered or will recover anytime soon

Since the update was announced, the search results have become more stable.

They still may be testing out fine tuning the filters a bit…

…but what exists now is likely to be what sticks for an extended period of time.

Penguin Algorithm Update History

  • Penguin 1: April 24, 2012
  • Penguin 2: May 26, 2012
  • Penguin 3: October 5, 2012
  • Penguin 4: May 22, 2013 (AKA: Penguin 2.0)
  • Penguin 5: October 4, 2013 (AKA Penguin 2.1)
  • Penguin 6: rolling update which began on October 17, 2014 (AKA Penguin 3.0)
  • Penguin 7: September 23, 2016 (AKA Penguin 4.0)

Now that Penguin is baked into Google’s core ranking algorithms, no more Penguin updates will be announced. Panda updates stopped being announced last year. Instead we now get unnamed “quality” updates.

Volatility Over the Long Holiday Weekend

Earlier in the month many SEOs saw significant volatility in the search results, beginning ahead of Labor Day weekend with a local search update. The algorithm update observations were dismissed as normal fluctuations in spite of the search results being more volatile than they have been in over 4 years.

There are many reasons for search engineers to want to roll out algorithm updates (or at least test new algorithms) before a long holiday weekend:

  • no media coverage: few journalists on the job & a lack of expectation that the PR team will answer any questions. no official word beyond rumors from self-promotional marketers = no story
  • many SEOs outside of work: few are watching as the algorithms tip their cards.
  • declining search volumes: long holiday weekends generally have less search volume associated with them. Thus anyone who is aggressively investing in SEO may wonder if their site was hit, even if it wasn’t.
    The communications conflicts this causes between in-house SEOs and their bosses, as well as between SEO companies and their clients, both make the job of the SEO more miserable and make the client more likely to pull back on investment, while ensuring the SEO has family issues back home as work ruins their vacation.
  • fresh users: as people travel their search usage changes, thus they have fresh sets of eyes & are doing somewhat different types of searches. This in turn makes their search usage data more dynamic and useful as a feedback mechanism on any changes made to the underlying search relevancy algorithm or search result interface.

Algo Flux Testing Tools

Just about any of the algorithm volatility tools showed a far more significant shift earlier this month than over the past few days.

Take your pick: Mozcast, RankRanger, SERPmetrics, Algaroo, Ayima Pulse, AWR, Accuranker, SERP Watch & the results came out something like this graph from Rank Ranger:

One issue with looking at any of the indexes is the rank shifts tend to be far more dramatic as you move away from the top 3 or 4 search results, so the algorithm volatility scores are much higher than the actual shifts in search traffic (the least volatile rankings are also the ones with the most usage data & ranking signals associated with them, so the top results for those terms tend to be quite stable outside of verticals like news).

You can use AWR’s flux tracker to see how volatility is higher across the top 20 or top 50 results than it is across the top 10 results.
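
A toy flux metric makes the distinction visible: raw position churn counts every move equally, while weighting each position by a rough CTR curve shows how little of the click traffic actually shifts when the top results hold steady. The CTR figures below are coarse illustrative values, not any tool’s published methodology.

```python
# Toy SERP flux metric: raw churn vs CTR-weighted churn.
# The CTR curve is a rough illustration, not any tool's real methodology.
approx_ctr = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
              6: 0.04, 7: 0.03, 8: 0.03, 9: 0.02, 10: 0.02}

def flux(before, after, weighted=False):
    """before/after: dicts mapping URL -> rank on consecutive days."""
    total = 0.0
    for url, rank in before.items():
        shift = abs(rank - after.get(url, 11))  # treat dropped URLs as #11
        weight = approx_ctr.get(rank, 0.01) if weighted else 1.0
        total += weight * shift
    return total

# Top results hold steady while positions 7-9 churn:
before = {"a": 1, "b": 2, "c": 7, "d": 8, "e": 9}
after  = {"a": 1, "b": 2, "c": 9, "d": 10, "e": 7}
print("raw flux:", flux(before, after))                           # 6.0
print("CTR-weighted flux:", round(flux(before, after, True), 2))  # 0.16
```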

Example Ranking Shifts

I shut down our membership site in April & spend most of my time reading books & news to figure out what’s next after search. But a couple legacy clients I am winding down working with still have me tracking a few keywords, & one of the terms saw a lot of smaller sites (in terms of brand awareness) repeatedly slide and recover over the past month.

Notice how a number of sites would spike down on the same day & then back up. And then the pattern would repeat.

As a comparison, here is that chart over the past 3 months.

Notice the big ranking moves which became common over the past month were not common the 2 months prior.

Negative SEO Was Real

There is a weird sect of alleged SEOs which believes Google is omniscient, algorithmic false positives are largely a myth, AND negative SEO was never a real thing.

As it turns out, negative SEO was real, which likely played a part in Google taking years to roll out this Penguin update AND changing how they process Penguin from a sitewide negative factor to something more granular.

Update != Penalty Recovery

Part of the reason many people think there was no Penguin update, or responded to the update with “that’s it?”, is that few sites which were hit in the past recovered relative to the number of sites which ranked well until recently & just got clipped by this algorithm update.

When Google updates algorithms or refreshes data it does not mean sites which were previously penalized will immediately rank again.

Some penalties (absent direct Google investment or nasty public relations blowback for Google) require a set amount of time to pass before recovery is even possible.

Google has no incentive to allow a broad-based set of penalty recoveries on the same day they announce a new “better than ever” spam fighting algorithm.

They’ll let some time pass before the penalized sites can recover.

Further, many of the sites which were hit years ago & remain penalized have been so defunded for so long that they’ve accumulated other penalties due to things like tightening anchor text filters, poor user experience metrics, ad heavy layouts, link rot & neglect.

What to do?

So here are some of the obvious algorithmic holes left by the new Penguin approach…

  • only kidding
  • not sure that would even be a valid mindset in the current market
  • hell, the whole ecosystem is built on quicksand

The trite advice is to make quality content, focus on the user, and build a strong brand.

But you can do all of those well enough that you change the political landscape yet still lose money.

Google & Facebook are in a cold war, competing to see who can kill the open web faster, using each other as justification for their own predation.

Even some of the top brands in big money verticals which were known as the canonical examples of SEO success stories are seeing revenue hits and getting squeezed out of the search ecosystem.

And that is without getting hit by a penalty.

It is getting harder to win in search, period.

And it is getting almost impossible to win in search by focusing on search as an isolated channel.

Efforts and investments in chasing the algorithms in isolation are getting less viable by the day.

Anyone operating at scale chasing SEO with automation is likely to step into a trap.

When it happens, that player better have some serious savings or some non-Google revenues, because even with “instant” algorithm updates you can go months or years on reduced revenues waiting for an update.

And if the bulk of your marketing spend while penalized is spent on undoing past marketing spend (rather than building awareness in other channels outside of search) you can almost guarantee that business is dead.

“If you want to stop spam, the most straight forward way to do it is to deny people money because they care about the money and that should be their end goal. But if you really want to stop spam, it is a little bit mean, but what you want to do, is break their spirits.” – Matt Cutts

from SEO Book http://www.seobook.com/penguin-40-update

Free Google AdWords Keyword Suggestion Tool Alternative

Google recently made it much harder to receive accurate keyword data from the AdWords keyword tool.

They have not only grouped similar terms, but also broadened the data ranges to absurdly wide spans like 10,000 to 100,000 searches a month. Only active AdWords advertisers receive (somewhat?) decent keyword data. And even with that, there are limitations. Try to view too many terms and you get:

“You’ve reached the maximum number of page views for this day. This page now shows ranges for search volumes. For a more detailed view, check back in 24 hours.”

Jennifer Slegg shared a quote from an AdWords advertiser who spoke with a representative:

“I have just spoken to a customer service manger from the Australia support help desk. They have advised me that there must be continuous activity in your google ad-words campaign (clicks and campaigns running) for a minimum of 3-4 months continuous in order to gain focused keyword results. If you are seeing a range 10-100 or 100-1k or 1k -10k its likely your adwords account does not have an active campaign or has not had continuous campaigns or clicks.”

So you not only need to be an advertiser, but you need to stay active for a quarter-year to a third of a year to get decent data.

Part of the sales pitch of AdWords/PPC was that you can see performance data right away, whereas SEO investments can take months or years to back out.

But with Google outright hiding keyword data even from active advertisers, it is probably easier and more productive for those advertisers to start elsewhere.

There are many other keyword data providers (Wordtracker, SEMrush, Wordze, Spyfu, KeywordSpy, Keyword Discovery, Moz, Compete.com, SimilarWeb, Xedant, Ubersuggest, KeywordTool.io, etc.). And there are newer entrants like the Keyword Keg Firefox extension & the brilliantly named KeywordShitter.

In light of Google’s push to help make the web more closed-off & further tilt the web away from the interests of searchers toward the interests of big advertisers, we decided to do the opposite & recently upgraded our keyword tool to add the following features…

  • expanded the results per search to 500
  • we added negative match and modified broad match to the keyword export spreadsheet, along with the existing phrase, broad & exact match formats (see the sketch below)
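
For reference, here is a minimal sketch of how those match-type formats are typically derived from a raw keyword, using the classic AdWords syntax (the actual export logic of the tool may differ):

```python
# Classic AdWords match-type formats derived from a raw keyword.
# A minimal sketch; the tool's actual export logic may differ.

def match_types(keyword: str) -> dict:
    words = keyword.split()
    return {
        "broad": keyword,                                    # payday loans
        "phrase": f'"{keyword}"',                            # "payday loans"
        "exact": f"[{keyword}]",                             # [payday loans]
        "modified broad": " ".join(f"+{w}" for w in words),  # +payday +loans
        "negative": f"-{keyword}",                           # -payday loans
    }

for name, formatted in match_types("payday loans").items():
    print(f"{name}: {formatted}")
```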

Our keyword tool lists estimated search volumes, bid prices, cross links to SERPs, etc. It does require a one-time free account registration, but the tool itself is free. And we don’t collect phone numbers, hard sell over the phone, etc. We even shut down our paid members area, so you are not likely to receive any marketing messages from us anytime soon.

Export is lightning quick AND, more importantly, we have a panda in our logo!

Here is what the web interface looks like:

And here is a screenshot of the data in Excel with the various keyword match types:

If the tool looks like it is getting decent usage, we may upgrade it further to refresh the data more frequently, consider adding more languages, add a few more reference links to related niche sites in the footer cross-reference section, and maybe add a few other features.

“Every market has some rules and boundaries that restrict freedom of choice. A market looks free only because we so unconditionally accept its underlying restrictions that we fail to see them.” – Ha-Joon Chang

from SEO Book http://www.seobook.com/keyword-tool-upgrades
via KCG Auto Feed

Reinventing SEO

Back in the Day…

If you are new to SEO it is hard to appreciate how easy SEO was, say, 6 to 8 years ago.

Almost everything worked quickly, cheaply, and predictably.

Go back a few years earlier and you could rank a site without even looking at it. 😀

Links, links, links.

Meritocracy to Something Different

Back then sharing SEO information acted like a meritocracy. If you had something fantastic to share & it worked great you were rewarded. Sure you gave away some of your competitive advantage by sharing it publicly, but you would get links and mentions and recommendations.

These days most of the best minds in SEO don’t blog often. And some of the authors who frequently publish literally everywhere are a series of ghostwriters.

Further, most of the sharing has shifted to channels like Twitter, where the half-life of the share is maybe a couple hours.

Yet if you share something which causes search engineers to change their relevancy algorithms in response the half-life of that algorithm shift can last years or maybe even decades.

Investing Big

These days breaking in can be much harder. I see some sites that are 3 or 4 months old with over 1,000 high quality links, sites which have clearly invested deep into 6 figures, that appear to be getting about 80 organic search visitors a month.

From a short enough timeframe it appears nothing works, even if you are using a system which has worked, should work, and is currently working on other existing & trusted projects.

Time delays have an amazing impact on our perceptions and how our reward circuitry is wired.

Most of the types of people who have the confidence and knowledge to invest deep into 6 figures on a brand new project aren’t creating “how to” SEO information and giving it away for free. Doing so would only harm their earnings and lower their competitive advantage.

Derivatives, Amplifications & Omissions

Most of the info created about SEO today is derivative (people who write about SEO but don’t practice it) or people overstating the risks and claiming x and y and z don’t work, can’t work, and will never work.

And then from there you get the derivative amplifications of don’t, can’t, won’t.

And then there are people who read an old blog post about how things were x years ago and write as though everything is still the same.

Measuring the Risks

If you are using lagging knowledge from derivative “experts” to drive strategy you are most likely going to lose money.

  • First, if you are investing in conventional wisdom then there is little competitive advantage to that investment.
  • Secondly, as techniques become more widespread and widely advocated Google is more likely to step in and punish those who use those strategies.
  • It is when a strategy is most widely used and seems safest that the risk is at its peak while the rewards are de minimis.

With all the misinformation, how do you find out what works?

Testing

You can pay for good advice. But most people don’t want to do that; they’d rather lose. 😉

The other option is to do your own testing. Then when you find out somewhere where conventional wisdom is wrong, invest aggressively.

“To invent you have to experiment, and if you know in advance that it’s going to work, it’s not an experiment. Most large organizations embrace the idea of invention, but are not willing to suffer the string of failed experiments necessary to get there. Outsized returns often come from betting against conventional wisdom, and conventional wisdom is usually right.” – Jeff Bezos

That doesn’t mean you should try to go against consensus view everywhere, but wherever you are investing the most it makes sense to invest in something that is either hard for others to do or something others wouldn’t consider doing. That is how you stand out & differentiate.

But to do your own testing you need to have a number of sites. If you have one site that means everything to you and you get wildly experimental then the first time one of those tests goes astray you’re hosed.
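
As a toy illustration of the bookkeeping such testing involves, you might randomly split a pool of comparable pages or sites into control and variant groups and compare organic clicks once the change has had time to settle. A rough sketch under obviously simplified assumptions (all names and numbers are illustrative):

    import random
    from statistics import median

    # Toy bookkeeping for an SEO split test: assign comparable pages to
    # control/variant at random, apply the change only to the variant,
    # then compare median organic clicks. Names & numbers are invented.
    def assign_groups(pages):
        shuffled = random.sample(pages, len(pages))
        half = len(shuffled) // 2
        return shuffled[:half], shuffled[half:]

    def effect_ratio(control_clicks, variant_clicks):
        return median(variant_clicks) / median(control_clicks)

    control, variant = assign_groups(["/page-%d" % i for i in range(20)])
    # ...apply the change to the variant pages, wait, pull click data...
    print(effect_ratio([110, 95, 120, 105], [150, 160, 140, 155]))  # ~1.42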

False Positives

And, even if you do nothing wrong, if you don’t build up a stash of savings you can still get screwed by a false positive. Even having a connection in Google may not be enough to overcome a false positive.

Cutts said, “Oh yeah, I think you’re ensnared in this update. I see a couple weird things. But sit tight, and in a month or two we’ll re-index you and everything will be fine.” Then like an idiot, I made some changes but just waited and waited. I didn’t want to bother him because he’s kind of a famous person to me and I didn’t want to waste his time. At the time Google paid someone to answer his email. Crazy, right? He just got thousands and thousands of messages a day.

I kept waiting. For a year and a half, I waited. The revenues kept trickling down. It was this long terrible process, losing half overnight but then also roughly 3% a month for a year and a half after. It got to the point where we couldn’t pay our bills. That’s when I reached out again to Matt Cutts, “Things never got better.” He was like, “What, really? I’m sorry.” He looked into it and was like, “Oh yeah, it never reversed. It should have. You were accidentally put in the bad pile.”

“How did you go bankrupt?”
“Two ways. Gradually, then suddenly.”
― Ernest Hemingway, The Sun Also Rises

True Positives

A lot of SEMrush charts look like the following:

What happened there?

Well, obviously that site stopped ranking.

But why?

You can’t be certain why without doing some investigation. And even then you can never be 100% certain, because you are dealing with a black box.

That said, there are constant shifts in the algorithms across regions and across time.
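
One small piece of that investigation can be automated: find the week traffic fell off a cliff and check whether it lands near a publicly documented algorithm update. A rough sketch; the update list and dates below are placeholders, not a real changelog:

    from datetime import date

    # Flag the steepest week-over-week traffic drop & check whether it
    # lands near a known algorithm update. Dates here are placeholders.
    KNOWN_UPDATES = {date(2016, 9, 23): "Penguin 4.0"}  # illustrative only

    def steepest_drop(weekly):
        weeks = sorted(weekly)
        pairs = zip(weeks, weeks[1:])
        return min(pairs, key=lambda p: weekly[p[1]] - weekly[p[0]])[1]

    def nearby_update(drop_date, window_days=14):
        for update_date, name in KNOWN_UPDATES.items():
            if abs((drop_date - update_date).days) <= window_days:
                return name
        return None

    traffic = {date(2016, 9, 2): 900, date(2016, 9, 9): 880,
               date(2016, 9, 16): 870, date(2016, 9, 23): 860,
               date(2016, 9, 30): 310}
    drop = steepest_drop(traffic)
    print(drop, nearby_update(drop))  # 2016-09-30 Penguin 4.0

Correlation with an update date is only a starting hypothesis, of course; as stated above, you are still dealing with a black box.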

Paraphrasing quite a bit here, but in this video Search Quality Senior Strategist at Google Andrey Lipattsev suggested…

He also explained the hole Google has in their Arabic index, where spam is much more effective because there is little useful content to index and rank & Google modeled their ranking algorithms largely on publishing strategies in the western world. Fixing many of these holes is less of a priority because they view evolving with mobile friendly, AMP, etc. as more important. They algorithmically ignore many localized issues & try to clean up some aspects manually. And whoever is winning via spam at the moment might not only lose to an algorithm update or a manual clean up; once Google has something great to rank there, it will eventually win, displacing some of the older spam on a near permanent basis. The new entrant raises the barrier to entry for the lower-quality stuff that was winning via sketchy means.

Over time the relevancy algorithms shift. As new ingredients get added to the algorithms & old ingredients get used in new ways it doesn’t mean that a site which once ranked

  • deserved to rank
  • will keep on ranking

In fact, sites which don’t get a constant stream of effort & investment are more likely to slide than have their rankings sustained.

The above SEMrush chart is for a site which uses the following as their header graphic:

When there is literally no competition and the algorithms are weak, something like that can rank.

But if Google looks at how well people respond to what is in the result set, a site as ugly as that is going nowhere fast.

Further, a site like that would struggle to get any quality inbound links or shares.

If nobody reads it then nobody will share it.

The content on the page could be Pulitzer prize level writing and few would take it seriously.

With that design, death is certain in many markets.

Many Ways to Become Outmoded

The above ugly header design with no taste and a really dumb condescending image is one way to lose. But there are also many other ways.

  • Excessive keyword repetition, like a footer with the same phrase repeated 100 times (simple enough to screen for; see the sketch after this list).
  • Excessive focus on monetization, to where most visitors quickly bounce back to the search results to click on a different listing.
  • Ignoring the growing impact of mobile.
  • Blowing out the content footprint with pagination and tons of lower quality backfill content.
  • Stale content full of outdated information and broken links.
  • A lack of investment in new content creation AND promotion.
  • Aggressive link anchor text combined with low quality links.
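
The keyword-stuffing failure mode at least is easy to screen for yourself. A minimal sketch that flags text whose most common short phrase repeats implausibly often; the threshold is an arbitrary illustrative choice:

    import re
    from collections import Counter

    # Flag text where some 3-word phrase repeats implausibly often,
    # e.g. a footer with the same phrase stamped out 100 times.
    def most_repeated_phrase(text, n=3):
        words = re.findall(r"[a-z']+", text.lower())
        grams = Counter(tuple(words[i:i + n]) for i in range(len(words) - n + 1))
        return grams.most_common(1)[0] if grams else (None, 0)

    def looks_stuffed(text, threshold=20):  # threshold is arbitrary
        phrase, count = most_repeated_phrase(text)
        return count >= threshold

    footer = "cheap payday loans " * 100
    print(most_repeated_phrase(footer))  # (('cheap', 'payday', 'loans'), 100)
    print(looks_stuffed(footer))         # True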

Investing in Other Channels

The harder & more expensive Google makes it to enter the search channel the greater incentive there is to spend elsewhere.

Why is Facebook doing so well? In part because Google did the search equivalent of what Yahoo! did with their web portal. The rich diversity in the tail was sacrificed to send users down well worn paths. If Google doesn’t want to rank smaller sites, their associated algorithmic biases mean Facebook and Amazon.com rank better, thus perhaps it makes more sense to play on those platforms & get Google traffic as a free throw-in.

Of course aggregate stats are useless and what really matters is what works for your business. Some may find Snapchat, Instagram, Pinterest or even long forgotten StumbleUpon as solid traffic drivers. Other sites might do well with an email newsletter and exposure on Twitter.

Each bit of exposure (anywhere) leads to further awareness. Which can in turn bleed into aggregate search performance.

People can’t explicitly look for you in a differentiated way unless they are already aware you exist.

Some amount of remarketing can make sense because it helps elevate the perceived status of the site, so long as it is not overdone. However if you are selling a product the customer already bought or you are marketing to marketers there is a good chance such investments will be money wasted while you alienate past customers.

Years ago people complained about an SEO site being far too aggressive with ad retargeting. And while surfing today I saw that same site running retargeting ads, such that you can’t scroll down the page far enough for their ad to disappear before seeing it once again.

If you don’t have awareness in channels other than search it is easy to get hit by an algorithm update if you rank in competitive markets, particularly if you managed to do so via some means which is the equivalent of, erm, stuffing the ballot box.

And if you get hit and then immediately run off to do disavows and link removals, and then only market your business in ways that are passively driven & tied to SEO, you’ll likely stay penalized for a long, long time.

While waiting for an update, you may find you are Waiting for Godot.

from SEO Book http://www.seobook.com/reinventing-seo
via KCG Auto Feed

Google Rethinking Payday Loans & Doorway Pages?

Nov 12, 2013 WSJ: Google Ventures Backs LendUp to Rethink Payday Loans

Google Ventures Partner Blake Byers joined LendUp’s board of directors with his firm’s investment. The investor said he expects LendUp to make short-term lending reasonable and favorable for the “80 million people banks won’t give credit cards to,” and help reshape what had been “a pretty terrible industry.”

What sort of strategy is helping to drive that industry transformation?

How about doorway pages.

These sorts of doorway pages are still live to this day. Simply look at the footer area of lendup.com/payday-loans

This is in spite of Google going out of their way last year to say they were going to kill those sorts of strategies.

March 16, 2015 Google To Launch New Doorway Page Penalty Algorithm

Google does not want to rank doorway pages in their search results. The purpose behind many of these doorway pages is to maximize their search footprint by creating pages both externally on the web or internally on their existing web site, with the goal of ranking multiple pages in the search results, all leading to the same destination.

Today we get journalists acting as conduits for Google’s public relations efforts, writing headlines like: Google: Payday Loans Are Too Harmful to Advertise.

Today those sorts of stories are literally everywhere.

Tomorrow the story will be over.

And when it is.

Precisely zero journalists will have covered the above contrasting behaviors.

As they weren’t in the press release.

Best yet, not only does Google maintain their investment in payday loans via LendUp, but there is also a bubble in the personal loans space, so Google will be able to show effectively the same ads for effectively the same service. And by the time the P2P loan bubble pops, some of the payday lenders will have followed LendUp’s lead in re-branding their offers as something else in name.

Meanwhile, Google is off to revolutionize the next industry by claiming everyone else is greedy and scummy and there is a wholesome way to do the same thing leveraging new technology, when in reality the primary difference between the business models is simply a thin veneer of tech-utopian PR misinformation.

Don’t expect to see a link to this blog post on TechCrunch.

There you’ll read some hard-hitting cutting edge tech news like:

Banks are so greedy that LendUp can undercut them, help people avoid debt, and still make a profit on its payday loans and credit card.

#MomentOfZeroTruth #ZMOT

from SEO Book http://www.seobook.com/google-rethinking-payday-loans
via KCG Auto Feed

The (Hollow) Soul of Technology

The Daily Obituary

As far as being an investable business goes, news is horrible.

And it is getting worse by the day.

Look at these top performers.

The above chart looks ugly, but in reality it puts an optimistic spin on things…

  • it has survivorship bias
  • the Tribune Company has already gone through bankruptcy
  • the broader stock market is up huge over the past decade after many rounds of quantitative easing and zero (or even negative) interest rate policy
  • the debt carrying costs of the news companies are also artificially low due to the central banking bond market manipulation
  • the Tribune Company recently got a pop on a buy out offer

Selling The Story

Almost all the solutions to the problems faced by the mainstream media are incomplete and ultimately will fail.

That doesn’t stop the market from selling magic push button solutions. The worse the fundamentals get, the more incentive (need) there is to sell the dream.

Video

Video will save us.

No it won’t.

Video is expensive to do well and almost nobody at any sort of scale on YouTube has an enviable profit margin. Even the successful individuals who are held up as the examples of success are being squeezed out, and Google is pushing to make the site more like TV. As they get buy-in from big players they’ll further squeeze out the indie players – just like general web search.

Even if TV shifts to the web, along with chunks of the associated ad budget, most of the profits will be kept by Google & ad tech management rather than flowing to publishers.

Some of the recent acquisitions are more about having more scale on an alternative platform or driving offline commerce rather than hoping for online ad revenue growth.

Expand Internationally

The New York Times is cutting back on their operations in Paris.

Spread Across Topics

What impact does it have on Marketwatch’s brand if you go there for stock information and they advise you on weight loss tips?

And, once again, when everyone starts doing that it is no longer a competitive advantage.

There have also been cases where newspapers like The New York Times acquired About.com only to later sell it for a loss. And now even About.com is unbundling itself.

Native Ads

The more companies who do them & the more places they are seen, the lower the rates go, the less novel they will seem, and the greater the likelihood a high-spending advertiser decides to publish it on their own site & then drive the audience directly to their site.

When it is rare or unique it stands out and is special, justifying the extra incremental cost. But when it is a scaled process it is no longer unique enough to justify the vastly higher cost.

Further, as it gets more pervasive it will lead to questions of editorial integrity.

Get Into Affiliate Marketing

It won’t scale across all the big publishers. It only works well at scale in select verticals and as more entities test it they’ll fill up the search results and end up competing for a smaller slice of attention. Further, each new affiliate means every other affiliate’s cookie lasts for a shorter duration.
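
The cookie point follows from last-click attribution: whichever affiliate the user clicks through last overwrites everyone else’s tracking cookie, so the more affiliates a user encounters, the shorter any one cookie effectively lives. A toy simulation of that effect, with every parameter invented:

    import random

    # Toy model of last-click attribution: after your link sets the cookie,
    # the user clicks one random affiliate link per day among n affiliates.
    # Your cookie survives only until a rival's click overwrites it.
    def expected_cookie_life(n_affiliates, trials=10_000):
        total_days = 0
        for _ in range(trials):
            days = 0
            # Survive another day only if the next click is yours (prob 1/n).
            while random.randrange(n_affiliates) == 0:
                days += 1
            total_days += days
        return total_days / trials

    for n in (2, 5, 10):
        print(n, round(expected_cookie_life(n), 3))
    # ~1.0 days with one rival, ~0.25 with four, ~0.11 with nine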

It is unlikely news companies will be able to create commercially oriented review content at scale while having the depth of Wirecutter.

“We move as much product as a place 10 times bigger than us in terms of audience,” Lam said in an interview. “That’s because people trust us. We earn that trust by having such deeply-researched articles.”

Further, as it gets more pervasive it will lead to questions of editorial integrity.

Charging People to Comment

It won’t work, as it undermines the social proof of value the site would otherwise have from having many comments on it.

Meal Delivery Kits

Absurd. And a sign of extreme desperation.

Trust Tech Monopolies

Here is Doug Edwards on Larry Page:

He wondered how Google could become like a better version of the RIAA – not just a mediator of digital music licensing – but a marketplace for fair distribution of all forms of digitized content. I left that meeting with a sense that Larry was thinking far more deeply about the future than I was, and I was convinced he would play a large role in shaping it.

If we just give Google or Facebook greater control, they will save us.

No they won’t.

You are probably better off selling meal kits.

As time passes, Google and Facebook keep getting a larger share of the pie, growing their rake faster than the pie is growing.

Here is the RIAA’s Cary Sherman on Google & Facebook:

Just look at Silicon Valley. They’ve done an extraordinary job, and their market cap is worth gazillions of dollars. Look at the creative industries — not just the music industry, but all of them. All of them have suffered.

Over time media sites are becoming more reliant on platforms for distribution, with visitors having fleeting interest: “bounce rates on media sites having gone from 20% of visitors in the early 2000s to well over 70% of visitors today.”

Accelerated Mobile Pages and Instant Articles?

These are not solutions. They are only a further acceleration of the problem.

How will giving greater control to monopolies that are displacing you (while investing in AI) lead to a more sustainable future for copyright holders? If they host your content and you are no longer even a destination, what is your point of differentiation?

If someone else hosts your content & you are dependent on them for distribution, you are competing against yourself with an entity that can arbitrarily shift the terms on you whenever they feel like it.

“The cracks are beginning to show, the dependence on platforms has meant they are losing their core identity,” said Rafat Ali “If you are just a brand in the feed, as opposed to a brand that users come to, that will catch up to you sometime.”

Do you think you gain leverage over time as they become more dominant in your vertical? Not likely. Look at how Google’s redesigned image search shunted traffic away from the photographers. Google’s remote rater guidelines even mentioned giving lower ratings to images with watermarks on them. So if you protect your works you are punished, & if you don’t, good luck negotiating with a monopoly. You’ll probably need the EU to see any remedy there.

When something is an embarrassment to Google & can harm their PR, fixing it becomes a priority; otherwise most of the costs of rights management fall on the creative industry & Google will go out of their way to add cost to that process. Facebook is, of course, playing the same game with video freebooting.

Algorithms are not neutral and platforms change what they promote to suit their own needs.

As the platforms aim to expand into new verticals they create new opportunities, but those opportunities are temporal.

Whatever happened to Zynga?

Even Buzzfeed, the current example of success on Facebook, missed their revenue target badly, even as they became more dependent on the Facebook feed.

“One more implication of aggregation-based monopolies is that once competitors die the aggregators become monopsonies — i.e. the only buyer for modularized suppliers. And this, by extension, turns the virtuous cycle on its head: instead of more consumers leading to more suppliers, a dominant hold over suppliers means that consumers can never leave, rendering a superior user experience less important than a monopoly that looks an awful lot like the ones our antitrust laws were designed to eliminate.” – Ben Thompson

Long after benefit stops passing to the creative person the platform still gets to re-use the work. The Supreme Court only recently refused to hear the ebook scanning case & Google is already running stories about using romance novels to train their AI. How long until Google places their own AI-driven news rewrites in front of users?

Who then will fund journalism?

Dumb it Down

Remember how Panda was going to fix crap content for the web? eHow has removed literally millions of articles from their site & still has not recovered in Google. Demand Media’s bolt-on articles published on newspaper sites still rank great in Google, but that will at some point get saturated and stop being a growth opportunity, shifting from growth to zero sum to a negative sum market, particularly as Google keeps growing their knowledge scraper graph.

Now maybe if you dumb it down with celebrity garbage you get quick clicks from other channels and longterm SEO traffic doesn’t matter as much.

But if everyone is pumping the same crap into the feed it is hard to stand out. When everyone starts doing it the strategy is no longer a competitive advantage. Further, a business that is algorithmically optimized for short-term clicks is also optimizing for its own longterm irrelevancy.

Yahoo’s journalists used to joke amongst themselves about the extensive variety of Kind bars provided, but now the snacks aren’t being replenished. Instead, employees frequently remind each other that there is little reason to bother creating quality work within Yahoo’s vast eco-system of middle-brow content. “You are competing against Kim Kardashian’s ass,” goes a common refrain.

Yahoo’s billion-person-a-month home page is run by an algorithm, with a spare editorial staff, that pulls in the best-performing content from across the site. Yahoo engineers generally believed that these big names should have been able to support themselves, garner their own large audiences, and shouldn’t have relied on placement on the home page to achieve large audiences. As a result, they were expected to sink or swim on their own.

“Yahoo is reverting to its natural form,” a former staffer told me, “a crap home page for the Midwest.”

That is why Yahoo! ultimately had to shut down almost all their verticals. They were optimized algorithmically for short term wins rather than building things with longterm resonance.

Death by bean counter.

The above also has an incredibly damaging knock-on effect on society.

People miss the key news: “what articles got the most views, and thus ‘clicks.’ Put bluntly, it was never the articles on my catching Bernanke pulling system liquidity into the maw of the collapse in 2008, while he maintained to Congress he had done the opposite.” – Karl Denninger

The other issue is that PR is outright displacing journalism. As bad as that is at creating general disinformation, it gets worse when people presume diversity of coverage means a diversity of thought process, a diversity of work, and a diversity of sources. Even people inside the current presidential administration state how horrible this trend is for society:

“All these newspapers used to have foreign bureaus,” he said. “Now they don’t. They call us to explain to them what’s happening in Moscow and Cairo. Most of the outlets are reporting on world events from Washington. The average reporter we talk to is 27 years old, and their only reporting experience consists of being around political campaigns. That’s a sea change. They literally know nothing.” … “We created an echo chamber,” he told the magazine. “They [the seemingly independent experts] were saying things that validated what we had given them to say.”

That is basically the government complaining to the press about it being “too easy” to manipulate the press.

Adding Echo to the Echo

Much of what “seems” like an algorithm on the tech platforms is actually a bunch of lowly paid humans pretending to be an algorithm.

This goes back to the problem of the limited diversity in original sources and the rise of thin “take” pieces. Stories with an inconvenient truth can get suppressed, while “newsworthy” stories with multiple sources covering them may all use the same biased source.

After doing a tour in Facebook’s news trenches, almost all of them came to believe that they were there not to work, but to serve as training modules for Facebook’s algorithm. … A topic was often blacklisted if it didn’t have at least three traditional news sources covering it

As algorithms take over more aspects of our lives and eat more of the media ecosystem, the sources they feed upon will consistently lose quality until some sort of major reset happens.

The strategy to keep sacrificing the long term to hit the short term numbers can seem popular. And then, suddenly, death.

You can say the soul is gone
And the feeling is just not there
Not like it was so long ago.
– Neil Young, Stringman

Micropayments & Paywalls

It is getting cheap enough that just about anyone can run a paid membership site, but it is quite hard to create something worth paying for on a recurring basis.

There are a few big issues with paywalls:

  • If you have something unique and don’t market it aggressively then nobody will know about it. And, in fact, in some businesses your paying customers may have no interest in sharing your content because they view it as one of their competitive advantages. This was one of the big reasons I ultimately had to shut down our membership site.
  • If you do market something well enough to create demand then some other free sites will make free derivatives, and it is hard to keep having new things to write worth paying for in many markets. Eventually you exhaust the market or get burned out or stop resonating with it. Even free websites have churn. Paid websites have to bring in new members to offset old members leaving.
  • In most markets worth being in there is going to be plenty of free sites in the vertical which dominate the broader conversation. Thus you likely need to publish a significant amount of information for free which leads into an eventual sale. But knowing where to put the free line & how to move it over time isn’t easy. Over the past year or two I blogged far less than I should have if I was going to keep running our site as a paid membership site.
  • And the last big issue is that a paywall is basically counter to all the other above business models the mainstream media is trying. You need deeper content, better content, content that is not off topic, etc. Many of the easy wins for ad-funded media become easy losses for paid membership sites. And just like it is hard for newspapers to wean themselves off of print ad revenues, it can be hard to undo many of the quick-win ad revenue boosters if one wants to change their business model drastically. Regaining your soul takes time, and often, death.

“It’s only after we’ve lost everything that we’re free to do anything.” ― Chuck Palahniuk, Fight Club

from SEO Book http://www.seobook.com/newspaperhow
via KCG Auto Feed