Why Google Can’t Just Ignore Bad Links

Posted on May 16, 2013

A common complaint about Google’s often punitive approach to link policy violations goes something like this: “It’s not fair that there’s a negative ranking impact associated with bad links. Why can’t they just ignore them?”

I’ll come out and say that I support some sort of negative ranking factor for bad links, given the current ranking system. In fact, any SEO whose work is built on legitimate, hard-earned links should feel the same. It comes down to some simple math.

Basic PageRank Model

The first thing to understand is the basics of the PageRank model. Simply put, each site that links to a target site provides points used to estimate the importance of the target site. Not all links are treated equally, but to keep this example simple, they will be. We’ll refer to these as “link points” to avoid confusion with true PageRank.

Some types of links do not pass these all-important points. A link can be devalued by Google’s spam detection, marked “nofollow” by the linking site, or disavowed by the target site.

A penalty-free model

There are two basic types of links we’ll consider here. The first is a hard-fought link, earned through truly compelling content. Most would call this a true “organic” link. The second is a link that can be scaled easily with little effort and violates the guidelines, commonly called an “inorganic” link. These are the links Google wants to purge from its link point system, and they’re the kind that could potentially be purchased a thousand at a time for a few bucks. (Here’s an example of this type of junk.)

If Google’s detection of inorganic links were perfect, and every inorganic link were caught, the two strategies would score like this, with as few as five organic links against 10,000 inorganic ones:

Organic: 5 links x 100% valuable = 5 link points
Inorganic: 10,000 links x 0% valuable = 0 link points

Organic wins.  That’s what Google wants, and that’s what most legitimate webmasters would want as well.
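To make the arithmetic concrete, here’s a minimal Python sketch of this toy model. The link_points helper and the one-point-per-link weighting are my own illustrative assumptions, not anything from Google’s actual system.

```python
# A toy model: every link is worth one point, scaled by the share
# of links that actually pass value after detection.
def link_points(n_links, valuable_rate):
    return n_links * valuable_rate

organic = link_points(5, 1.0)         # 5 organic links, all pass value
inorganic = link_points(10_000, 0.0)  # 10,000 spam links, all detected

print(organic, inorganic)  # 5.0 0.0 -> organic wins
```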

However, it’s unrealistic to assume that detection will ever be perfect. Let’s assume a three-nines level of detection: getting 99.9% of links correctly identified as inorganic changes the math quite a bit. As a side note, I’d wager even that level of detection is much better than the real one. Now it looks like this:

Organic: 5 links x 100% valuable = 5 link points
Inorganic: 10,000 links x 0.1% valuable = 10 link points

Inorganic wins.

Even with 99.9% detection, thousands of spammy links can now beat those hard-fought, truly organic links. The junk might have cost $10. Imagine what thousands of dollars could do.
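Plugging the 99.9% detection rate into the same toy sketch shows how little slippage it takes. The function repeats the illustrative model from above; the break-even count is just simple algebra.

```python
def link_points(n_links, valuable_rate):
    return n_links * valuable_rate

organic = link_points(5, 1.0)           # 5 points
inorganic = link_points(10_000, 0.001)  # only 0.1% slip past detection

print(organic, inorganic)  # 5.0 10.0 -> inorganic wins

# Break-even: spam wins once n * 0.001 > 5, i.e. just over
print(5 / 0.001)  # 5000.0 junk links beat five organic ones
```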

Model with Penalty

No one besides a spammer truly wants the inorganic strategy to win. If influence on search engines were purely bought and sold, the results wouldn’t be good for the web as a whole, whether we’re talking about large brands or small entrants.

If a penalty for bad links is introduced, the imperfect detection can be overcome. If as few as 0.1% of links are flagged as bad enough to carry a negative penalty, the math swings back to organic, even at lower detection levels. More or less, the penalty rate only has to slightly exceed the detection error:

Organic: 5 links x 100% valuable = 5 link points
Inorganic: (10,000 links x 0.1% valuable) - (10,000 links x 0.1% penalty) = 10 - 10 = 0 net link points

Organic wins again. This is the outcome we want, and it only happens with a penalty that offsets imperfect detection.
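Here’s the penalized version of the same sketch. The penalty_rate parameter is an assumption for illustration, but it makes the key property visible: the net score scales with (miss rate - penalty rate), so once the penalty rate matches the detection error, adding more spam links never helps.

```python
def net_link_points(n_links, miss_rate, penalty_rate):
    # Undetected links add one point each; penalized links subtract one each.
    return n_links * (miss_rate - penalty_rate)

organic = 5 * 1.0                                  # 5 points, no penalties
inorganic = net_link_points(10_000, 0.001, 0.001)  # 10 earned - 10 penalized

print(organic, inorganic)  # 5.0 0.0 -> organic wins again

# The sign of n * (miss_rate - penalty_rate) depends only on the rates,
# so a penalty rate >= the miss rate can't be beaten by sheer volume.
```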

The problem with penalties

This model isn’t perfect, even if it’s likely the one in use. The most commonly noted flaw is an inability to determine who placed a link. That opens up the possibility of “negative SEO,” where a bad actor points penalty-carrying links at a competitor. While a negative SEO target can use the disavow tool, that extra effort usually isn’t undertaken until harm has already been done.

In addition, penalty links can produce false positives. A site that has been hacked may see its legitimate links lumped in with those generated by the hack.

What we should try as SEOs

Another common refrain I hear, especially when cleaning up bad links, is that it’s not our job as SEOs to clean up the spam on the web.

Actually it is, at least in part.

We should want to help with spam detection and purges, if only for the simple reason that as the detection ratio improves, the need for penalties lessens. Penalties are only needed to offset imperfect detection. A lower penalty ratio means less likelihood of false positives or negative SEO. Everyone who provides legitimate value wins in that case.

TL;DR

Penalties are needed to offset imperfect spam detection. As long as we don’t want spam to win, we have to accept penalties, but the long-term win comes from better spam detection.


By Steve Hammer

Steve is the President of RankHammer. When he's not working with clients to grow online, he's probably looking for a great restaurant no one's heard of yet. He is fully AdWords Certified (Analytics, AdWords and Display) and a graduate of the Kellogg School of Management.