Google’s Continuing War on Paid Links

Aaron Wall put up a post about a new Google filter that causes pages that had been ranking highly for a term to be bumped down to position #6. There is also a thread at WebmasterWorld about this phenomenon. The evidence is still reasonably speculative, but a lot of people have seen the effect.

Aaron offers some interesting speculation about why this may be occurring. The most compelling theory is that it is an anchor text problem. Here is what Aaron had to say:

I think this issue is likely tied to a stagnant link profile with a too tightly aligned anchor text profile, with the anchor text being overly-optimized when compared against competing sites.

Whether or not this is occurring now, it makes complete sense. It is well within Google’s (or any other search engine’s) ability to detect an unusually high density of one form of anchor text pointing at a given domain. For example, if your site is called yourdomain.com, and you sell widgets, and the anchor text in 48 of your 65 links says “Widgets on Sale”, this is not natural.

Most of the links to your site should use the name of the domain itself as the anchor text (i.e., in this example, “yourdomain”). A distribution like the one above is a flag that the anchor text of your links is being artificially influenced. How is that done? Why, by purchasing links, or by heavy-duty link swapping.

This is potentially another step in Google’s stepped-up war against the practice of link buying. I have long maintained that the main advantage link buying has over natural links is that people who buy links get to specify the exact (keyword-rich) anchor text used. Looking for unnatural patterns of anchor text provides a backdoor into detecting people who are purchasing links.

It might be a bit heavy-handed for Google to ban a site based on this type of evidence, but reducing the impact of anchor text on rankings when there is an unnatural distribution in play still helps them meet their goal. After all, even if the unnatural anchor text does not represent the result of a link buying campaign, and all those keyword-laden links are in fact completely natural, it might still provide better relevance for Google to filter in this manner.

Thinking about this further, this might be a simple search quality adjustment for skewed anchor text distributions. If it affects paid links, from Google’s perspective, that is just a bonus.
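
Google has never disclosed how such a filter would actually work, but the idea is easy to illustrate. Below is a minimal sketch, in Python, of the kind of check described above: tally the anchor text distribution for a domain and flag it when a single non-brand phrase dominates. The function name, the brand-term matching, and the 35% threshold are all hypothetical, not anything Google has confirmed.

```python
from collections import Counter

def anchor_profile_is_suspicious(anchors, brand_terms, max_share=0.35):
    """Flag a link profile in which one non-brand phrase dominates.

    anchors     -- anchor text strings from links pointing at the domain
    brand_terms -- words treated as "brand" anchors (domain or company name)
    max_share   -- hypothetical threshold for a single phrase's share
    """
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    phrase, hits = counts.most_common(1)[0]  # most frequent anchor phrase
    is_brand = any(term in phrase for term in brand_terms)
    return (not is_brand) and (hits / total) > max_share

# The example from the post: 48 of 65 links say "Widgets on Sale".
anchors = ["Widgets on Sale"] * 48 + ["yourdomain"] * 10 + ["click here"] * 7
print(anchor_profile_is_suspicious(anchors, {"yourdomain"}))  # True (~74% share)
```

Per Aaron’s theory, a production system would presumably compare these shares against the profiles of competing sites rather than against a fixed threshold.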

Comments

  1. Thanks for the helpful insight on the Aaron Wall and WebmasterWorld threads, Eric. It’s Sphunn here: http://sphinn.com/story/21499

  2. Hi Eric,

    While I agree it is well within Google’s grasp, the theory falls short when older filters are examined, such as reciprocal link filters, paid link filters, etc. Those filters have always been run once or twice (at great expense in resources to Google) to make an example, showing that Google “could” do it and scaring webmasters into compliance. One has to wonder why it has always happened that way… The logical answer is that it is simply very resource intensive and not worth doing on a dynamic basis, or even on a somewhat scheduled occurrence like the PR/BL exports.

    One of the main reasons the -30 penalty theory was disproved is that keyword searches are calculated and ranked on the fly; trying to read link profiles dynamically would add expensive load time to the search query. Google’s core function is to serve results to searchers, and the longer that takes, the longer before ads show up for those same searchers to click on. As a coder myself, I cannot see that ever being an “acceptable” loss of time.

    The same holds true for Aaron’s post about a -6 penalty/filter.

  3. William – but isn’t Google already tabulating what is going on with anchor text as part of a ranking signal? I could be wrong, but it does not seem like a major extension to me to evaluate some simple things such as:

    1. What percentage of the links use the site or business name as the anchor text (should be the largest percentage)?

    2. What percentage of the links use a highly competitive keyword (when it is not the company or site name) as the anchor text?

    If these two simple things are out of balance, you would have a pretty simple filter you could implement (see the sketch at the end of this comment).

    Also, I’d think that any such filter would run as a background process rather than at the moment of each search query. With that approach, any filtering action needed when a user enters a specific query reduces to a simple table lookup.
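
    Purely as illustration, a minimal sketch of those two percentage checks, run offline, might look like the following; the names, the term lists, and what counts as “out of balance” are all hypothetical:

    ```python
    def anchor_shares(anchors, brand_terms, competitive_terms):
        """Compute the two simple percentages described above, offline.

        Returns (brand_share, competitive_share); the thresholds for
        "out of balance" would have to be tuned empirically.
        """
        def matches(text, terms):
            return any(t in text.lower() for t in terms)

        total = len(anchors)
        brand = sum(1 for a in anchors if matches(a, brand_terms))
        competitive = sum(1 for a in anchors
                          if matches(a, competitive_terms)
                          and not matches(a, brand_terms))
        return brand / total, competitive / total

    shares = anchor_shares(
        ["yourdomain"] * 10 + ["Widgets on Sale"] * 48 + ["click here"] * 7,
        brand_terms={"yourdomain"},
        competitive_terms={"widgets"},
    )
    # ~15% brand vs. ~74% competitive: out of balance, so the domain gets
    # flagged in a table that the query-time code merely looks up.
    ```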

  4. Right, Eric. I would think they were already factoring and filtering like this a long time ago. Also, if they are doing this, this statement:

    “you sell widgets, and the anchor text in 48 of your 65 links says “Widgets on Sale”, this is not natural.”

    sounds perfectly fine to me. Why would this not be natural? You can control how people link to you. Are we going back to the theory that you can Google bowl someone down the rankings again?

  5. Jaan – It’s been shown for some time that the great majority of people (who have not been influenced or controlled by you) will use your company name, site name, or even “click here” as the anchor text to your site or web page.

    If, instead, they link to you using keyword-rich anchor text that is not part of your site name or company name, it is likely that the link was influenced by you (which is not what Google wants).

  6. This is problematic then for those of us with domains under our own names. Where this filter will REALLY screw people is when they pick up that I sign all my comments, like this one, with “Jack.”

    Google could catch me up in this fiasco if they determine that I am trying to artificially inflate my link popularity for my domain jackhumphey.com. Why wouldn’t I want to be #1 for “jack?”

    That right there will hopefully make Google think thrice about such a move.

    The entire point becomes moot when you write good content. The links referring to your well-received pieces will vary so much in anchor text that you’d never be able to buy or control enough links to set off a filter like this.

    I have 50,000+ links, and precious few of them are anchored the way I’d “like” them to be.

    Guess the domainers have to be worried. But real web sites shouldn’t be much alarmed at this stuff. Unless my “jack” theory proves out. Then I guess Scoble would be in more trouble than me. lol

  7. I have to agree with Jack. A simple tactic would be to:

    1- Get tons of natural links from social sites by writing good pieces of content, picking up, let’s say, 1-5k new links per month

    2- Buy something like 100-500 anchor-text-targeted backlinks

    That way, the bought links would be extremely diluted and wouldn’t trigger anything (a few hundred bought anchors against a few thousand natural links is on the order of 10% of the profile), especially if you change your keywords every time you go out for specific anchor text.

    But hey, they have a few thousand engineers up against the whole Internet world looking at them… You can tip your black, white, blue, or grey hat to them; they are getting more and more control.

  8. Could it be that Google recognizes a pattern among the inbound links coming from the same domain? If a domain has a lot of customized links, the links from that domain are set to zero value! Instead of banning the site that gets the link, they set the value of the links from the domain giving them to zero.
    I think this is what Google has been talking about for a while now: finding sites that sell links and setting the value of those particular links to nothing.

  9. Jack & Guillaume,

    Maybe a better way to think of this is not as a penalty. From a mathematical point of view, an unnatural amount of anchor text skewed toward certain keywords needs to have its value adjusted downward a bit.

    Remember how site-wide links used to have incredible value? You could get 10,000 links from one domain and you were off to the races. Well, you can still get site-wides, but the majority of the value you get from the links from that domain comes from the one link on the home page.

    I could see Google making a very similar adjustment with regard to anchor text; a rough sketch of what that might look like follows at the end of this comment.

    Peter – your suggestion is interesting as well. Google could recognize when you have a bunch of links from one domain all going to the same page on your site, where all these links have different anchor text. That would seem a bit unnatural as well, wouldn’t it?
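
    As a rough sketch (a hypothetical function and cap, not anything Google has published), the adjustment could work by giving each anchor phrase full credit only up to a capped share of the profile:

    ```python
    from collections import Counter

    def dampened_anchor_weights(anchors, cap_share=0.20):
        """Downweight, rather than penalize, over-represented anchor text.

        Each phrase earns full weight for up to cap_share of the profile;
        repetitions beyond that are heavily discounted, much like 10,000
        site-wide links collapsing to roughly the value of the one home
        page link. cap_share is a made-up tuning knob.
        """
        counts = Counter(a.strip().lower() for a in anchors)
        cap = max(1, int(cap_share * sum(counts.values())))
        return {phrase: min(hits, cap) + 0.1 * max(0, hits - cap)
                for phrase, hits in counts.items()}

    weights = dampened_anchor_weights(["Widgets on Sale"] * 48 + ["yourdomain"] * 17)
    # "widgets on sale": 48 raw links count as only 13 + 0.1 * 35 = 16.5
    ```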

  10. David Eaves says:

    If what you have said were true, then sites with a keyword-rich name would have even more of an advantage. It would be illogical to penalize sites in that way. In a non-commercial world, maybe, but the truth of the matter is that in the real world 90% of links are manufactured, not necessarily paid for, but manufactured via directories, articles, link requests, and other white-hat link building techniques, which can involve plenty of keyword-rich anchor text. Try building natural links to an FSA-governed insurance site.

  11. There is a very popular assumption in the SEO world that each page on a website should be optimized for just one search term and that the page should be written around that term. Of course, keyword-rich links to those pages would also tend to be very narrow in scope.

    I have always thought that pages should be theme-related, trying to bring together numerous complementary search phrases and splitting them up only when there are just too many to make things work.

    For example, on a jobs site, one might have various search terms related to employment, careers, recruiting, and, of course, jobs. While it would be smart to feature all four words on every page, grouping the multiple search terms related to each one on its own page makes the most sense. However, having a page optimized for “find jobs”, another optimized for “search jobs”, another optimized for “jobs listing”, etc. really looks overdone, and this would be reflected in the link profile that Google and other search engines review.
