Obscuring the Algorithm

The Google algorithm is evolving. Gone are the days when you could read a single patent, or a document like the PageRank thesis written by Google’s Larry Page and Sergey Brin, to figure out how it all works.

The PageRank paper described what was, at the time, a revolutionary algorithm, and it launched the dominant search engine of our generation. However, it also launched a generation of spammers. The reason is that people could read the paper, understand how it worked, and then make SEO decisions based on manipulating the algorithm.
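To see why that transparency invited manipulation, here is a minimal sketch of the core idea from the original paper: a page’s importance flows from the importance of the pages linking to it. The graph, damping factor, and iteration count below are purely illustrative, not anything Google published.

```python
# Minimal PageRank sketch (power iteration). A page's score is the sum of the
# scores of the pages linking to it, each divided by how many links that page
# casts. Damping factor and toy graph are illustrative assumptions only.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                share = rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += damping * share
        rank = new_rank
    return rank

# Toy web of four pages; "spam" attracts the most inbound links and,
# predictably, ends up with the highest score -- exactly the behavior
# early link spammers learned to exploit.
toy_web = {
    "home":  ["about", "spam"],
    "about": ["spam"],
    "blog":  ["spam", "home"],
    "spam":  [],
}
print(pagerank(toy_web))
```

Once you understand that inbound links act as votes, manufacturing those votes becomes the obvious shortcut, which is exactly what happened.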

But no more. The major search engines will all continue to publish patents; in fact, they publish lots of them. As a result, too many new possibilities have arisen, and too many new potential ranking signals have been identified.

It becomes difficult to separate the wheat from the chaff. You can read through all the patents, but they are not all going to be implemented. For example, is Google going to use SearchWiki as an input for ranking purposes? If so, how? And how much weight will it get?

One possibility is to use it to validate what link data tells them. Imagine a publisher who manages to spam their way to very high rankings in Google using some combination of methods for acquiring links. But the website experience is not very good, and the products or services are perhaps even worse. So over time, the site accumulates lots of negative votes and few positive ones. Can you imagine a search engineer looking at that data and not lowering the rankings for that site?
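To make that idea concrete, here is a purely hypothetical sketch of how explicit user votes might sanity-check a link-derived score. Nothing here reflects how (or whether) Google actually used SearchWiki data; the function, weights, and numbers are all invented for illustration.

```python
# Hypothetical illustration only: a strong link-based score that user feedback
# contradicts gets dampened. All names and weights are made up.

def adjusted_score(link_score, positive_votes, negative_votes):
    total = positive_votes + negative_votes
    if total == 0:
        return link_score                 # no user signal, trust the links
    approval = positive_votes / total     # fraction of votes that are positive
    # Scale the link-based score by how well users seem to like the result.
    return link_score * (0.5 + 0.5 * approval)

# A spammed-up site with a high link score but mostly negative votes ends up
# scoring below an honest site with a more modest link profile.
print(adjusted_score(link_score=90, positive_votes=5, negative_votes=95))   # ~47.3
print(adjusted_score(link_score=60, positive_votes=80, negative_votes=20))  # 54.0
```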

A Short Aside on Patents

Patents are their own competitive landscape and game. Major companies (such as the search companies) use them as negotiating tools. Many times, businesses file patents based on an idea they never intend to implement. It is the intellectual property equivalent of a land grab. It may be a good idea, or even a great one, but that does not guarantee its implementation by the company that files the patent.

However, perhaps one of their large competitors will build something that makes use of concepts covered in the patent. That would be a big win, because the patent holder can then sue the competitor for patent infringement. Of course, this brings up the next layer of the patent game, which I will illustrate with an example.

Company A sues Company B for violating patent 1243. Company B then delves into its patent portfolio to figure out which patents they can argue that Company A is violating. If Company A and Company B both have large patent portfolios, chances are good that both of them are violating one or more of the other’s patents. As long as there is a balance in the level of violations, the companies work out a cross licensing agreement and move on. Why go through all this trouble? Because it creates a huge barrier to entry for new rivals.

The consequence of this is that all the search engine companies are publishing dozens of patents, loaded with interesting ideas for ranking signals for one or more aspects of their respective algorithms. Yet only some of them will be implemented.

Bottom Line for SEOs

Links are still a huge signal, and they will be for a long time to come. But search engines are going to introduce more and more signals that will help them improve their algorithms over time. One of the most important aspects of this is that these new signals are not known to the public. It is much harder to spam something when you don’t know how it is designed. Making the algorithm secret and unknown again is a strategic objective for Google. Finding a set of signals that offsets the inaccuracies introduced by the practice of buying links to influence search rankings is a must for them.

So while it may sound a bit trite, the search engines are bound and determined to create a world where the winners are the ones that combine the best user experience with the best promotional plan. This is, after all, what is best for users.

Comments

  1. Nice stuff Eric. Last year I wrote a piece called ‘the Magic Bullet’ that was essentially about not reading too much into the world of patents. As a fella that watches the space, I can tell you the sheer mass and variety of filings from Microsoft alone (covering all of their interests) is amazing. Each week… maybe 30 or so? For me, oddly, it’s all just interesting reading.

    That being said, looking into research papers (past and present) in concert with patent stalking can add yet another layer of insight into the way search engineers work and potential avenues for tomorrow. If we then apply this to our testing, the data tends to be more complete. At the very least, one can seek to better understand the mindset…gain some insight.

    I also find it interesting to note the parts on the desire for ‘secrecy’ – even if we knew which elements were actually deployed, we still wouldn’t know the implementation, weighting, or dampening factors. That’s the ‘secret sauce’ that makes up the ‘300 or so’ factors that go into the organic ranking mix.

    If we look at the example of PageRank, not only are there slight variances between the original papers and subsequent patent filings (there was a revision back in early ’08), we also aren’t seeing the tweaks and revisions to the system since inception (not the least of which being the Kaltix purchase and personalized PR back in ’03)… A recent Google video referenced the PR algo being significantly modified since the early days.

    Links are a problem, at least in my mind. But there are also equal problems (both quality- and resource-wise) with many of the other approaches that might take the weight off. Who knows, maybe the new infrastructure can better handle more complex layers that can offset the link addiction…

    Anyway, rambling… good post… thanks!

  2. Yup, the Google algorithm is changing. Gone are the days when all you needed was a lot of links to your website. When Google becomes perfect, the SEO industry will be dead. :(

  3. This article is a perfect example of why I read your ramblings. You push back against the ‘mystic SEOs’ who claim to have the secret flame of ranking glory.

    I like reading patents because it gives me insight into the thought process and conversations they are having at Google.

    And yeah, linking will always be important–it’s the primary way the algorithm learns.
