Interview with Google’s Adam Lasnik

Adam Lasnik and I spoke about paid links, duplicate content and more late last week. The paid links conversation was very interesting. One of the things that Adam made clear is that Google is not looking to detect 100% of paid links. Their focus is much more on the links that are being sold for the purpose of passing PageRank.

We also talked about how the authenticated spam report form is going to be used. It turns out that this will not be used to decide on immediate penalties for sites that get reported. The information is simply going to be used for input into the search quality team at Google. Great news for webmasters who were worried about getting reported, as no immediate action will be taken.

Not so good news for those who want to level the playing field with competitors that are getting away with murder. Resolving those issues will still require patience.

There is a ton of good discussion in here about duplicate content and other aspects of SEO too. Check out this great interview with Adam.


  1. says

    Hey Eric,

    Welcome back! Hope you had a great vacation. This post couldn’t have come at a better time. I actually have a client that has a site and they just launched a .com site. Duplicate content all over the place.

    One thing that Adam mentioned got me thinking. He said that Google would show the .co.uk site to UK searchers and the .com site to US searchers. Would they do this even though there is duplicate content? What happens if the .com site is hosted in the UK?

    Just thought I would see what your thoughts are on this.

    Scott R

  2. Eric Enge says

    Hi Scott – I am not sure how they trigger it, but I would guess that they trigger on the domain (.com vs. .co.uk) and/or the IP address of the hosting location. These things are easy for them to detect.

    I doubt that differences in language are enough for them to detect, e.g. color vs. colour.

  3. says

    Eric’s right.

    Hosting location and TLD are both key signals for our algorithms.

    We’re exploring additional options to fine tune geolocation, but no ETA at present.

  4. says

    Thanks for the responses. So if they are hosted at the same place (same host/IP), then the .co.uk will always show since that domain has way more trust built in. To have the .com site show up, do you think they need to move the hosting to the USA and also make some changes to the content? I've never dealt with this issue before. It's a great learning experience!


  5. Eric Enge says

    From your comment, I am assuming that both domains are hosted at a UK hoster.

    To be safe, I would move the .com domain to hosting in the USA. I would leave the .co.uk in a hosting arrangement in the UK.

  6. says

    So one more point on this. If the .com is moved to be hosted in the US and the .co.uk is hosted in the UK, and if the .com site starts to acquire several high-quality links from the US, then there won't be any duplicate content issues? Google would just understand that this is the same site with a US version and a UK version?

    Thank you very much for helping with this. This has been a very valuable lesson!

    Scott R

  7. says

    We go with the TLD first. So, for instance, a .co.uk site hosted in the US would be considered a UK site. For non-regional TLDs, such as .com, we go with IP location. So, a .com hosted in the US would be considered a US site and a .com hosted in the UK would be considered a UK site.

    And as Adam pointed out, our algorithms do a pretty good job of serving up the right content to the right location, so you don’t really need to worry about duplicate content when you have a legit situation with content that you’re serving up for separate country sites.

    I did a blog post a while back on this here:
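    The TLD-first rule described in the comment above can be sketched as a small decision function. This is purely an illustration of the stated rule, not Google's actual code; the sample TLD table and function name are my own assumptions.

    ```python
    def infer_country(domain: str, hosting_country: str) -> str:
        """Infer which country a site targets, per the rule described above.

        A regional (country-code) TLD wins outright; for generic TLDs
        such as .com, fall back to the country of the hosting IP.
        The TLD table here is a small illustrative sample.
        """
        regional_tlds = {".co.uk": "UK", ".de": "DE", ".fr": "FR"}
        for tld, country in regional_tlds.items():
            if domain.endswith(tld):
                return country  # regional TLD: TLD decides
        return hosting_country  # generic TLD: hosting location decides

    # Examples matching the comment above:
    print(infer_country("example.co.uk", "US"))  # UK (TLD wins over hosting)
    print(infer_country("example.com", "UK"))    # UK (IP location decides)
    print(infer_country("example.com", "US"))    # US (IP location decides)
    ```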

  8. maxDon says

    Nice interview. Shame that no one pushed the last point of going in and out of the index with all the -30 -950 stuff people are talking about. Would really like to hear some more on these effects.

  9. says


    Thank you (and Adam too) for clearing up the duplicate content issue as it relates to blogs. I usually post a good-sized snippet of each new article to my blog (which is a subdomain of my site) and then link it to the rest of the article.

    That way, subscribers to my RSS feed get a quick overview of the article. They can decide whether to read the rest of it and stay updated without having to check the main site for new content.

    I guess the key for most webmasters is that it really is all about the user experience, and therefore if you're doing a bit of duplication to better serve the customer, then it's a good thing!


  10. says

    Thank you so much Vanessa. I’m actually a long time reader of your blog. Sorry I missed your post on this subject.

    Thank you so much Eric, Adam and Vanessa! Your input cleared a lot up for me and helped me to better understand this issue.

    Thanks again!

    Scott R

  11. Steve says

    Great interview, in particular the clarification about code bloat. Most of us know that it's not useful, but there are times when it's difficult to avoid; I'm talking about old hand-coded sites with many pages.

    I'm with maxDon on the so-called -950 penalty. Most webmasters know when a line has been crossed and usually take great steps to keep everything above board, but the cause of this particular penalty doesn't appear to be something that can be fixed or avoided by keeping everything as white hat as possible, as the possible reasons for it happening are unknown.

    Some clarification from Google at some point on things to avoid would help a lot.

  12. says

    Google's search engine algorithm needs to be more webmaster-conscientious so webmasters can provide a positive experience to end users.

    If Google continues to police webmasters with a hammer grip, Google will lose its relevancy for the search experience as people shift to social media networking tools.

    Algorithm abuse needs to be balanced in a homogeneous manner to continue serving the Internet Global Village productively.
