Drupal and Search Engine Optimization

Drupal is known for being a very SEO-friendly content management system (CMS). The way it assembles its pages is crawler-friendly, which makes it a popular choice for people looking to build dynamic websites. However, there are a number of potential SEO problems with Drupal as well, and these need to be dealt with to ensure that you get optimal results. Ironically, the very fact that Drupal is such a dynamic system leads to some of its SEO problems. The content is stored in a database and retrieved ... Read More >

15 Things About How Google Handles Duplicate Content

Duplicate content is one of the most perplexing problems in SEO. In this post I am going to outline 15 things about how Google handles duplicate content, leaning heavily on interviews with Vanessa Fox and Adam Lasnik. If I leave something out, just let me know, and I will add it to this post. Google's standard response is to filter out duplicate pages and show only one page with a given set of content in its search results. I have seen evidence in the SERPs that large media ... Read More >

Vanessa Fox’s Last Google Interview?

Shortly before she left Google, I spoke to Vanessa Fox about what's going on with Google Webmaster Tools, and we also spoke for a while about duplicate content problems. While I was polishing up the interview transcript, Vanessa left Google, so I may have conducted the last real interview she did while at Google. After another week of vacation, she will officially make the leap over to Zillow. Best of luck, Vanessa! We talk at length about key parts of Webmaster Tools, and we also talk ... Read More >

12 Ways Webmasters Create Duplicate Content

At the recent SMX Advanced conference in Seattle, one of the big sessions was on duplicate content. There is great blow-by-blow coverage in posts by Vanessa Fox and by Matt McGhee. You can also see an older post about dupe content here by Chris Boggs. At the start of the session, the search engines all talked about various types of duplicate content. But let's take a deeper look at the ways duplicate content actually happens. Here are 12 ways people unintentionally create dupe content: Build a site for ... Read More >

Interview with Google’s Adam Lasnik

Adam Lasnik and I spoke late last week about paid links, duplicate content, and more. The paid-links conversation was very interesting. One of the things Adam made clear is that Google is not looking to detect 100% of paid links; its focus is much more on links that are being sold for the purpose of passing PageRank. We also talked about how the authenticated spam report form is going to be used. It turns out that this will not be used to decide on immediate penalties for sites that get ... Read More >

Adam Lasnik Clarifies Google Stance on Duplicate Content

Google's Adam Lasnik has offered up a post today on Dealing Deftly with Duplicate Content, in which he presents the official Google stance on the issue. He addresses the following key questions: What is duplicate content? What isn't duplicate content? Why does Google care about duplicate content? What does Google do about it? How can webmasters proactively address duplicate content issues? However, there are issues that are not addressed in Adam's post (FYI - these are issues which it's ... Read More >

Googlebot Detection and Combating Copyright Violations

We live in a world where it is increasingly common for other sites to copy your good content and re-publish it. This raises concerns that you will be flagged for publishing duplicate content, or that the search engines will not correctly recognize your site as the original author of the content. So what can you do about this problem? Matt Cutts just posted one part of the answer on the Google Webmaster Blog: Google has now specified an official way to recognize Googlebot. There are a few ways ... Read More >
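The verification method Google documented is a two-step DNS check: do a reverse DNS lookup on the visiting IP, confirm the hostname falls under googlebot.com (or google.com), then do a forward lookup on that hostname and confirm it resolves back to the same IP. A minimal Python sketch of that check (the function name is mine, not from the post):

```python
import socket

def is_googlebot(ip_address):
    """Verify a claimed Googlebot visit via reverse-then-forward DNS.

    1. Reverse DNS: the IP should resolve to a *.googlebot.com
       or *.google.com hostname.
    2. Forward DNS: that hostname should resolve back to the same IP.
    """
    try:
        hostname = socket.gethostbyaddr(ip_address)[0]
    except (socket.herror, socket.gaierror):
        return False  # no reverse record at all
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False  # reverse record points somewhere else
    try:
        return socket.gethostbyname(hostname) == ip_address
    except socket.gaierror:
        return False  # hostname does not resolve forward
```

A User-Agent string alone proves nothing, since anyone can spoof it; this double lookup is what makes the check trustworthy.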

Keyword Duplication

Mapping site design to target keywords is a fine art. The goal is a site architecture that delivers keyword-rich content, structured both to serve users well and to rank well in search engines. One of the key mistakes people make is creating multiple pages that end up competing for the same keywords. For most sites, this is a costly mistake. When pages compete with each other, there is a strong chance that the search engine will see the pages as duplicate ... Read More >

The Cost of Duplicate Content

Let's talk about the cost of duplicate content. At first blush, it seems like a relatively minor issue. In principle, a search engine wants to include only one copy of a page in its index, so if you have multiple pages with the same content, the search engine picks only one. This means one copy of your content is ignored. So far it does not sound too bad, does it? However, there are other, less obvious consequences to duplicate content. For example, it can't be good that crawlers come to your site ... Read More >

Affiliate Programs and Duplicate Content

Not too long ago I was working on a site that had a pretty active affiliate program. A very strange thing happened: one of the affiliates unintentionally hijacked the search results of the source site. Let me illustrate what I mean with an example. The site used to come up very highly in Google for a particular term; let's call it "discount blue widgets". The page that Google was listing was "http://www.yourdomain.com". One day I went back to Google and looked at the current rankings for "discount ... Read More >
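One common way this kind of hijacking starts is with affiliate tracking parameters creating multiple URLs for the same page, which the engines then treat as duplicates. A defensive sketch under that assumption: record the affiliate server-side, then 301-redirect to a canonical URL with the tracking parameters stripped. The parameter names below are hypothetical examples, not taken from the post; the canonicalization step might look like:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Hypothetical tracking parameters this site chooses to strip.
TRACKING_PARAMS = {"aff", "affid", "ref", "utm_source", "utm_medium", "utm_campaign"}

def canonical_url(url):
    """Return the URL with affiliate/tracking parameters removed,
    preserving any parameters that actually change page content."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), parts.fragment))
```

With every tracking variant redirecting (301) to its canonical form, search engines only ever see one URL per page, so an affiliate link can no longer displace the source site in the rankings.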