Matt Cutts and Eric Talk About What Makes a Quality Site

Matt Cutts on what constitutes quality content.

An interview with Google Distinguished Engineer Matt Cutts, by Eric Enge of Stone Temple Consulting:

Eric Enge: I would like to review an example scenario with you. I often use this in my presentations on SEO. The scenario is one where a user searches on “frogs”. The first result looks promising, so they click on it and land on a page of general information about frogs.

However, they don’t see what they want, so they return to the search results and click on the second result, which leads them to another general page about frogs.

The resulting page isn’t a duplicate of the first, but the information provided is the same. So they go back and click on the third result and get yet another non-duplicate page that still does not have what they want. At this point, they’re very frustrated. It turns out they’re looking for information on what frogs eat, and none of these pages provides it.

The reason I use this example is to show clients that being non-duplicate is not enough; they need to do more if they expect to rank in the search results.

Matt Cutts: That’s absolutely right. Those other sites are not bringing additional value. While they’re not duplicates, they bring nothing new to the table. It’s not that there’s anything wrong with what these people have done, but they should not expect this type of content to rank.

We would seek to detect that there is no real differentiation between these results and show only one of them, so we could offer users different types of sites in the other search results.
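
One classic, textbook building block for spotting results with no real differentiation is measuring textual overlap between pages. The sketch below is purely illustrative of that general idea and says nothing about Google’s actual systems, which would need to go well beyond surface text to catch pages that say the same thing in different words.

```python
# Illustrative only: word-shingle Jaccard similarity, a standard
# near-duplicate measure. Not a claim about Google's implementation.

def shingles(text, k=3):
    """Break text into overlapping k-word tuples."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Fraction of shingles shared between two texts (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

page1 = "Frogs are amphibians found on every continent except Antarctica."
page2 = "Frogs are amphibians found on every continent except Antarctica, and they croak."
# A high score marks the pages as candidates to collapse into one result.
print(jaccard(page1, page2))
```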

Eric Enge: Of course, one thing that might make one of these sites a bit different is if it represents Jane’s opinion about frogs.

Matt Cutts: It might make it different, but that may not be enough. Without meaning any offense to Jane, if she is just churning out 500 words on a topic where she doesn’t have any background, experience, or expertise, a searcher might not be as interested in her opinion. In the case of movies, for example, a lot of people care about Roger Ebert’s opinion, so that is an example where a person’s opinion can be of great interest.

Eric Enge: I also see a lot of e-commerce sites and aggregator sites out there. What about them?

Matt Cutts: It’s really the same type of issue. They need to ask themselves: what really is their value add? That does not mean they cannot create something that works, but they need to figure out what it is that makes them special.

Eric Enge: There’s a lot of discussion in certain circles that Google loves brands. Some people even suggest that this is primarily because Google wants to reward its advertisers.

Matt Cutts: First off, I just want to emphasize that whether someone is an advertiser doesn’t help in our web search rankings at all.

Google does try to mirror the real world. We try to reflect the real-world importance of things as we see that reflected in the web. Brands sometimes are an indicator that people see value, but it isn’t the only way that people see value. There are many other possible indicators that something is important and worth surfacing in the search results.

A brand could be potentially useful, but it’s certainly not the only lens to interpret the world. There are lots of signals we use to try to find the results that bring the most value to users. And whether or not someone is an advertiser does not matter at all.

One of the great things about the web is that it still offers up-and-coming businesses opportunities to build their own reputation online. This can enable them to succeed even though other companies may have large advertising budgets.

Eric Enge: Going back to the way we started this chat, Google has compelling reasons for offering diverse results, and understanding this could offer new online businesses a way in the door. In the past, people have referred to this as “Query Deserves Diversity”.

Matt Cutts: Yes, that is a part of what our algorithm does: work to find quality, diverse results that help solve problems for users. I would discourage people from thinking about it from an algorithmic perspective, though. What they should focus on is the overall landscape of their market.

If it is already a crowded space with entrenched players, consider focusing on a niche area initially, instead of going head to head with the existing leaders of the space. This is probably what you would have done if there were no search engines, and it’s often still the best approach. Find something that the entrenched players do not do well, and focus on that. Establish a reputation in that niche, become a leader in it, and then expand from there.

One great example of this is hipmunk.com for travel. They offer great visualizations of what your trip is going to be like. It’s a fantastic UI, and it’s attracting the attention of a lot of people.

Eric Enge: It dawned on me recently that “link building” is an interesting phrase that has misled people. It’s a bit of a cart-before-the-horse thing. It has led people to think about links as something they get from the “dark corners of the web”: places where no one ever goes, so it does not matter what you do there. By thinking of it this way, as link building, you are off on the wrong foot even before you get started.

Matt Cutts: That’s right. It segments you into a mindset, and people get focused on the wrong things. It leads them to think about links as the end goal. It’s important to think about producing something excellent first. If you have an outstanding product, world class content, or something else that sets you apart, then you can step back and start thinking about how to promote it.

Eric Enge: So instead of thinking of it as link building, think about it as PR/marketing.

Matt Cutts: Sure, that is a way to think about it. Whatever you choose to call it, I think the big thing is how you go about doing it.

Eric Enge: We always speak to our clients about focusing on activities that are brand building. Examples of places people have gone to get links that aren’t brand building include:

  1. Article directories
  2. Cheap directories
  3. Link wheels
  4. Blog networks
  5. Any sites that don’t care about the editorial quality of their content

Does that make sense?

Matt Cutts: Yes, it does. By doing things that help build your own reputation, you are focusing on the right types of activity. Those are the signals we want to find and value the most anyway. Just promoting your site on a spammy blog network that no one would ever choose to visit is not a good strategy.

It’s wild to see some blog networks just repackage the same spammy sites and services and have the nerve to claim that their content is “Panda and Penguin compliant” when the quality of the network is clearly not at a level that even a regular person would choose to read.

Eric Enge: Let’s talk a bit about link bait. Many years ago there was a company that created this article on the x things you did not know about death. It was a big hit and they got tons of links. It’s slightly off topic, but not unrelated. How do you think about programs like these?

Matt Cutts: This kind of content can be useful as a promotional tool. Obviously it works much better if the content has a close relationship to your business. But if this is well done, and you use it as a tool to create visibility for your business, with an emphasis on something that people really like, you will typically be OK.

Eric Enge: What about infographics?

[Infographic: Search Engine Complexity]

Matt Cutts: This is a discussion that makes me a bit troubled. I do agree that there are ways infographics can be created that represent an OK form of promotion, but the challenge is that as soon as I say something like that, people are going to use it as justification to do whatever it is they want to do. They will push the limits, and that isn’t OK.

In principle, there’s nothing wrong with the concept of an infographic. What concerns me is the types of things that people are doing with them. They get far off topic, or the fact checking is really poor. The infographic may be neat, but if the information it’s based on is simply wrong, then it’s misleading people.

The other thing that happens is that people don’t always realize what they are linking to when they reprint these infographics. Often the link goes to a completely unrelated site, and one that they don’t mean to endorse. Conceptually, what happens is they really buy into publishing the infographic, and agree to include the link, but they don’t actually care about what it links to. From our perspective this is not what a link is meant to be.

Any infographics you create will do better if they’re closely related to your business, and you need to fully disclose what you are doing. The big key is that the person publishing the infographic has to know about, and agree to, including an endorsement of your site as attribution with the infographic. Even then, there is reason to believe that the link is more about the barter to get the infographic than a real endorsement of your site.

This is similar to what people do with widgets, as you and I have talked about in the past. I would not be surprised if at some point in the future we started to discount these infographic-type links to a degree. The link is often embedded in the infographic in a way that people don’t realize, vs. a true endorsement of your site.

Eric Enge: There’s one thing that I believe a lot of people missed with Penguin (and Panda): they represent entirely new algorithmic capabilities for Google. Using Penguin and article directories as an example, you can imagine that someone manually developed a list of bad article directories.

Then you could algorithmically evaluate the backlinks of those sites, offline from the main algorithm, and that analysis could focus on determining whether or not you see a pattern of concern with the link profile. If you do, then you can adjust the ranking weight of something for that site.

My point is that you can choose to do any of these types of manual analyses offline and use this type of mechanism to enhance the quality of the overall algorithm. I know you can’t comment on my example scenario, but does that make sense?
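
To make the hypothetical concrete, here is a minimal sketch of the kind of offline pass Eric is describing, assuming a hand-curated list of bad directories and a per-site list of referring domains. Every name, threshold, and penalty value here is invented for illustration; this is not Google’s method.

```python
# Hypothetical offline link-profile analysis. All values and names
# are illustrative; nothing here reflects an actual ranking system.

BAD_DIRECTORIES = {"spammy-articles.example", "cheap-links.example"}  # hand-curated list

def link_profile_weight(referring_domains, threshold=0.4, penalty=0.5):
    """Return a ranking-weight multiplier for one site based on what
    share of its backlinks come from known low-quality directories."""
    domains = list(referring_domains)
    if not domains:
        return 1.0  # no link evidence either way
    flagged = sum(1 for d in domains if d in BAD_DIRECTORIES)
    if flagged / len(domains) > threshold:
        return penalty  # pattern of concern: demote the site's weight
    return 1.0  # link profile looks fine; leave weight unchanged

# Offline batch pass; the resulting weights could be fed back into
# the main algorithm as a per-site signal.
backlinks_by_site = {
    "good-frog-site.example": ["university.example", "newspaper.example"],
    "thin-frog-site.example": ["spammy-articles.example", "cheap-links.example", "blog.example"],
}
weights = {site: link_profile_weight(links) for site, links in backlinks_by_site.items()}
print(weights)  # {'good-frog-site.example': 1.0, 'thin-frog-site.example': 0.5}
```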

Matt Cutts: I really can’t comment on it in any specific way, but you are right that these algorithms do represent new types of capabilities for us.

Eric Enge: Let’s switch gears a bit. Let’s talk about a pizza business with stores in 60 cities. When they build their site, they create pages for each city.

Matt Cutts: Where people get into trouble here is that they fill these pages with the exact same content on each page. “Our handcrafted pizza is lovingly made with the same methods we have been using for more than 50 years …”, and they’ll repeat the same information for 6 or 7 paragraphs, and it’s not necessary. That information would be great on a top-level page somewhere on the site, but repeating it on all those pages does not look good. If users see this on multiple pages on the site they aren’t likely to like it either.

Eric Enge: I think what site owners may argue is that if someone comes in from a search engine and lands on the Chicago page, and that is the only page they see on the site, they want to make their best pitch on that page. That user is also unlikely to go visit the site’s Austin pizza page.

Matt Cutts: It is still not a good idea to repeat a ton of content over and over again.

Eric Enge: What should they put on those pages then?

Matt Cutts: In addition to address and contact information, add 2 or 3 sentences about what is unique to that location and they should be fine.

Eric Enge: That won’t be seen as thin content?

Matt Cutts: No, something like that should be fine. In a related situation, a writer approached me recently and asked me a question. He has a series of articles he provides to gyms that own websites. He wanted to know if there was a limit to how many times he could provide the same content to different gyms and still have it be useful from a search perspective for his customers. Would it help, for example, if he kept rewriting it in various ways?

It gets back to your frog site example. The value add disappears. Imagine 4 gyms in the same small city all offering exactly the same advice. Even before you get to what search engines think, users aren’t going to understand what the difference is between these 4 places. As a user, after reading your content, why would I pick one over the other? For search engines, it’s the same challenge.

Find a way to differentiate and stand out, so that people want to try your product or service and see what they think. When they try it, give them something outstanding and earn yourself a customer.

Eric Enge: Let’s talk about content curation. I am seeing these services emerge now, and it seems to me they’re going to push sites to use content curation as a way to get rich text content on their pages.

Matt Cutts: As with all the other things we have discussed today, what is the value add? Does it really make sense that visitors to your site are going to be looking for your opinion on the latest content out there? If the information stream is coming from a third-party service and you aren’t involved in any way, except for providing a place to publish it, this may be something you want to avoid.

We might not view this as spam, but it’s sort of shallow and we would tend to not want to rank this type of content as highly.

Eric Enge: Any last comments you would like to make?

Matt Cutts: The main thing is that people should avoid looking for shortcuts. In competitive market areas there has always been a need to figure out how to differentiate yourself, and nothing has changed today. Think about how you can create compelling content or a compelling experience for users.

I was also heartened by comments on the “Ask the SEOs” panel at SMX Advanced. People are beginning to see that we are able to do a pretty good job at detecting spam. One of the panelists, Greg Boser, commented that we were not just talking the talk, but walking the walk now. The point being made was that our ability to detect poor quality links, or spam links, is greatly improved.

Our capabilities in these areas are only going to continue to improve further over time.

Eric Enge: Thanks Matt!

Matt Cutts: You’re most welcome Eric.

About Matt

Matt Cutts joined Google as a Software Engineer in January 2000. Before Google, he was working on his Ph.D. in computer graphics at the University of North Carolina at Chapel Hill. He has an M.S. from UNC-Chapel Hill, and B.S. degrees in both mathematics and computer science from the University of Kentucky.

Matt wrote SafeSearch, which is Google’s family filter. In addition to his experience at Google, Matt held a top-secret clearance while working for the Department of Defense, and he’s also worked at a game engine company. He claims that Google is the most fun by far.

Matt currently heads up the Webspam team for Google. Matt talks about webmaster-related issues on his blog.

Comments

  1. Although I understand Google is taking action against web spam, I would really love to see MFA sites out of the SERPs. It’s insane the amount of trash I see in Google Portugal just so people can make a quick buck with AdSense…

  2. Was loving this conversation until Matt gave his closing comments. If Google’s ability to “detect poor quality links, or spam links, is greatly improved” then why aren’t they penalizing the crap out of all the terrible websites out there now in the top ten?

    I do a lot in reviews, for example. One niche I follow had 8 legit review sites in the top ten. Now there are 2. The others? A post that is 5 years old about products no longer being sold. Two pages from Consumer Reports, a site you can’t even access unless you pay a subscription. Another has reviews of just 4 brands when some sites have reviews of over 30 brands. And one site in there hasn’t been updated in 4 years and none of the internal links even work.

    This is the quality Google can detect? These resources are useless. And some are littered with free directory links, links from spun articles and blog comments with spammy anchor text. Yet Google punished the sites that had previously been in the top ten because why? They did more of it? They offered less value?

    I think of it like this. Site quality can either be poor, fair or good. Link building can either be poor, fair or good. Google decided that poor sites with fair link building are better for users than good sites with poor link building.

    What do you think the user wants to see more? The good sites or the poor sites? Users don’t see backlinks nor do they care. If Google is really all about the user as they claim and really wants webmasters to be all about users, their only concern should be the quality of content. If you have more and/or better content, you should outrank those sites with little and/or awful content, even if some shady company 5 years ago built some spammy SEO links for you.

    • Eric Enge says:

      Hi Dan – I think the big issue is what is the best way to identify quality sites. It’s a huge challenge because evaluating the content itself is not really possible to do algorithmically, and that is where links came into play.

      That said, I think the comment Matt made on their increased capabilities was very important. You can expect a lot more updates to come from them intended to address poor quality links.

    • @Dan: I fully agree with you on the content quality issue. It seems that Google is just concentrating on links: discounting these links, penalizing those links. A better approach would be to see if the site is offering value or not (through its content).

      Matt mentioned Hipmunk, but the fact is that site went viral, got written about, and got linked to. What was Google’s role in determining its quality and unique UI?

      Google is unable or unwilling to monitor content, because if that were the case then millions of rubbish sites with AdSense would be banned.

      Why should bad links matter? If a site is good and deserves exposure then Google as a search engine should list it higher.

      • Casey Schletz says:

        “Link building” isn’t 100% of the ranking weight… Even if the content is 5 years old, that specific website was set up correctly AND contains content “different” from others, and it generates a lot of traffic. So instead of getting mad at those websites, study them.

        • Casey Schletz says:

          On a further note, you sound like someone who is upset you’re not ranking in the top 10. I urge you to study those websites that are in the top 10 (even if they are poor websites in your opinion) because they’re doing something Google likes. I personally rank higher with all my websites than companies like Home Depot, 15+ year old domains, PR 6 websites, all with a month old website, PR 0 and only 100-200 backlinks. Mainly because I take the time to study the top 10 before I do anything and look at every single detail.

    • Matt Cutts says he has a thick skin, so he doesn’t care. But I do agree: I’m not happier with the search results today than I was before. I use Bing 99% of the time after the series of animal-named updates. Why?

      Bing ranks websites with quality content and nothing more.
      Google ranks websites with quality links and all other crappy signals.

      You decide. It’s why Spiderman uses Bing. End of discussion.

      • I like the Spiderman reference, but he was paid to use it, right? :)

        In my niche the only players now are ones that have a large AdWords account. I know that is totally separate, but it’s hard to ignore. I would compare the quality of these large sites to an eHow: somewhat useful and seemingly made for users, and to some extent they are, but mainly they are made for the engines.

        • I don’t believe AdWords is at all connected; more so that Google is now weighing branded link profiles much higher. Those with large paid search spends are generally also those who have continued online marketing/PR in the traditional sense and thus have a more natural, branded link profile.

          • Matt Bennett says:

            Exactly. The suggestion that companies who spend more on PPC receive higher rankings doesn’t make any sense – if there were a commercial tactic at play here then surely it would be Google reducing the natural search visibility of sites with high PPC budgets so they burn through more of that budget, and quicker!

            But the main point is: why would Google devalue their search results by bringing in factors that offer no qualified evaluation of quality or relevance?

            In reality, big budgets in one channel usually mean big budgets in another, which will result in more brand awareness and more naturally occurring ranking signals. I personally think this is making it much harder for SMEs to break into competitive and lucrative verticals. Which is a bit crap, as I’m all for the big corporations losing out to the little man!

    • I don’t think they are doing a great job with the spam. It is a patchy effort. One thing I really dislike – and it affects us – is that they seem to be penalizing our original content because of something else on the site. Eric, your answer to that might be: then fix it. But that’s what I’ve spent 18 months doing, yet not one single thing I’ve done has made any difference at all. I’ve deleted 1.7 million pages and removed all the duplicates that I could find; as far as I can tell, there isn’t anything else other than the fact that we carry business press releases – and now we’re a lot more selective about what we use. After 18 months of doing that, I’m worn down, and there are dozens of other projects I wanted to do, but there’s a limit to the number of hours in a day.

      What is your take on “something on the site” affecting everything, even what we think is our very best efforts?

    • Dan,

      I believe that Penguin was all about “leveling the playing field”, in the sense that spam links shouldn’t give you a huge advantage in the SERPs. IMO it worked quite well, and they (Google) are only going to get better at this.

      The fact that there are crappy sites in the top 10 is irrelevant: the algorithm doesn’t differentiate between crappy and good sites. They wiped out sites that used spam to rank.

      Give it some time and the top 10 quality is going to improve – some old players are going to rebuild, and there are always new websites/people entering various niche markets every day.

      Just my 2c.

    • “Greatly Improved” does not equal Perfect.

      Not too many people would argue that search results continue to improve (although not as quickly as most would like).

      The statement “Users don’t see backlinks nor do they care” is absurd at best.

      Ex: Go right now to http://www.cnn.com/ and try to not click on a link.

      Did you not see a single link on a page? Did you not care about any of the stories?

      Maybe you’re more of a Sports Fan – try not clicking on any of the stories at http://espn.go.com/

      The fact is that links are VERY IMPORTANT, and not only to build trust with search engines: they give more information that will not fit on just one page, and direct readers to articles that you’ve given your opinion on, just to name a couple of uses.

      How many times has Google said that Quality Content will win out? I’d be interested to hear from Dan if the results have gotten better for him or if they’ve remained the same.

      Google’s algorithm is nowhere close to perfect, but there’s a reason why more people use it than any other Search Engine. Fast forward 5/10 years, will it still be the most used? Only time will tell.

      One last note – let me know if you saw the links above to CNN and ESPN =-)

      • You’ve confused links with backlinks. You said ‘The statement “Users don’t see backlinks nor do they care” is absurd at best.’ Backlinks are the links INTO the page. You are talking about links ON the page. No, I don’t think people really care that much how many links there were to the page they are on, not in and of itself. They just want quality content.

  3. Couldn’t agree more regarding the comments on “link building” being a deceptive term in and of itself. We use that term because that’s what people think they need, but at the end of the day, “content marketing,” “branding,” and “online PR” are much more appropriate for meaningful link development and beyond.

  4. Great info by Matt Cutts here!

    Google is doing a wonderful job eliminating bot farms and spammers step by step. Let’s hope that eventually this will lead to an internet bursting with quality for the average user.

    A good idea would be to totally change the way websites rank… backlinks and PR will always “force” some people to follow the dark side…

  5. That was really great, but I still did not get the content curation thing… is it good, is it bad? Still not sure. What I believe is that the information is still not reaching the masses, so if someone is curating content to share, I think it is still worthy…

    • Eric Enge says:

      Hi Ketan – I believe the main point there is that including a content curation feed from a third party on your web site might not bring a lot of value, particularly if they are making that same feed available to others.

  6. Nick says:

    Couldn’t agree more with Dan; the returns I am getting for searches are really lousy. If all we wanted was the type of sites that are showing up, we could use Bing. The only reason people liked Google was because you got exactly what you were searching for, instead of some site that’s been on the web for years without up-to-date info.

    • And many people do use Bing (and still Yahoo, although I believe it just got passed).

      If enough people believe they are getting more relevant results on Bing (or any other SE for that matter) than on G, then it’s just a matter of time before G falls by the wayside like AltaVista, AOL, etc…

  7. Fantastic interview Eric, a very important resource, thank you! The part that surprised me the most was about the infographics, as I am just working on one myself, but I guess I should have seen it coming :/ But his replies do not suggest you should stop doing infographics; you just have to do them right, with an emphasis on relevant niches and without deceiving those who link to you – they should know exactly who they’re linking to.

    • Eric Enge says:

      Hi Orit – yes, the focus should be on doing them the right way. In an ideal world, if Google were to take some action in this direction AND they got it right, those should continue to have value.

  8. Great article on spam and how to maintain the quality of a website. What I took from the whole interview is that Google is looking for user experience and user satisfaction. As long as the content adds value to users it will be recognized by Google; however, playing with mechanical algorithms just to gain mileage is not going to work any more.

    However, it would be a shame if Google starts discounting links on infographics. Thanks to Eric for offering this comprehensive and appealing content.

  9. I do appreciate that spam is a big problem, but a pizza chain is a pizza chain. What you are asking is for people on each of those sites to write something unique about the city, but all that is doing is encouraging content without thought. It’s pizza they’re selling; they are not tour guides, and people are not interested in knowing the ins and outs of a town or place they already live in. They want pizza.

    And really, does that pizza chain need 60 websites all saying the same thing apart from the address, which could easily be handled by a local address search on the one site? My point is that it’s still a relevant search, which is only going to get replaced with an unwarranted search term that’s not relevant.

    • Eric Enge says:

      Hi Paul – Matt still wants the content to be relevant, but would like people to minimize the repetition of the same content over and over. Short and sweet seems to be OK here. Now the question is, what is OK to do then? Perhaps some info on finding the location itself? I don’t have the answer clear in my head yet, but I am working on it!

      • I’m totally with Paul on this. In the real world, you don’t re-write your sales copy when you pitch to a customer in a different location. Why should we have to do that on the web? I want my top-level copy on each location-based page – I know this is my best sales pitch, and I want to use it regardless of where my customers are based. What we need is markup allowing us to declare that this is a location-targeted page offering the same product or service.

        • Furthermore, why don’t AdWords advertisers have to write different copy on each of their landing pages, which often number in the thousands? Let’s see how Google will make them do that! Oh wait!!! They’re actually filling Google’s pockets, so they would never have to abide by such silly and irrational rules.

          All the people who agree with Matt on interview posts like this one are usually simply trying to promote their shitty website/company through the comment link. Seriously… don’t you people know prostitution is illegal in most states? Licking Google’s #@$% for links won’t get you anywhere.

          • Why not just iframe “boilerplate” pages – say, a unique location paragraph on each city page, followed by an iframed “pitch” panel?

            Surely that gives the user what they want AND gives a multi-backlink boost to the iframed page for generic visitors.

      • The problem is that many of these types of stores are franchises and have to maintain a set description; thus, it becomes duplicate content. Another problem is that if a company has 15 stores in different locations with differing menus, they need to list many menu items 15 times, once for each store.

        Google seems to want them to list it only once, which causes a huge issue as not all stores have the same menu.

        Conforming to Google’s requirements is causing a decreased experience for the customer. Either all menu items are listed, only for you to find out that your store doesn’t have them, or they link to a PDF which is the menu for that store.

        It is a royal pain, as we are being forced to make things work worse just because Google wants it that way. Some businesses need duplicate content, and Google needs to realise that.

        I have used Google as my search engine all my life, but I am now moving away, as I am sick of junk showing at the top of search results. We seem to have taken a step back, not forward.

        For example, the other day I was searching for the first power plant that early man had. So I searched for “first powerplant” “early man”, but it did not just search for “early man”; it also searched for “first person”, which is totally different and could not be removed from the search results without using advanced search to instruct it not to include it. Why on earth is it using search terms that I did not enter? Assuming they are the same really ruins the results and makes it almost impossible to locate the info you need. No wonder the results are so stuffed with junk at the moment. These updates are supposed to remove junk, but all they seem to have done is inject junk.

    • matt cutts lover says:

      Because this type of repetitious pattern is used by large-scale content generators (what Google calls “spam”), and Google does not have engineers smart enough to differentiate between a legitimate local pizza chain and a thin website with 1,000,000 automated pages.

      I do think it’s funny how much insanely eclectic SEO knowledge like this a small business owner needs now just to have a presence in Google. I really hope Google updates their old SEO guide, which could be 100 pages long by now ;)

  10. Having a USP is the most important thing?! Let’s look at all the clothing sites out there that are all the same. What makes one better than the other? We usually don’t care which one we buy from; the most important thing is price. Unfortunately, Google usually ranks the one which has the most of that brand or has the most links pointing to it.

    • matt cutts lover says:

      Which is exactly why Google wants to roll up the web, creating a space run by monopolistic competition, and no duplicate content. Considering user “Reviews” or a cute interface to be the uniqueness is really shallow IMO.

  11. After reading a lot of articles about this and reading Matt Cutts’ comments on the subject, I am torn between agreeing with him and being very annoyed with him.

    On one side, quality content is very important indeed – on the other hand, there are only so many ways you can describe a pizza place, and Google can’t judge which of them are describing themselves the right way. Especially not when they are worldwide and various cultures attach various meanings to specific formulations… I don’t know… I think I might be frustrated!

    Anyone have comments or points that could make my day less frustrating?

    • David says:

      Not from me, I’m afraid. I long for the days when we built websites for users, not for Google, nor for the concept of what users want that Google people seem to think they know so well.

      Unique content on a store locator – really? I tell ya, trying to please Google has become job creation. I’d rather print 10,000 flyers and stuff them in mailboxes. Yeah, content duplicated 9,999 times, **** on that, G-man!

  12. “people are not interested in knowing the ins and out of a town or place they already live in they want pizza”

    Completely agreed, Paul. Google should be more concerned about what the site has to offer. Just like the pizza example: how would the ingredients of a pizza in different cities make for a good user experience? Or, to remove thin content and make every single page more unique, how would telling something about the city be useful to the user, who came looking for a pizza, not sightseeing?!

  13. Simon D says:

    Google created the web spam monster by allowing spammers to earn money with MFA pages of junk content. Now there are billions of web spam pages, and Google is sinking in the mire they created.

    So when Matt Cutts talks about web spam, laugh loudly. HA ha HA ha Ha… shhh, don’t tell anyone.

    • Martin Cleverley says:

      You obviously have the same opinion and sense of humor as me. You are 100% right: they created it, let them deal with it! I now think of nothing but the user experience when creating a site. **** Google!

    • Wow. I’d arrived here by providence (trying to understand why my personal niche page was penalized), and after examining the interview, re-reading, pondering (and performing umpteen random searches for my site, which has since been buried by wikis, spam, random irrelevant mentions of my niche keyword, etc.), I re-read a fourth time and proceeded to these comments. I actually will side with the minority here.*

      Google just does not taste as good as I remember. It’s no longer “fresh”, and that is due primarily to the bvllsht these Google webpolice project upon what was once a fairly straightforward user experience. Now conform to Google? No thanks, as these same webpolice have “algorithmed” their engine to some amazing downhill location that I am just not interested in figuring out anymore.

      So G will continue to penalize; why? Given there is legitimacy to the claim that G actually contributed to the spammer issue and AdWords mire, the G antispam police now squander either time or valuable user equity as G attempts to rectify G’s own creation. Who looked out for me? Why am I getting the feeling I have fallen down an SEO rabbit hole (one that shows the true extent to which G has fouled its original mission statement)? All I wanted to do was grow my niche, but I’m rethinking, given these comments and my own effort to appease not my potential demographic, but Google.

      *I’d stated that I side with the minority only because those of us who are “feeling torn” or somewhat disenfranchised by G’s changes are becoming the silent majority. If we feel “torn”, we need only trust our gut feeling that these “google analitics (sic)” simply penalize rather than merely providing the no-frills platform that was the Google niche in the first place. Lest these boys at Google forget: we CHOSE to use the Google engine years ago, and we can CHOOSE to find another now.

  14. Martin says:

    MC has been preaching the same old crap for years, but websites rise and fall at random now. Instead of seeing him repeat this “natural” content crap for the hundredth time, I’d like to see him talk about the broken search engine. SERPs are terrible from an actual user perspective.

    Let’s talk about the random penalties. Let’s talk about some of the most respected websites on the web being slapped down. Let’s talk about the obvious spammy content that ranks high. Let’s talk about negative SEO. Let’s talk about people that run local businesses and don’t have time to study SEO theory, link building, run twitter, facebook, G+, author profiles and all that crap that is now “SEO.”

    So sick of hearing his same old crap when it’s clear that owning a natural website is no longer enough. Stop feeding us the same old rehashed crap and tell us when you’re going to fix your damned search engine.

    • David says:

      +1 (That’s a positive vote, not a G+, by the way)

    • Enticing Designs Publishing says:

      I totally agree. I also believe Google Plus One is the stupidest thing I have ever seen, and the only reason people are using it is to get their websites to rank. I started using Bing months ago.

    • Hell Yeah says:

      Couldn’t agree more. Tired of reading the same old bland PR from google. They’re describing the world as they want it to be, not how it really is.

  15. The pizza chain discussion is just not that simple. We provide a service to music and dance studios that includes a website, and the fact is that chains, licensees and franchises all have similar USPs; their differentiation from each other is simply the territory they serve.
    Their differentiation from competitors is another matter, but it is the same in every location.
    For Google to penalize hundreds of franchises because they have mostly the same content is simply a deficiency in its algorithms.
    I hope Matt and his colleagues realize this (and fix it) even while downplaying it in interviews such as this one.

    • Eric Enge says:

      Hi Gerry – I agree with the discussion taking place that this is a hell of a problem. It’s not clear to me how the site publisher can create the right content, or how Google can decide how to value one page more than another.

      • OK, I’m no genius, but surely something similar to a canonical tag could identify a franchise chain’s network; this way Google knows that it’s not a competitor stealing content, or the owner of the network of websites just being lazy.

        For example, if Domino’s were to create a landing page for each city they’re located in, I can tell you right now they wouldn’t be writing unique, new content for each city or town. Companies have a brand, and with a brand comes a message – they won’t change that message for each store.

        Am I right or wrong? I think I’m right!

  16. The problem with Google is that they tell you that you can do anything with a great product, great content, and no marketing budget, but the reality is that without going into grey areas you probably won’t succeed, because by the time you start to establish yourself there are other players coming up with much more marketing money, taking your idea and pushing you out of your niche.

  17. Anonymous says:

    I agree with those especially frustrated by the local pages answer. As people mention – can you really differentiate 60 pages about a pizza location? Even if you can, does this not contradict the things that Matt Cutts says about quality content?

    I’ve worked with a company that thinks it is a good idea to produce local pages for every city/town in the US because they sell nationwide. Let me just say here – I completely disagreed with this idea… but the CEO insisted! We do not have locations in every city (only 4 or 5). However, they still consistently rank in the top 10 for these localized searches. We literally copied and pasted the content into each city page and just changed the city name and the title tag/description to reflect it. They also added keywords to the page and a ton of city names and states to the bottom. It was painful to watch.

    NO VALUE TO THE CUSTOMER. But our competitors were doing it, and they’re still doing it, and it’s producing a lot of traffic… but it’s spam! Try searching “stair lifts los angeles” or “stair lifts las vegas”, for example. You’ll find AmeriGlide there. They plan to do this for their whole network of mobility websites that sell all the same stuff (about 25 websites).

    • Just checked the content and it is “unique”, i.e. they/you haven’t just duplicated the paragraph and changed the city. The Chicago page has something about the “Windy City”, which a find-and-replace wouldn’t have done. I’m not saying the content is good or that there is value to the searcher here, but the text appears to be unique enough for the pages that I compared.

      Sample size: Two.

  18. OK. To end the pizza discussion:

    The Chicago restaurant gets the text:
    Our restaurant here in Chicago is furnished in a tasteful vintage style that tells the story of this the first restaurant we opened. Bring your whole family, and sit down in one of our private booths, or let your kids run free in the play area. When you’re here, don’t miss our special anchovy pizza!

    The New York Restaurant gets the text:
    Welcome to our latest restaurant, situated just next to Central Park. Have a nice meal and watch people run by on their way from a hectic day at work. Here you can really feel the pulse of the city, and we have done our best to make the restaurant reflect the greatest city on earth.

    Now, how hard was that? ;)

    • Eric Enge says:

      Hi Aaron – this is definitely headed in the right direction in some respects. You probably want some more “pizza” context in the New York example – no, not suggesting keyword spamming here by any means. It just seems a little “forced”.

      • Of course it’s forced. It’s just an example. The point is that it’s not that hard to write up a couple of short texts that are unique for different pizza places, or whatever it is that you have a chain of. :)

        • wheel says:

          And now you’re forcing people to dance to Google’s tune for no good reason. McDonald’s location A is exactly identical to McDonald’s location B – and when I’m searching for restaurants in location B, a page for McDonald’s location B with content identical to page A except for the address is a perfectly valid search result and what the visitor wants. Unfortunately, Google’s decided that the cart has to come first, so you’re forced to provide “unique” content when in fact unique content is NOT what the website should be providing. Now people are simply justifying “unique” content for Google’s purposes, not the user’s.

          Quite simply, Cutts’ argument that they should have unique content, and that the shared content should live on a main page, is wrong. It should NOT be on an upper-level page. Information on Big Macs is entirely relevant to every single location of McDonald’s, and someone visiting page A should be able to get that information on that page. They aren’t going to look at page B, which is for an entirely different searcher, and McDonald’s shouldn’t be forced by Google to withhold this information from the visitor looking at page A. Again, doing so is simply to comply with Google’s algorithm, not in the visitor’s best interests. And it’ll work that way until Google decides that in fact something else is in the user’s best interest.

          The opposite is also true. Some businesses deal locally and others do the very same thing nationally. If Sears installs furnaces nationally, then why shouldn’t Sears’ main site rank for “furnace installers in Podunk”? No, Google has decided that if you operate locally but are national, you should be in second place behind people with ONLY a local presence. What kind of sense does that make? So the only solution for the national business, to work around Google’s shortcomings, is to start bullcrapping pages with local content.

          • What if it’s a takeaway???

            On the subject of frogs: if I wanted to know specifically what they eat, I would search “what do frogs eat” on BING.

      • How about some content around what the most popular pizza is in that area? Or how long delivery takes to that area? Short, pizza-related content that adds value :)

        • Does that add value, though? Does anyone give a sh*t what the favourite pizza in that area is? Maybe, maybe not, but I don’t think it DETRACTS from the page that that info isn’t there. How long it takes to deliver the pizza is something that does add value, though, I’d say.

          But the point is that Google can’t tell whether this or that piece of info adds VALUE to the page, unless someone decides to externally link to that page – which obviously won’t happen. So they just rely on the fact that Page X and Page Y are DIFFERENT, because that’s the closest they can get to determining value without the help of links.

          In this particular case social signals aren’t going to help either, because I’m not going to “Like” this page – what would my friends think (if I had any) if I started liking the opening hours of a Domino’s or a McDonald’s?

          “Wheel” above you is correct in that if you’re adding text for the sake of it then you’re just playing the game to appease Google.

    • The city descriptions have FA to do with the pizza, but the ingredients, the preparation, the pricing – everything else, in fact – will by necessity be the same in a franchise scenario. So the actual things that are completely relevant to pizzas are the very things that will get you penalised, unless you’re lucky enough to have a central franchise site that you can refer to. I’m sure there are many cases where this just isn’t the reality.

      Additionally, Google is absolutely rubbish at determining who published something first and therefore who should have authority on the content that is apparently “lacking value”. My own content has been scraped and republished dozens of times and Google gets it wrong time and again. I have stopped checking, it just depresses me.

    • It’s a fair point Aaron makes. However, try doing that when you have 4 stores somewhere like Hull in England. And then when you’ve got 300 stores across the UK, with multiple stores in many cities. It’s easy to write a bit of unique text for Chicago and New York. But try doing that for Hull restaurant 1, Hull restaurant 2, Hull restaurant 3 and Hull restaurant 4. You wouldn’t even consider doing this in the real world. Google needs to fix the algorithm or allow us to declare that this content is duplicated copy targeted at customers in a different location.

  19. This article certainly clarified a lot of confusion and misconceptions.

    However, one thing that still remains unanswered is how to do SEO for a forum. As the content in a forum is user-generated, a lot of duplication occurs and there is thin content… so how do we do SEO for a forum?

    • Eric Enge says:

      Hi Karan – I think a forum that has a high volume of participation will sort itself out. Users will create a ton of context and different content. If the volume is low, that is a much harder problem.

  20. So it’s not just unique, it also needs to provide value? How, if I may ask, can an algorithm judge the value of a page? Maybe that’s why those MFAs are still ranking.

  21. If Google thinks they have improved search results, they could not be more mistaken. Searching on Google for anything in particular these days is a waste of time.

    Over the last month or so I find myself giving up in exasperation on Google and heading over to Bing to find what I am looking for. For example, there will be a whole page of results from the same website. Seriously – Mr Cutts is talking about bringing something different? A simple choice in the search results would be valuable to the searcher, for starters.

  22. I for one am glad to hear Matt’s stance on Infographics. I have always thought that these were overblown for the most part and only really loved by the upper echelon of SEO people who have full time graphic artists on staff or are otherwise willing to outsource a project like that at a substantial cost.

    It seemed like everyone was so giddy over those graphics when, personally, I found them harder to interpret than just simply writing the damn stuff down using simple sentences.

    As far as content curation goes: If it is well done where the goal of the site is to educate the visitor or add tremendous value in some other way, then I think those sites will always remain in favor. People love to find one stop sources of all the information they need on a topic. However, if your idea of content curation is to just copy other people’s work while adding very little to it, then I would think that model is destined for future failure.

  23. Walking the walk? I can’t believe he said that. Search results are lower quality than they were 6 to 12 months ago. Maybe I’m using a different Google? What URL is MC using? He talks with pride about their algorithm accomplishments, but the results are dismal. Google is broken. The king has no clothes.

    • wheel says:

      The search results may be of lower quality (or be perceived to be of lower quality) for a very simple reason that is being covered up by this smoke screen of an interview.

      Ranking in Google is still almost entirely due to links.
      Penalization in Google is due almost entirely due to links.
      Great, expert-level content will not get you ranked.
      Mediocre content with great backlinks will easily rank.
      Great, expert level content means absolutely nothing if you are penalized due to links.

      Google will happily take the most authoritative site on a subject and penalize it right out of the serps for no other reason than they don’t like the anchor text – and let all manner of lower quality sites rank. Let me put it another way – Google will specifically and deliberately serve lower quality content sites to their visitors for no other reason than they don’t like something about the backlinks of a site with better quality content.

      In short, it’s still all about links, and if you’re focusing on content like that’s going to matter, you’re buying into the Google kool-aid. It’s bullshit that’s spewed out to webmasters. The algorithm still ranks primarily on backlinks – I know it, Google knows it, and you should know it too. All the talk of authoritative works on frogs is about as funny as George Carlin. Google has no idea what an authoritative article on frogs is. And of course, go google George Carlin and what ranks? Wikipedia, a thin-content page, and a cut-and-paste of some of his quotes. George Carlin’s site is way down the list. How’s that for authoritative content?

      • I agree: by my own standards, Wikipedia can in no way be an authority. There is no originality on that site; all content is copied, derived, and based on other sources. Suppose I’m an expert on something and write technical books and expert articles published on my “official unpopular website”.

        Here it comes: a wiki editor rewrites the content, puts in the citations and sources pointing to my site, and then Matt Cutts and his team rank it higher than my site??

        How’s that for authoritative content? That is the correct question for Matt Cutts. Is it because I only have one link pointing to my site, and that Wikipedia link is “nofollow”? That is just ridiculous.

        Google is broken. Guys, time to move to Bing for real and authentic search results.

  24. G has gone to crap. I’m sick of seeing this guy go on about value. Sorry, I run a small business; I don’t have the time/money to pump out golden value nuggets all the time. I have relevant, great information, but G is just throwing things at the wall to see what sticks. There are a lot of quality, non-spammy sites getting wiped out, and this guy is trying to make the searches sound improved. It’s a full-time job just to please the monopoly G monster.

  25. When is the Google webspam team going to deal with the fact that a lot of the spam on the internet, such as forum links and blog links, comes from spam teams using Gmail addresses? I run a number of vBulletin forums and I can tell you, most of our spammers use Gmail addresses to register accounts.

    At the same time… what is the value added of all those slightly differently worded pizza place pages? I agree with the value-added part, but all Matt is doing is telling us that individual location pages likely fly in the face of duplicate content rules, that Google hasn’t figured out how to handle this, and that they are tossing it back in our lap to deal with.

  26. rico_suarez says:

    In the case of frogs, Google has hedged smartly – you will get a Wikipedia result on the first page which will cover most questions you have about frogs: what they eat, where they live, what colors they are, how long they live, etc.

  27. Matt states that 2 or 3 sentences of unique content are enough to rank, but that you shouldn’t repeat a ton of content over and over. Does the unique content counteract any other repetition further down the page?

    We offer a range of extreme/adventure sport activities in different regions, each regional page has 200-300 words of unique and compelling content at the top of the page, however the offering (or products) for each can often be repetitive, and while not usually 100% identical one region to the next, there can be some similarity.

    Will Google recognise that we’re not just duplicating content across the site, and that we’re creating relevant regional landing pages which all have significant search volume to justify building them? Or will it ignore the top 30-50% (depending on the number of products) of unique opening copy, and just see the bottom half of the page as duplicated?

    For now these pages rank quite well, although I’m concerned that Google will one day class them as duplicate.

    • A year ago, Mike, I would have been right with you in worrying about dupe content, highly original writing – completely new, new images, blah blah blah. We pounded keys and researched like nobody’s business. Of the 25 top PR CEOs in the world, I sought out and interviewed 20 – anything and everything you can imagine to step up the game.

      Fast forward to now. A de-indexing via Panda, re-indexing through praying to God, lost aggregate traffic, several thousand hours of work, several thousand dollars in author pay, and I think we are looking at a Google search that had a heart transplant when all that was needed was some cough medicine.

      Sorry to horn in, Eric (guys), but we addressed all these "SEO" issues, rolled like good boys and girls, and Everything PR News gets more organic traffic from freaking BING! And we are in Google News!

      Another taboo for me is putting links on contemporaries' sites, but in case you want to walk into the light:

      http://productforums.google.com/forum/#!category-topic/webmasters/crawling-indexing--ranking/fwrplSaw97A

      If anyone out there has been a bigger proponent of Google defeating SPAM than I – Matt Cutts even – show me him or her! I feel very sorry, very sorry indeed, for the throng of good-guy citizen journalists hung out to dry here.

      Do I sound angry? Well, I am not just angry for me and mine. I always played fair. It’s getting close to time for “watch this” in my book.

      The sad thing here is that the very people who respect what Google has accomplished – advocates, even – get hammered into nothingness. But then, as long as somebody can spin a yarn or tweet a tweet….

      Always,

      Phil Butler
      Editor of
      A bunch of stuff

  28. Hi, dear Matt Cutts. As per Google, paid links are to be penalised – but then why is Google encouraging paid links and giving them rank and top positions? What does it mean? The manager where I'm working says that, without any deliberate changes to the site's on-page elements or content, he'll get the top position with paid links.

  29. “In addition to address and contact information, 2 or 3 sentences about what is unique to that location and they should be fine.”

    It’s pretty hard to do this without using thin content if your business is exactly the same in each location. Surely there needs to be some kind of allowance for location based pages containing duplicate content, (perhaps through markup?) in future? It’s unrealistic if you have a pizza shop in 200 cities that there is going to be anything unique – at least that’s worth saying – about each store. They all sell the same pizza, at the same price; the whole idea of the chain is that it’s uniform and standardised, that nothing is different except perhaps for the names of the staff!

    It’s a great point that you want your top level copy to be presented to users regardless of their location. But at the same time, you want to send them to a page dedicated to their particular location. It’s absolutely reasonable and in some cases essential to have duplicate content here. I’m not expecting each of these pages to rank for “pizza”, but I absolutely think that duplication of top level copy should not stop a page from ranking for “pizza [location]“.

  30. (…) we are able to do a pretty good job at detecting spam.

    Yeah, right – they’re nowhere in making spam not profitable.

  31. I’d murder a pizza right now… Anyway I’m about to embark on another project that seemingly requires localised town targeting.

    I am fairly new to the SEO game, but what amazes me is the huge contradiction between creating "a great user experience" and then creating X number of pages with mostly the same content, except for a bit of guff about the local area to get the ranking.

    The number of spammy, horrible-looking sites in my area is ridiculous; it is so irritating.

    Every quality-content, user-centric bone in my body doesn't want to create location-specific pages, but I feel I am forced into it. Would you still create location-specific pages for new sites?

  32. mateo says:

    I find the questions about the "Austin pizza page" interesting. I guess the same problem applies to press releases (duplicated content).

    Assume you hire a PR firm, which creates an interesting story – an amazing and valuable story which took 8 hours to write. Based on an hourly rate of 100 USD, you come to almost 800 USD for something valuable. Now, the point is: what do you do next?

    Solution 1) You send it to 10 newspapers and magazines; 5 of them will publish it because it's valuable.
    Solution 2) You send it to one, and only that one will publish it on their website.

    Which solution would Google prefer, and which solution would make more sense from a brand-value perspective? What do you guys think?

  33. Seems to me, a simple, future-proof SEO strategy is to publish content that serves and delights a real audience! Thanks for sharing the interview Eric.

    • Yeah, great. If that real audience doesn't link to you, it's not a future-proof SEO strategy. It's been said many times: quality content does not make you rank in Google; links do. But… here is the sad thing:

      If you care to market your content by getting links? Hello, Penguin.
      If you care to market your content with an infographic? Hello, _____: an animal update not yet named, because Matt Cutts says they will devalue this soon?
      If you care to market and push your content forward? You are self-promoting it. These types of links are not rewarded by Google because they imply you are pushing it. Not a true reward of great content? Not truly editorial in nature?

      There is no future-proof SEO strategy based on this analysis, or even a bright future on the web for small business owners, if Google continues this assault. Put your money in stocks, gold and real estate instead. They will surely grow in value – not the value Google wants on the web.

    • Publishing content is important, but content with no audience is like putting a message in a bottle and then throwing that bottle in the ocean. Nobody is ever going to see it without significant promotion (backlinks).

      To me, the most intriguing question is, what exactly does Google consider to be low-quality backlinks?

      • AlexandEf says:

        If they are using simple recursion, then it’s not good.

        In my opinion, for the purpose of valuing backlinks, the sites containing those links should be rated only on the originality of their content, and their further outgoing links should not be taken into account at all. Unfortunately, we'll likely never know what the actual algorithm is. (A toy illustration of the "simple recursion" case follows below.)
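
        For anyone wondering what "simple recursion" over links looks like, the classic published example is PageRank-style iteration, roughly as in this toy sketch (whatever Google actually runs today is, as I said, unknown):

```python
# Toy sketch of "simple recursion" for link value: PageRank-style power
# iteration over a tiny invented link graph. Illustrative only -- not
# Google's actual algorithm.
links = {
    "a.example": ["b.example", "c.example"],
    "b.example": ["c.example"],
    "c.example": ["a.example"],
}
damping = 0.85
rank = {site: 1.0 / len(links) for site in links}

for _ in range(50):  # iterate until the scores settle
    new_rank = {site: (1 - damping) / len(links) for site in links}
    for site, outgoing in links.items():
        share = rank[site] / len(outgoing)  # a page splits its score across its links
        for target in outgoing:
            new_rank[target] += damping * share
    rank = new_rank

print(rank)  # a link from a high-scoring page is worth more
```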

  34. Hi Eric, great interview. I love that you challenged Matt several times, especially in regard to the pizza company with multiple locations.

    I don’t feel Matt had a suitable answer for this, which means they don’t cater for what I see as an exception.

    If a user were to land on the Chicago page, they're highly unlikely to then visit the Texas page too – why would they? The information that Matt states should only be on a top-level page would very likely be useful to visitors to the individual state pages.

    What makes their pizzas unique and awesome will be the same whether in their Chicago store or their Texas store. But unfortunately, under Matt's reign as head of the Google spam team, this kind of information cannot be given to the customer without fear of penalisation.

    It’s utter nonsense. I fully appreciate what Matt goal is, but the way he’s going about it is all wrong.

  35. I find the pizza scenario hard to believe.

    Say I have a 65-page site for my 60 nationwide pizza restaurants. 5 pages have loads of great content, including menus etc., and then 60 pages each have contact and location info and only 2 or 3 sentences of text. I have a very hard time believing that these 60 pages won't be considered thin content.

    I agree that they should NOT be considered thin – anything more would probably just be created for the search engines, not the end user – but I have a hard time believing that Google would view this the same way and that the site would be safe.

    • I completely agree; they shouldn't be considered thin. But these pages are often landing pages. If I search for Pizza Chicago, then the most applicable landing page is most likely going to be the company's Chicago page, which I believe should have a mix of unique and duplicate content.

      The address, phone number, etc. may be unique, while a description of what makes their pizzas so awesome would be duplicate text.

      It’s not in the users interest, to leave out vital information. Yet Google are pretty much forcing companies to do this, which goes against their apparent goal.

      Stolen content should be the focus, because chunks of duplicate content are often useful to the visitor.

      To abide by Google's guidelines, I've had to remove functionality, instructions and God knows what else, all of which led to massive numbers of complaints from my users. It really annoys me.

      Many sites face scenarios that Google just hasn't taken into consideration. It may be that doing so is just too complex, and as we're all just pawns, so what if our websites are unfairly destroyed in the process?

  36. Individual locations of a chain of [pizza restaurants|lube shops|any type of service with local presence] should rank based on the IP geolocation information G collects anyway, as well as the town's name, ZIP and perhaps state entered together with the search query. In the absence of a name/ZIP in the query, IP geolocation alone should be enough to get the searcher reasonably useful results. (A toy sketch of this idea appears at the end of this comment.)

    I don’t understand the entire premise of value content in the particular example. Often times all the content that’s needed is their address of their phone number and not at all pizza recipes. That info will by necessity be unique to the local page – so what’s the big deal about adding 2-3 paragraphs of customised and unique content? Why adding anything at all, if all you cared about was user experience? Most reasonable people would understand the word “chain” in a literal sense, expecting a restaurant in a location A to be an almost exact replica of location B and that goes for their sites as well.

    Actually, I think that this example exposes a weakness in the Google algo that deals with content duplication, and MC cannot just say "duplication does not matter to us; we cannot reliably identify the original source and therefore we use different signals anyway", because that would open the floodgates to new scraper sites, and G would have to deal with an increased workload of DMCA requests. And if there's anything G hates, it's an increased workload that cannot be automated.

    So, anyhow, deficiencies in Google's algo are something I have never heard MC admit to, and that's why he has to invent some BS reasons and speak high-minded words like "value-added content". We all know that if the quality of content meant much, eHow would not be making a killing on re-hashing other people's content and pulling articles out of thin air like "How to remove a sock from your vacuum". I am not kidding you: they do indeed have an article about removing a sock from a vacuum, and it ranks on vacuum-related terms (I just didn't care to check; perhaps it ranks for socks, too :) )

    I don’t know why people keep interviewing Matt Cutts. He has never told anything that wasn’t self-evident and he has indeed injected quite a few bits of mis-information in his remarks in the past. If he didn’t, he would lose his job and people would stop interviewing him … Is this a special form of entertainment for SEO-minded people?

    • I don’t think anybody suggested pizza recipes would be helpful. But, if I’m searching for pizza, I obviously don’t have a particular company in mind. So the page has to sell their pizzas to me, what makes them so yummy. Simply showing me opening times and address, isn’t going to inspire me to call them and order my pizza.

      IMO it’s counter productive to have to reword this for every single location. Especially if the company uses a particular kind of language with their customers.

      Interviewing Matt Cutts, more often than not, doesn't lead to anything new, because he skirts around subjects and rarely gives a good answer. I like how Eric questioned Matt. But I think most people interview Matt because it's good link bait.

      • Paul, you make a good point about competition in the pizza business (cut-throat, I'm sure), but I would say that if they did IP geolocation right, they would eliminate all other instances of the [almost] exact duplicates belonging to the same chain, because they do not serve the same location, at least not often. And so Google would be left ranking different pizza pages, by necessity not duplicates – a normal situation for any search, pizza or not. So his remark about adding 2-3 paragraphs makes no sense.

        Actually, the whole pizza example makes no sense: they already do geolocation reasonably well, and when they talk about pizza chains what they really want to say is "almost identical affiliates without a certain geographical location". I don't know why MC is so shy about calling things what they are, but the pizza example is very misleading, because the people who would really benefit from listening to his remarks about adding 2-3 paragraphs could very well be competing with each other using the same sales copy from the product manufacturer. I guess MC does not want to be seen helping online affiliates.

        And also, ditto on why people interview MC: of course it is link bait! But I find it *incredibly* ironic that Matt Cutts' own "added value" is actually the link bait, and not at all the "content" of his speech.

  37. I think that until there is some way to actually tell if the content is any good from an editorial standpoint, things will continue to be a mess. One of the niches I write in is fairly small, and there’s a handful of really good sites written by dedicated people. Then there’s tons of pure affiliate sites. Which ones do you think get to the top?

    I notice that the ones on the first page tend to change frequently, but they are rarely, if ever, the quality sites. It's just a matter of which site has the most questionable backlinks that haven't been devalued yet. That, and press releases – I go crazy from seeing so many press releases in my alerts. Surely Goog will devalue those at some point; then people might stop with them.

  38. I think there was some mistake made here. Matt is talking about how search/SEO would work in some ideal future world, right? Because when I look at the competitors outranking me (using Link Detective), I see that they get tons of links from comment spam, low-quality content, forum profile spam and article marketing sites. Anyway, I hope Matt's dream becomes a reality someday. Until then, spamming is a crime that pays.

    • Hard to believe, isn't it? In Matt's eyes, he was talking about right here, right now. But unfortunately, in his mission to improve search results, he's completely destroyed them, IMO.

      More and more people, who know nothing about SEO, are telling me they’ve switched to Bing, because they are finding it too difficult to find what they’re looking for on Google. I personally find it harder to find answers to my questions and keep finding myself on trashy, spammy websites.

      I don’t even know what the answer is anymore, but I do know that they’re going in the wrong direction.

    • Robert says:

      Matt Cutts and his webspam team look like pure idealists. But welcome to the real world, where Google now does not have enough sites to rank because of Penguin penalties.

      I have really started reading Matt Cutts interviews/videos as a kind of SEO entertainment (a circus with a clown). No real information (an extremely low amount of it), but how he enjoys himself and idealizes Google's algorithms!!!

      Pure Narcissus!

  39. I find it fascinating that self-styled SEOs can’t work out how to differentiate 60 pizza outlets. It’s completely simple.

    Each outlet has their own signature pizza that they feature on the page. They talk about the ingredients, the name, who created it.

    They encourage customer reviews (good and bad).

    Have different daily deals.

    Feature the staff, what pizza is their favourite and some interesting facts.

    Some of you SEOs may think it's too much work, but you need to think outside of your own remit. Showing useful, unique and interesting content is often the catalyst for improvements within a business as a whole. Being relevant to your customers is absolutely key.

    Not only should SEOs differentiate content but they should help business owners differentiate themselves from the competition. Customers demand to feel special in a sea of commercialisation, content is the first step on that road.

    Quit moaning and do what’s right for your customers and not what’s easiest for you.

    Lee

    • Lee gets it. While some complain, some question, and some are frustrated (including me), Lee quietly takes his client to the number one position across the continent.

      What this reply reminded me of was a mindset. I can waste time crying foul or look at the challenge with a little imagination and win.

      • I think the point is that we shouldn't have to make those pages unique, should we? The offering at that location is the same as at the other locations. Your ways to make the page unique are good, though – plenty of scope there.

    • Peter says:

      I agree completely with Lee's comment. You can and should target your local community, and there are so many options besides the common content.

    • I’m with you Lee.

      For the whiners out there… there are thousands of Domino's pizza franchisees across the globe. What would you do if you had a different store owner from each major metropolitan market in the US contract with you to create a site that would rank? Could you handle that?

  40. Amazing, all this talk about quality sites adding value and relevance, yet the initial case study by Eric isn’t even accurate!

    TOADS ARE FROGS!!!

    No wonder most SERPs in Google suck!

  41. Interesting exchange on duplicate content for local pages. I can't count the number of times I've seen this on clients' sites. Thank you for bringing up the topic, EE.

  42. I run an infographics and data visualization design company, and Matt’s comments worry me a little. I agree that there are tons of bad infographics on the Web, but you may love an infographic that I think is a bad design. The comment about “fact checking is really poor” can be applied to all methods of communication (text articles, videos, audio/podcasts and infographics). Bad infographics could be defined as misrepresenting the data, visualizing the data incorrectly, getting the statistics incorrect, topics unrelated to the host website or just poor design. How can Google possibly determine good vs. bad infographic links?

    Some fantastic infographic design work has been done by some great designers that absolutely should be driving valuable links, traffic and metrics for their host website. However, since they are all just JPG files to the search robots, Google would have to discount all image links regardless of their content. That would be a horrible outcome because visual communication can be so powerful and effective.

    Thanks Eric for asking Matt some great questions!

  43. To everyone who says they think results have got worse since this or that: I disagree. Maybe for the certain niche you obsess over (I obsess too, btw) things have shuffled round and you're not seeing what you deem to be the best sites, but for the most part Google are still giving people what they want (which may not be your client's site, sorry).

    I actually think that if you were served up Bing's results but in a Google interface, no one would blink.

  44. Good interview, Eric. There's a lot of worthwhile info here, some of it old and some in the traditional "we suggest this" style of answer from the Matt we've come to love. It's going to be interesting to see the updates when they happen, and how they will value links and the levels of juice and relevance they will pass.

  45. Love the insights guys!

    Bottom line is, stop writing crap on the web and actually put some EFFORT into your content!

    Shoot for good structure and 1,500-2,000 words for each post. They will stand the test of time and you’ll be rewarded.

    I didn’t pay attention to SEO at all for the first two years of running Financial Samurai. Now it gets over 200,000 pageviews a month and I just engineered my own layoff!

    Best,

    Sam

    • Robert says:

      Write very good content, and the next Penguin or Zebra will penalize you –
      not because of your content (keyword stuffing, etc.), but because of your backlinks.

    • "Bottom line is, stop writing crap on the web and actually put some EFFORT into your content!

      Shoot for good structure and 1,500-2,000 words for each post. They will stand the test of time and you'll be rewarded."

      The Penguin update was about backlinks, so you can write all the content you like.

  46. For search results to reflect the real world requires greater maturity from the searcher. In the real world you wouldn't go up to someone and say "frogs" and expect an answer – you'd ask what frogs eat, to which I'd probably suggest trying Google!

    We work with many SMBs, and the level of understanding about how to achieve relevant search results (or what results to try to rank for) is incredibly murky. Of course everyone wants to rank organically for the shortest possible keyword without paying a cent… and few understand that there is a cost to achieving organic rankings.

    Maybe I should create an infographic on that!

  47. I had a gut feeling Google was doing something with SEO spam back in 2010-11 when everyone (including myself) was saying how Google was falling behind in catching up with directory and easy-to-get dofollow links. All this time they were just getting ready for the knockout punch.

  48. Google has Chrome now; they can see more things than before, so it is harder to cheat them. But I think that if everyone must do really white-hat SEO as Matt Cutts wants, SEO will become a job with no possibilities.

  49. Great article, Eric. It is time to make something more meaningful for people without doing spam, and to create websites in accordance with Google's guidelines.

  50. I’m a big fan of this interview. Saw it on SEOMoz email list, so congrats Eric on the quality site and growth.

    I have a few real-life friends on different sides of this issue: the guys trying (like myself) to continuously produce excellent content and the guys who make it so we cannot have nice things.

    I know there is a lot of "other" stuff in rankings and success, but if you go into it with the mindset of helping someone and producing good information, Googlebot will accept and possibly reward you for that.

    The friends who spend money on links and spam and auto-whatever are fighting an uphill battle with a big robot. The friends who spend time and money on great content, user friendly design, and networking with other content producers tend to win. But slowly.

    Thanks

    CG

  51. Aloha… a very elaborate post concerning what we are posting and what is expected from the big G. It seems as if Google is expecting highly valued content rather than gibberish, which makes a lot of sense. Matt keeps reiterating points about repetitive use of content and where you are exposing your content. I would think, with so many companies available, how do you identify who is the good guy giving Google what they want, whether through linking or article writing? It's like a catch-22 marketing online – damned if you do and damned if you don't.

    Although it is good to know somewhat what is expected from the bots. I do like the part where he carefully says that if he says something, there is a heap of folks rushing to do what he said. I'm glad I came by, and thanks for sharing! Mahalo, Lani :)

  52. Jared says:

    Eric, excellent interview and post. I did appreciate what Matt said about local search and the need for a business to differentiate pages at a local level.

    Regarding "link building" and content curation: if a company has a marketer responsible for publishing great content, but that content is not 100% relevant to the site, and they get it placed on another site with a link in the byline back to their site, do you think that still helps the backlink profile, or have Panda/Penguin now addressed this situation? Also, will the weight of the links change if the anchor text is different from just the website name?

  53. Steve says:

    How do about.com and ehow.com still rank so well when everyone knows their content is so thin and misleading? They are clearly cheating the system, and Google doesn't seem to be able to stop them.
    I can't tell you how much those sites annoy me.

  54. It's good to see that Matt Cutts is really hammering home the need for good, well-thought-out content rather than straight link building. I have found that since the recent Panda and Penguin updates, many of our clients' sites where we have written good copy have started to jump up the listings, and this is without them actively link building.

  55. Matt Cutts and Google are full of crap – he contradicts himself so much. The pizza store example is excellent: 2 or 3 sentences should be enough to rank? That is thin content – how could it possibly rank, and how could that possibly be useful to users? Why would a user arriving deep in the site not want to see the same info that, unknown to them, is also useful info for another place?

    It’s complete bullcrap and Google / Cutts dictating what the internet should be, what sites should do, how to make pages etc. Hipmuck example uses tech that isnt SEO friendly and relys on PR, social signals and brand search for their ranking. Who’s to say that their results are more useful that a long standing player – i can imagine the geeks at google love it, but personally I dont find it as useful as other services.

    His point about PPC sites not getting benefits – I know for a fact that this isn't true. It may not be algorithmically improved, but you can get a direct line to a Google technician if you regularly drop enough cash.

    Basically – what I’m trying to say is – I wish they would just stick to cleaning up the web and spammers, not dictating what the web should be, penalising anyone who doesn’t conform and providing misleading and completely confusing and contradictory advice on what they specifically value and what they don’t. Google seems to be full of megalomaniacs supported by fawning SEO’s and their own quest for self importance and fame (Cutts, I’m talking to you).

    If this continues, we will end up with a bland internet, with big brands ruling the web and no chance for any independents or fresh thinkers to break through. F U Google!

    • He never said that you would get penalized for using thin content.

      In your comment you are trying to dictate what is right and what is wrong, but in the interview Matt Cutts only says what is better for the user – not what you MUST do.

      By the way, GREAT interview, Eric.

  56. Thank you for this interview. I really hope that Matt is not just being politically correct. He said what we already know (at least most of it), but the idea is to deliver a really high-quality user experience, with great articles, opinions or other great information that the user can really use.

    Thank you Eric!

  57. This sentence stands out for me:

    “The main thing is that people should avoid looking for shortcuts. In competitive market areas there has always been a need to figure out how to differentiate yourself, and nothing has changed today.”

    This is as it always should be, but alas, there are too many lazy SEOs who just want a quick fix and don't concentrate on the long term.

  58. I think there is an increasing divergence between what Matt says and what Google actually does in 2012. Matt has always stated that good-quality original content is the most important thing for Google, and for the last couple of years I would say that was indeed reflected in the search results. However, our view over the last 6 months is that Google has gone back to counting links.

    Another consequence of all this tinkering is that I don’t regard Google in the same rosy and trustworthy light I did this time last year.

  59. Batman says:

    Wouldn’t a user simply type “what do frogs eat” into Google instead of such a broad term such as “frogs” if they are looking for what frogs eat? I know it’s just an example, but this is clearly an error on behalf of the end-user and not any given website.

    Enter a generic search term, get generic search results.

    • @ Batman

      Just because the user types a generic search term does not mean they should get *duplicate* results.

  60. Fred Waters says:

    What Matt says conflicts with the many low-quality, spammy websites that have risen to the top after the last update. This update was a boon for websites from Eastern Europe and India, where the English is barely discernible. Matt, you have a lot of work to do to get it right.

  61. I haven’t used Bing for ages, but curious about the comments in some of the previous posts, I have just done a few simple tests (with cache cleared), and sure enough, Bing does seem to be returning more accurate results than Google. If this is happening on a wider scale then it suggests Google has accidentally or deliberately downgraded accuracy as a factor in its results. I reckon that is a mistake – for most users accuracy is the number 1 priority, and if Bing is more accurate than Google then users will soon desert Google.

  62. Could she not have just done a search on what frogs eat?

    Seriously though, if hipmunk.com is doing great things, why doesn't it rank very well? Also, isn't his point about infographics the same as what YouTube does? It embeds a video with a link back to YouTube, and if someone is watching a guide on how to fish, there are no checks to see if that info is correct.

    But I agree with a lot of the points – let's stop link building and start customer building.

  63. Rick Package says:

    The whole pizza thing just shows how arrogant Matt Cutts is.

  64. OK, it sounds really curious that Cutts says we would never be penalized for thin content. After testing a thin-content approach on some sites – delivering the information quickly to the user – we always rank low. For me, ranking low is a sort of penalization, no?

    After adding more words to make beautiful sentences – diluting the substance of the content and playing violins to reach 300 words…

    >> the ranking jumps up into the #1 to #5 range….

    Google likes many words that say nothing, and judges "high-quality" sites in a way that degrades the user experience.

    I don’t know if it is the same in the US but in France it seems to be like this. I’m afraid to produce so much content for nothing.

  65. I think that the main problem is, as others have stated, the shortcut. People do not want to pay a lot, so they tend to go with the cheapest option. Unfortunately, it is hard for someone who knows nothing about SEO to accurately tell which provider is the best – or rather, which ones are so terrible that they would be a complete waste of money.
    The cheapest of all options is to do it yourself, but you need to know what you are doing. However, most people do not know SEO and have purchased an expensive PDF or video series which they think teaches them everything they need to know – such as that "Google engineer speaks out" stuff, which is clearly untrue.
    It is like hiring a PR agent: if the information they release, and the way they release it, is damaging, it could destroy your public image and your business. Fixing issues such as these would be very costly, and it would have been cheaper to hire someone who knew what they were doing in the first place.

    Sadly, I have seen both of these examples and some PR agents are even using SEO to provide online PR. However, they are clueless and do more damage than good.

    It is a complete minefield, and most customers do not understand what they are ordering. Unfortunately, I cannot see an easy way out.

    As for the algo, I hope that they add the changes that we have all been talking about. I would expect them to keep up with Bing, who have recently revamped their dashboard – I think it looks a lot better. The ball seems to be in Google's court now.

  66. Talking about web spam, what do you say about sites which copy and spin content?

    There is a spammy blog which is copying and spinning original content not only from my own blog, but also from other authority sites such as SEOmoz. Nonetheless, the spammy blog ranks well in the SERPs.

    What can be done about it?

  67. Fascinating discussion, and I guess I still fall in the ranks of those who don't believe that Google's only goal is the user experience. I work with home professionals who are all local (the pizza example), and yet the directories and lead generators dominate the SERPs… and I mean that 7 or 8 of the 10 supposedly organic results are nothing more than computer-generated directory listings that, while unique, provide little to no value. They only succeed in pushing down the real websites.

  68. Instead of talking about what should NOT rank for the term FROG, it would be far better to take a look at what actually is ranking right now.

    Terrible!

    1. A design firm
    2. The usual Wikipedia listing
    3. Another firm
    4. An old web 1.0 site (the first frog-dedicated website), not updated in 5 years…
    5. A CMS company

    There are well-made web 2.0 sites about frogs out there! But for sure not in the top 10…

    What about communities?

    Here is one: http://www.frogforum.net – experts, a forum, articles, a live cam, and at this moment 300 users online! (No, it is not my site.)

    It took me a LONG time and advanced techniques to find some good sites about frogs!

    MC – do you have kids? Since Penguin, you'd better send them to the local library if they want to know something about frogs.

  69. matt cutts lover says:

    @Matt: So you're discounting a legitimate link-building method because of a few irrelevant or poorly researched infographics?

    It sounds like infographics are catching on, and you are just looking for the next SEO tactic to kill off, regardless of the fact that the majority of infographics are good and relevant.

    Maybe instead you should just analyze the infographic's image content with OCR, along with the surrounding text, or encourage the use of things like image "alt" attributes so that your super bot can interpret them. (A rough sketch of the OCR idea is at the end of this comment.)

    The "well researched" portion of it is a cop-out as well. You love to rank things like Wikipedia, blogs and thin news sites, even though so much of their content is lacking in quality, accuracy and citations, and is full of subjective opinion.

    Seriously, you didn't even give this much thought – simply because the big brands are not using it nearly as much as the non-brands/"SEOs".
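
    Here is the rough sketch of the OCR idea, assuming the pytesseract and Pillow packages (plus the Tesseract binary) are installed. Purely illustrative of what "reading" an infographic could mean – not how Google actually processes images.

```python
# Sketch: extract whatever text an infographic contains so it could be
# weighed like on-page copy. Assumes pytesseract/Pillow are installed.
from PIL import Image
import pytesseract

def infographic_text(path: str) -> str:
    """Return the text Tesseract can recover from an infographic image."""
    return pytesseract.image_to_string(Image.open(path))

if __name__ == "__main__":
    text = infographic_text("infographic.jpg")  # hypothetical local file
    print(text[:500])
```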

  70. It seems to me the point of a brand is a consistent message. Meaning… the same. McDonalds is successful because it is the same everywhere.

    Expecting to rank for Chicago pizza with two or three lines of text is a joke. (IF YOU DON’T HAVE A BRAND.) Therefore, you essentially must have a brand.

    In the beginning, you could create a brand through Google. Now you can promote a brand you already have.

    The definition of a brand in a web-search world: something that generates links without you actually needing quality content.

    Of course, it could be that their quality content is not words, but actually is the pizza. We have no way of knowing that digitally. We only know what people say.

    And that’s all Google knows. And that’s all they ever did know.

  71. Great interview.

    I’ve been using Bing as my primary search engine for around a month and I’m finding (to my surprise) it produces a lot more relevant results than Google.

    Google does produce a LOT of crap in its top 10 rankings these days. However, I don't think it's so much that Bing has got better in the last few years (although it undoubtedly has) as that Google has got crapper.

    I hope it’s just a transitional period for Google as they try to combat all the linkspam out there (that, incidentally, they created themselves by making links such a strong ‘signal’ in their algorithm).

    Dan

  72. The only line of MC's I can appreciate is "The main thing is that people should avoid looking for shortcuts", but the rest is the same old negativity we've come to expect from MC.

    Still, a nice interview, Eric – I much appreciate your effort.

    Debarup

  73. What a great conversation. I agree with Matt that people shouldn't look for shortcuts. I think human-edited article directories are still useful to some degree.

  74. “If you have an outstanding product, world class content, or something else that sets you apart, then you can step back and start thinking about how to promote it.”

    I disagree with Matt here. No one develops a product or service and THEN thinks, "Oh, how shall I market this?" If you've got any business sense, you'd stop and think about the risks before committing time and money to making anything "WORLD CLASS".

  75. Rick Package says:

    Actually, Matt, that’s typically how it’s done.

    Why would you promote crap?

    • Well, exactly, Rick – that's what I mean!

      Creating quality takes time and money, and you probably want to have some idea of how you're going to promote it before you create something awesome that no one ever hears about and that just sits there in your garage, so to speak =)

  76. Google aren’t able to tell the difference between a high quality site and bad one. Their search results are evidence of this.

    I just wish they would stop the BS.

  77. Paul Jackson says:

    Another load of PR bull from Cutts.

  78. Is this a joke or what? Google thinks "good sites" are those that advertise. Just after Panda, Google cut down on organic results and increased clicks on ads by 35%. On top of that, Penguin increased them by another 42% – roughly doubling them, compounded. So Matt Cutts can yap all he wants for the paycheck and the GOOG RSUs he gets to lie. Google wants a cut before sending you visitors; so far they "only" send 65% of clicks for commercial terms to AdWords.

  79. Jethro says:

    Regardless of whether Google is being sneaky, or whether this shake-up is good, the underlying issue for companies and businesses is that they should try to generate as many Brandvocates as they can. Google will change the rules again, and there may be many reasons to get the services of an SEO company, but every company needs a base of loyal, informed fans.

  80. Many good points, but they make one wonder how fair Google actually is.

  81. The single biggest problem with search is the lack of human intervention. It is only in severe cases of abuse that the human element of scrutiny is applied. So yes, the top results in most cases are either over-SEO'd sites or sites with very deep pockets and plenty of deep links in places that just can't be deleted!

    Google is great for finding things, but ask this question: have results become more favourable to brands over the last 5 years compared to the first 10 years of Google's life?

    Are the results found on Google today diverse and interesting, or are they the same tried-and-tested snippets of boring content?

    The only places you will find unique, quality content these days are behind closed doors or on paid subscription sites – the semi-closed sites that Google doesn't like.

    So Google is great for allowing businesses to compete with one another, either in the organic marketplace or through CPC; either way, Google and the search industry win. The human visitor comes first – but how would an algorithm know?

  82. Brian says:

    Matt Cutts – “I do agree that there are ways that infographics can be created and that represent an OK form of promotion, but the challenge is that as soon as I say something like that, people are going to use this as justification to do whatever it’s they want to do. They will push the limits, and that isn’t OK.”

    HOW ARE WE SUPPOSED TO KNOW WHAT THE LIMITS ARE IF YOU AREN'T CLEAR ON THEM?

    You can apply this scenario to every single thing Google has done with its search – for instance, letting infographics help determine the ranking of a website and only later putting parameters on them, like Matt says he will be doing. This is how you destroy small businesses' chances of ranking: by not being clear with your requirements and only later dropping their rankings. You are penalizing small companies that were only trying to compete and didn't know any better BECAUSE YOU WEREN'T CLEAR!!!

  83. Sorry, but this is too narrow an approach from Google – too technocratic.

    Google should care about one thing only: providing the best results for users.
    It is impossible to compare two posts by excellent writers; Google can identify poor-quality content, but it can't really rate content that is already of good quality.

    Google monitors the traffic patterns of each site via Chrome users, the Google Toolbar, analytics, etc. (yes, they said they don't use it, but I am sure they do; they can ban your network, if you have one, based on analytics data crossed with other data). They know the bounce rate, the number of pages visited, the total time on site and much more. Google also knows how to differentiate sites based on function: users will spend less time at a site when they look for information, while spending more time at a site where they purchase, etc.

    Google knows which links are bought and which are "natural", which is kinda bullshit, as in most commercial niches out there there is no real "natural link building"; everything is paid, but some of it is more obvious than the rest.

    Google QA is so busy in its holy war against buying links that it punishes great sites with excellent content and a low bounce rate just because an SEO/employee/owner bought several links (among thousands nicely built). The SERPs look like a bad joke. It is our livelihood, but Google, this is your main product; instead of educating website owners, try to improve your product. Google employees, did you ask yourselves what Google's added value is here? It looks like Bing is asking it, and their product is better (at least for commercial keywords).

    When you have all the info above, why don't you combine all these efforts? See which sites are selling/buying links, but before you kick the site and its content out of the SERPs, cross that info with traffic behavior.
    Classify unnatural linking into 10 criteria; if it is a light SEO offense and users love the site, it should be a temporary ban – when I say temporary I mean a week or two, with no action required from the site owner. Most experienced site owners have stopped filing reconsideration requests; there is no real communication, and it kinda seems Google likes it this way to save manpower, or I don't know why. Reconsideration is a near-impossible mission.

    Remember, Google became what it is today because 11 years ago the Google algo preferred great content over commercial results (AltaVista, anyone?), and it seems it is the opposite these days, as most mom-and-pop sites do SEO less sophisticatedly than big brands (which also get natural links). Mom-and-pop sites are responsible for the best content out there. Brands produce very bad and boring content, which Googlers might call quality but simple guys will disagree.

    And a last comment: long-term SEO is dead. If you actually build links to your site, sooner or later you can make a mistake, or someone can harm your SEO efforts; resubmission won't work, as they will check everything you ever did, and nobody is an angel (unless you are really lucky to be in a very unique niche where links are easily obtained).
    What I learnt from Google Panda (but mainly Penguin) is to think short term: I would rather build 10 sites than one mega site, and if I get burnt, I will simply redirect the links to a new site and work on it more. You can optionally work in several niches, so even if one site is burnt, the five other niches keep you in house and not thrown homeless onto the streets. The result will be worse content and worse SERPs, as spam is still doing well while legit sites get slapped.

    My 4 cents, good day.

  84. Thank you, Eric, for this awesome interview, with great questions and therefore interesting answers.
    It confirms one thing to me: Google likes brands.
    They want differentiation, and they're giving more space to brands.
    But that does not give us any clue about link building… not all content gets links naturally…

  85. With so much written content, do you ever run into problems of plagiarism or copyright violation? My blog has a lot of completely unique content that I've either written myself or outsourced, but it seems a lot of it is popping up all over the web without my permission. Do you know any methods to help stop content from being ripped off? I'd genuinely appreciate it.

    • Eric Enge says:

      Hi Williams – the first thing I try to understand is whether it matters. If the site stealing the content is a clear junk/scraper site, it may not be worth worrying about. However, if a more authoritative site copies your content, the risk is bigger. For those cases, you can either file a DMCA request with Google, or use a service like DMCA.com to do it for you.

  86. Chad Andrews says:

    Really, Matt Cutts? Could he not reference the article he wrote back in June 2010 about giving each location a unique URL: http://www.mattcutts.com/blog/give-each-store-a-url/? That exact experience he had with Pinkberry is what we are all after – what every franchise wants and what every SEO wants. According to the new animal updates, this would cause a website to be penalized… Really…

    How about this scenario, since no one is bringing it up: if I owned franchise locations, I would NOT want my users finding my location page at myfranchise.com/location-searched/; instead, I would want them finding my full website on a unique URL like "location-searched.myfranchise.com". Or better yet, Matt Cutts, since you tell us how important keyword-loaded domains are, I would equip each franchise location with a keyword domain and make everything local to the visitor – the phone number on every page, the contact form going to the right location – and that would be the ultimate user experience. So now I am hearing that if I invest more effort in making the user experience more localized and relevant (hence keeping users on Google, because they found such a great site that had exactly what they wanted on it), while of course still using the same brand, same great content and same great company, I am going to get penalized for it? Bullshit… How can this be a penalty? It is not like I want everyone in the US to find my Austin location website; I only want Austin visitors to find my Austin website. Feel free to penalize my Austin site in Chicago searches, as I don't want Chicago visitors going to that site… That is how it should be valued.

  87. I should just go back to my old GoDaddy website that was 6 pages with an online order form; I would rank on the first page of organic search again. Right now I have a site that is 10x better, from the graphics to the content to the ordering process, and yet it is on the 4th page. I've tried a few things, but they have either made it worse or not helped at all. Competitors with weak sites are still on the first page of Google organic results in my niche. :((((

    • Brent, maybe after the next search engine algorithm update your web site will be back on the first page of Google.

  88. Sorry, but Matt Cutts is talking out of his arse again. Wheel said it (7/11/2012 at 8:56 am): it's still all down to links… you just need the right quantity and anchor text now. If you search UK Google for Mazuma, which (currently) has 301,000 local monthly searches, in 11th place (again, currently) is mazuma.co.uk – a website with NO content whatsoever… LITERALLY NONE!!! – and it's been hovering around there for quite some time.

    Additionally, I’ve lost count of the amount of technical articles I’ve searched for that don’t have a single keyword I’ve searched for in the html of the web page – its therefore been selected by Google for me based on anchor text keyword. So frustrating, so I click back and try the next search result down, but whats this?… another page from the same domain… LOL! So much for the domain diversity tweaks too!!

    HA HA. Epic fail.

    Unfortunately (and luckily for Google), their "competition" is a donkey too, so Google remains my SE of choice until a real competitor comes into the fray – but I don't think it will be long until I switch to Bing (ptwhcheeeww).

    I can honestly see that they are trying to pick the web up by its feet and drag it kicking and screaming towards superb content, as there genuinely is a lot of crap out there. Their intentions are good, but for Matt to come out and say you'll rank on great content alone is utter ****.

  89. Hi Eric,

    I have come to your blog for the first time, and when I looked at the name I thought, "It's Eric Ward" – but then I read the name again. :)

    You have asked excellent questions of Matt. I have seen that he usually replies in double-talk; he never answers anything directly, and he has never stated any Google algo.
    However, excellent work summing up this post. :)

    Thank you

    saif

  90. Splatcat says:

    There are two types of wedding photographer: one who records what happens, and the other who keeps interrupting what's going on in order to tell the bride and groom what to do.

    Google has become the photographer who thinks their job is now more important than the wedding itself.

  91. Robthespy says:

    The problem is that Google is a business, and I feel like people don't fully realize that. Search is the foundation of their business, and they're going to get as "intrusive" as their customers/users will let them.

    I blame Bing, Yahoo! and all of the others who basically just rolled over for the last ten years.

  92. Hi Eric,
    I’m fairly new to the industry and I read this interview after seeing it on your Twitter feed. It’s helped me understand a few things better so thank you very much.
    Do you hold regular interview with Matt Cutts, or was this a one off? And did you take any guidance from anyone else with your questions?
    I’m having issues with the follow/no-follow element of links which I’d love to get some advice with!
    Thanks.

  93. Matt and Eric, I was wondering how Eric got this interview with you. Certainly he pulled out a plum. There are a ton of lawyers with websites on the internet; I'd love to see you discuss what you find appealing and not appealing about these. Although this article in general made a lot of sense, as a lawyer in a crowded space, where can we help ourselves and where can we hurt ourselves?

    When you say AdWords doesn't affect position, many doubt that. Do you have any real evidence you could cite?

    Finally, you do a great job of explaining the way things work for a simple guy like me. I would like to see some more discussion on conversion, although I know that page one in search is the start.

  94. It’s a nice interview. I believe Google is doing great job for catching web spam. I really feel sorry for the people when they say after panda/penguin update Google is not showing good results or it’s doing all these just to increase advertising revenue.

    I firmly believe that one should not have to worry about SEO or any Google update as a webmaster. I am not saying this because I haven't been affected by the updates – my sites are also affected, but that doesn't mean it's the end. As a webmaster, your job is to promote the website without thinking about SEO. Just do that and you will be fine. Google will definitely reward you.

  95. A big clap for Eric for this wonderful talk with Matt. Google is taking great care in their business to stop spam from growing nowadays. It's good to see Matt's thinking throughout the article on getting better results in the SERPs. Fakes should be wiped out.

  96. Nothing is perfect, and neither is Google's new algorithmic approach. One way or the other, website owners will feel the impact, in a good or bad way – good for good and bad for bad. But it's sad that many good ones are also treated as bad. However, we can say Google is moving toward the betterment of the web search experience.

    Now, in this post, Hipmunk and Matt's blog were able to get a link, and that is what I call branding.

  97. Great post, Eric. I agree with you on the "thin content" issue and feel a little uncomfortable with the solution provided by Matt for sites that wish to rank their stores around the country in local searches.

    Would it be enough to provide a few sentences of unique content that differentiate your pizza franchise in Chicago from the one in New York, to rank at the top in local search?

    What if there are more than 10 local pizza stores in Chicago with great websites full of useful, unique content and social media buzz? Do you think you still stand a chance of being on top with your thin content but big-brand appeal?

    If yes, then we can safely assume that big brands will almost always have a better chance of ranking ahead of you, even if you have a great site.

    • Eric Enge says:

      Good question. I think the key for the local pizza shop is to emphasize its locality. It has no chance of ranking for "pepperoni pizza", but a real chance of ranking for "boston pepperoni pizza". The key is to really find a way to express the local aspect in the content.

      Note that I don’t mean with information about the history of boston, because no one will care about that on a pizza page. You have to find a way to introduce the locality in a relevant way.

  98. I fully agree with you on the content quality issue. It seems that Google is just concentrating on links – discounting these links, penalizing those links. A better approach would be to look at whether the site is offering value or not.

  99. I think the almighty Google created all this mess in the first place. But the message is very clear: we should focus on earning links rather than building links that Google will consider spammy every time they roll out their updates. Content marketing is the future of SEO.

  100. I find this is much more difficult with e-commerce sites. Rewriting 10,000 product descriptions is a lot of work and doesn't really bring anything new to the table (other than original content for Google).

  101. Too many comments here are against Matt… but if we think logically, he is right. Why write the same content for different websites? How would a customer differentiate? But one question, if anyone can answer: what will happen if the content goes viral? Are we going to be penalized for viral content?

  102. I agree with Matt that it's about producing something excellent first. You need to be creative, and your content needs to have that power which attracts eyeballs – and even one or two lines can work this magic.
