New Clarity on Reconsideration Requests from Tiffany Oberoi

Key Interview Points

I am going to keep the key points summary short in today's interview. Tiffany's responses bring new clarity to the reconsideration request process, and Matt Cutts tweeted about the interview to recommend it to his followers.

Read on and enjoy!

Interview Transcript

Eric Enge: Thanks for taking the time to address our questions!

Tiffany Oberoi: Sure! I know that reconsideration requests can be stressful. We want to do our best to clear up any misconceptions about the process.

Eric Enge: The reconsideration request process is an incredibly important tool for those whose sites have been impacted by a penalty.

Let's start by understanding the types of penalties a bit better. The most extreme penalty is the banning of a site from the index. I usually think of this as something you can recognize by searching on the site's brand name or domain name and not seeing the site in the results, or by running a site: query that returns no results. If you can tell me, are there other types of manual penalties that may be assessed?

Tiffany Oberoi: We do have a few different manual actions that we can take, depending on the type of spam violation. We would tend to handle a good site with one bad element differently from egregious webspam. For example, a site with obvious blackhat techniques might be removed completely from our index, while a site with less severe violations of our quality guidelines might just be demoted. Instead of doing a brand name search, I'd suggest a site: query on the domain as a sure way to tell if the site is in our index. But remember that there can be many other reasons for a site not being indexed, so not showing up isn't an indication of a webspam issue.

Eric Enge: The other major type of penalty is an algorithmic penalty. The algorithms make some determination of a problem behavior and adjust the rankings in some fashion. Is that a reasonable short description?

Tiffany Oberoi: Spam algorithms are essentially computer programs that engineers have written to classify webspam. We try to take an algorithmic approach to tackling spam whenever possible because it's more scalable to let our computers scour the Internet, fighting spam for us! Our rankings can automatically adjust based on what the algorithms find, so we can also react to new spam faster.

And just to be clear, we don't really think of spam algorithms as “penalties” — Google's rankings are the result of many algorithms working together to deliver the most relevant results for a particular query and spam algorithms are just a part of that system. In general, when we talk about “penalties” or, more precisely, “manual spam actions”, we are referring to cases where our manual spam team stepped in and took action on a site.

Eric Enge: Do reconsideration requests have any value in the case of algorithmic penalties? Or are they only valid for manual penalties?

Tiffany Oberoi: If a site is affected by an algorithmic change, submitting a reconsideration request will not have an impact. However, webmasters don't generally know if it's an algorithmic or manual action, so the most important thing is to clean up the spam violation and submit a reconsideration request to be sure. As we crawl and reindex the web, our spam classifiers reevaluate sites that have changed. Typically, some time after a spam site has been cleaned up, an algorithm will reprocess the site (even without a reconsideration request) and it would no longer be flagged as spam.

Eric Enge: As a related question, is a reconsideration request helpful after addressing possible Panda issues?

Tiffany Oberoi: Panda is an algorithmic ranking change targeted at promoting high quality sites over low quality sites. Because reconsideration requests will not change the way an algorithm sees your site, a reconsideration request won't help in this case. We recommend focusing your efforts on improving your site so that it will be classified as high quality in the next Panda update. Amit Singhal had some great tips for how to improve your site in this post from Google's Webmaster Central Blog.

Eric Enge: Does it ever happen that a reconsideration request gets accepted, but then the same penalty gets applied again (perhaps after a subsequent crawl)?

Tiffany Oberoi: This is definitely possible if the bad behavior comes back. For example, we see sites getting hacked repeatedly. The webmaster cleans up the hacked pages, but doesn't close the security hole. They might even submit a successful reconsideration request, but if the security hole is still open it is likely to be exploited again.

Eric Enge: Is there a potential downside to making a reconsideration request for a site when you are not entirely sure it has been penalized? In other words, could other issues be discovered in the process?

Tiffany Oberoi: While in theory it's possible that spam could be uncovered while processing a reconsideration request, that's not the goal. The people reviewing a reconsideration request are first and foremost interested in whether the violation of our quality guidelines has been fixed. I wouldn't let this stop you from submitting a request if you think there is a chance that your site had a violation. But before submitting a reconsideration request, I do recommend a detailed review to make sure your site does not violate any of Google's webmaster guidelines.

Eric Enge: What would you recommend the structure of a reconsideration request look like? In other words, what major issues should it address? Are there things to avoid?

Tiffany Oberoi: Here are a few tips:

1. Be specific. Carefully review Google's webmaster guidelines and tell us what issues you found on your site and how you fixed them.

2. Avoid hiding information. This is the time to address the issues head on. For example, a reconsideration request that says, “My site adheres to the guidelines.” is not as useful as one that says, “I had some hidden text at the bottom of my homepage, but I have removed it now.” The second example makes it clear what the initial problem was and what has changed. The more detail you can provide, the better. It helps us assess the situation more fully.

3. We want to be assured that we aren't going to see these spammy techniques again. It's helpful if you can include details about steps you've taken to prevent it from happening again, policy changes, etc. The people who review these requests want to be confident that the spam techniques have been removed and are not likely to return.

4. Don't mention how much you spend on ads. The team that handles reconsideration requests only cares about search quality. It's irrelevant and doesn't help your case to mention buying ads or being a partner or customer of other products.

5. We get a lot of reconsideration requests from webmasters who are not even affected by a spam issue, so my other advice is to explore other possible issues as well. For example, check Webmaster Tools for crawl errors. Make sure your robots.txt isn't blocking Googlebot from accessing your site. Here's an article with a detailed discussion of other possible ranking problems.
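To illustrate that last check, here is a minimal sketch using Python's standard-library robots.txt parser. The rules and URLs below are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration
rules = [
    "User-agent: Googlebot",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# Googlebot may fetch the homepage, but nothing under /private/
print(rp.can_fetch("Googlebot", "https://example.com/"))           # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

If the first check came back False, the robots.txt rules themselves would be blocking Googlebot from the whole site, which is a crawling problem rather than a spam issue.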

Eric Enge: Should the person submitting the request expect to get a response? Does Google ever provide explicit feedback on the problem(s) found?

Tiffany Oberoi: We generally send a message to Webmaster Tools after the request is received and again after the request has been processed. In the past we've gotten a lot of feedback from webmasters who want to know what happened after we processed the request. We listened to that feedback and we are currently running an experiment to provide more specific information about the outcome of the request.

For example, in some cases we can communicate back to the webmaster that we were able to revoke a manual action based on their reconsideration request. Or sometimes we let them know that their site is still in violation of our guidelines. This might be a discouraging thing to hear, but it helps webmasters diagnose what's going on if they know that there actually is a spam issue.

In the majority of cases, we're able to let the webmaster know that they aren't affected by any manual spam action at all. This allows the webmaster to focus their attention on other areas instead of submitting multiple reconsideration requests and wondering why they aren't seeing results.

Eric Enge: If the person submitting the request does not hear anything and nothing changes, should they resubmit?

Tiffany Oberoi: You generally don't need to resubmit. It can take us days to weeks to process requests, and then more time for changes to go into effect, especially if we need to recrawl and reprocess your site. We do send a confirmation after we receive your request, so as long as you got that message then your request is in the queue to be reviewed.

I don't recommend sending multiple reconsideration requests in a very short period of time or submitting reconsideration requests for tons of sites all at once rather than one site at a time. We can take that as a sign of bad faith. But if you haven't received a follow up message saying that your request has been processed after 2-3 months, it would be reasonable to submit another request at that point.

Over time, the reconsideration request process has improved substantially. We've made a lot of progress on making our assessments and the entire reconsideration review process more transparent. I'm excited that most webmasters can find out whether their site has been affected by a manual action, and that they'll know the outcome of the reconsideration review.

About Tiffany

Tiffany is a software engineer on Google's Search Quality team. She joined Google in 2006 and focuses on webspam issues and webmaster communication. Prior to joining Google she worked as a software engineer at Computer Associates and as a high school math/engineering teacher in Harlem, New York. She earned her bachelor's degree in Computer Science from the University of Virginia.

Other Recent Interviews

Mona Elesseily, July 18, 2011
Vanessa Fox, July 12, 2011
Jim Sterne, July 5, 2011
Stephan Spencer, June 20, 2011
SEO by the Sea's Bill Slawski, June 7, 2011
Elastic Path's Linda Bustos, June 1, 2011
SEOmoz' Rand Fishkin, May 23, 2011
Bing's Stefan Weitz, May 16, 2011
Matt Mickiewicz, January 8, 2011
ex-Googler Adam Lewis, October 10, 2010
Bing's Mikko Ollila, June 27, 2010
Yahoo's Shashi Seth, June 20, 2010
Google's Carter Maslan, May 6, 2010
Google's Frederick Vallaeys, April 27, 2010
InfoGroup's Pankaj Mathur, April 5, 2010
Matt Cutts, March 14, 2010

Comments

  1. says

    I was badly affected by Panda and still am. If my site has been rubbish since 12th April, so be it, but I do get mightily sick of Google employees spouting the same old drivel, specifically:

    “Panda is an algorithmic ranking change targeted at promoting high quality sites over low quality sites . . . focusing your efforts on improving your site so that it will be classified as high quality in the next Panda update. Amit Singhal had some great tips for how to improve your site . . . “

    I suspect a lot of sites have asked for reconsideration since Panda, in fact I suspect Google has been swamped by requests to reconsider sites hence this interview to try to stop any more requests coming in.

    All webmasters, from a variety of sectors, are seeing search results returned by Google which are complete rubbish. My sector is cooking and this is what I recently compiled as top results just for the word “cooking” (which you have to think, in the first instance, would be about the cooking of real food):

    1. Cooking games (not a cooking site)
    2. Wikipedia (to do with cooking)
    3. Jamie Oliver (celebrity chef)
    4. BBC (food site part of BBC)
    5. Cooking games (not a cooking site)
    6. Cooking.com (cooking site)
    7. Open Directory (hardly a search result)
    8. Science of Cooking (cooking site)
    9. Cooking games (not a cooking site – this is in fact how to learn cooking in the online game Warcraft)

    They do fluctuate, but this sort of pattern has been going on for three months now. If rubbish sites like games sites are coming up in the top 10 results, then is it any wonder that people start to think their own sites may have been penalised and ask for them to be reconsidered?

    In a lot of cases asking for a site to be reconsidered is an act of desperation. From all I see and read there are thousands of sites out there adversely affected by Panda and reconsideration is about the only way to communicate with Google. People are suffering out there, a bit like being punished for a crime but you do not know what crime you committed.

    Google is pretending to communicate through interviews like this, through Matt Cutts interviews and posts, through Amit Singhal’s posts, etc but they are not actually saying anything. Google has the means to communicate with most webmasters (because most run AdSense) but they choose not to. Until Google talks straight to people about what is going on, and whether this is how it is going to be from now on, they are going to continue to be swamped with requests for reconsideration regardless of how many pointless interviews like this that staff members do!

    • says

      What I find most frustrating is that the core question that David asked about the poor quality of search results does not get addressed. On top of just plain bad results, and Google claiming that their intention is to get rid of bad quality sites, their words ring hollow – especially when they talk about MFA (Made for AdSense) and parked domains – both programs THEY CREATED. (Just look them up if you don’t believe me. I’d add the links but I don’t know the policy of this site on sharing links.)

  2. says

    Funny, my top 10 results for “cooking” after removing personalized search are:

    Food Network
    All Recipes
    Cooking.com (selling cookware)
    Wikipedia entry for cooking
    Epicurious
    cookinglight.com (Cooking Light Magazine)
    Cooks.com (recipes)
    Cooking Games.com
    Williams-Sonoma (cookware)

    Those seem relevant to me. Are you sure you’re not seeing highly personalized results? Also, rankings are a poor indicator of SEO success and have been for a long time.

    • says

      Fran I think you have searched Google.com whereas I was using Google UK.

      The results look better than the UK one but even so how can a cooking games site be in the top results?

      I don’t see how I can be seeing personalised results as I am not signed in and I have my location as the UK.

      I am not looking at this as an indicator of SEO success but as a measure of how Google search is working. Prior to Panda I had been in the top 10 Google results (UK Google) for over four years which was good. I always thought there were other sites that were better than mine so not upset by getting pushed down the results (although page 25 yesterday was a bit far to fall).

      It is the why that concerns me and the lack of straight answers from Google. I was at a meeting with Google people in early June and specifically asked the question as to why my content was good on the 11th April and why it was then regarded as rubbish on 12th April (Panda roll out for UK). They would not answer and closed the meeting. Seemed a reasonable question.

      You can’t repair something if you do not know what is broken and, what with all the numerous SEO people saying contradictory stuff and Google staff doing interviews that really don’t say a lot, where do you start putting things right?

      • says

        All Google search is personalized search, unless you add &pws=0 to the end of the string that’s generated during the original search or clear your cookies and cache before every single search. David, you are trying to outrank 41 million other pages for the word “cooking”, and there are a lot of simple best practices that aren’t being followed on your site. The domain is not related to the content in any way. There are no robots.txt or sitemap.xml files. The link structure is poor. I could go on and on.

        The bottom line is that when I look at your site, I don’t see a quality, well-organized, easy to navigate site- I just see a lot of content that isn’t arranged in a particularly useful or well-organized way. Feel free to contact me if you’d like some suggestions but simple best practices will get you a long way.

        • says

          Thanks for the follow up. I was interested about what you say about Google search. I tried it and got a different batch of results. It all confuses me as I often click on results for my own pages when doing different searches so you would think I would be well up there for “cooking” under personal search instead of drifting further and further out this week.

          I am not particularly bothered about fighting to get to the top of the pile for “cooking” as I accept what you say about it being very competitive and I also accept there are some very good sites out there.

          Your other comments do concern me. I can’t do anything about the domain name. It all came about by accident and to change now would be a nightmare (starting again?) not that there are any good domain names left.

          With regard to robots.txt and sitemap.xml files, I have always understood that there was no need to have these as part of a web site unless there were parts of the site you wanted to exclude the search engines from spidering. I don’t have any such pages, so I have always believed there is no need.

          You say “I don’t see a quality, well-organized, easy to navigate site – I just see a lot of content that isn’t arranged in a particularly useful or well-organized way”. The problem is I don’t see that – probably because I am too close to it and know my way around the site. I look at a lot of other cooking and recipe sites and often think how much more complex they are.

          I would happily change the whole of my site if I thought it would enable me to recover from Panda but at the moment I don’t believe anyone outside Google knows what to do for the best (does anyone in Google?) and there is far too much conflicting advice. I will happily make changes to the site if I believe they will improve the site but I have no idea what improvements I could make . . . otherwise I would already have made them.

          I was invited to a Google meeting at the start of June and when I asked why those of us attending had been invited I was told because they believed we had worthwhile sites that Google could work with! Perhaps they were just buttering me up but then why go to all that trouble when they could have just not bothered to invite me.

          I do know of a UK cooking and recipe web site that has fully recovered from Panda without doing anything . . . I just don’t know why!

          All I want to do is get on with the business of publishing new material on the site which is what I enjoy doing. As long as the site brings in enough to live off then I am happy. At the moment I am wasting countless hours trying to solve something a lot of people, cleverer than me, can’t solve.

          At present any SEO expert who could guarantee a site recovering from Panda could write their own cheque.

          • says

            The Twitter version of my response is that Google isn’t just going to come out and say “here’s how to rank pages”; that’s totally unrealistic. David, you also had a very insightful observation about being too close to the site- make sure you consider how others will use it as you build.

            There is no “recovering from Panda”. We all gotta move past it!

  3. says

    My top 10 on .de for “cooking” in a clean, cookie-free ;) SERP

    1. Wikipedia
    2. Cooking.com
    3. Cooking.com
    4. Cookingclub.de
    5. Cooking Games
    6. t-mobile
    7. Amazon.de
    8. Top Cooking Games
    9. Ehow.com
    10. Chenscooking.de

    ;)

  4. says

    I have a top 10 list like this.

    1. Wikipedia
    2. Cooking.com
    3. Cooking.com
    4. Cookingclub.de
    5. Cooking Games
    6. t-mobile
    7. Amazon.de
    8. Top Cooking Games
    9. Ehow.com
    10. Chenscooking.de

    This is relevant too.

    Thanks

  5. Rob says

    Ok people it’s not about cooking. The point is that Matt Cutts tweeted and recommended this interview for his followers (every webmaster in the world)

    This is not how professional companies run. If information in this interview is relevant then put it on Google.com, their own website.

    When they release a new algorithm that impacts more than 12% of searches and ruins thousands of people’s businesses, act responsibly and explain what happened RIGHT AWAY!

    Matt Cutts hasn’t said more than a few sentences about the Panda algorithm in the 6 months since it was released. This is the same guy who releases a new webmaster video every week and in general just won’t shut up! Hey Matt, how about saying something useful? Are you afraid you will be fired? Who gave you the orders to shut up about Panda? Whoever issued the orders is evil, and whoever follows the orders so that they don’t get fired is just as evil!

  6. says

    The Panda update only affected sites which were involved with some grey-area marketing techniques. If you were following a solid strategy all along you would not have seen any negative changes in rankings.
    Always think long term when doing your SEO and link building. If your businesses (websites) are built on solid foundations they will keep standing.

      • says

        David,

        I am sorry your site got affected by the panda update.

        But once again this interview is about the panda update and not your cooking site.

        All that I am saying is that efforts when building a website should be to build a long standing business and adding value to your visitors/readers.

        If you want long term results you will not go outside the rules and get low value spam links or just republish other people’s content on your site. You will invest time and money in it to ensure that your site is unique.

    • says

      Hi Werner,

      I’m sorry to be argumentative, but to state as fact that the “Panda update only affected sites which were involved with some grey-area marketing techniques” is unprovable and not what my analysis of the hardest hit sites shows.

      The Panda update targeted large numbers of quality sites that are direct competitors to Google offerings. I encourage you and everyone else to review the data and screen captures I’ve linked to this comment, and pay special attention to the text in red regarding Google Product Search intentionally hiding products when I search for the money keyword phrases at a specific store that sends its entire feed to Google, but showing those products when I use a more generic phrase that does not convert nearly as well.

      We should all be very concerned at how Google has intentionally set out to reduce our freedom of choice and take away converting traffic from small businesses and hand it to their big brand buddies. This is no secret – their own CEO announced that was their intent and every update makes good on his promise. To see that visit the SEOBook post by searching for the now infamous “Internet is a cesspool” comment indicating that “favoring big brands is how you clean it up”.

  7. says

    It’s really not about cooking or grey marketing. Yes, Panda specifically targeted content farms and RSS scrapers, but it also affected good quality sites. There have been fixes such as separating the domain into subdomains, but what is extremely irritating is the generic “quality” criteria. It’s totally subjective, and thus far, other than giving us subjective questionnaires to follow, there’s been no direction whatsoever from Google.
    If SEO was already somewhat smoke and mirrors to most people, this doesn’t help the industry at all. Try explaining Panda to a small business owner: ‘uh, your site has low quality content, according to Google’.
    Then explain who from Google knows more about window blinds or concrete trucks or jiu jitsu than that business owner, to qualify his site as poor quality.

    Forgive the poor grammar as I’m typing on my phone, and this is more of a rant anyway. I hate not having a solid ‘to-do’.

  8. says

    Thanks for this good article and good advice :)

    I was shocked by this point:

    “I don’t recommend sending multiple reconsideration requests in a very short period of time or submitting reconsideration requests for tons of sites all at once rather than one site at a time. We can take that as a sign of bad faith. But if you haven’t received a follow up message saying that your request has been processed after 2-3 months, it would be reasonable to submit another request at that point.”

    I thought that if I sent more reconsideration requests I’d get a quicker response.

    Thanks again
    Regards

    • Eric Enge says

      mn9or – definitely not a good idea to send in multiple requests, as it runs the risk of annoying the reviewer! Tiffany’s advice on this point is spot on.

  9. says

    I’ve had a successful reconsideration request, but it took 8 months to complete the process. Now I am happy that I am back.

  10. says

    Interesting, but I also smell a bit of PR rat there, and I mean Public Relations. The popular view is that Google is arbitrary and callous in its actions and false in its appeals processes – a few nice blog posts won’t change that.

  11. says

    I don’t think that she has given much beyond what we already know. Sounds more like an insurance salesman to me. I am wondering aloud here, but how much revenue have they lost with this update, as many of these sites ran AdSense?

  12. says

    I have not recovered more than a year after fixing the problem that caused my Panda penalty. I have written an article detailing why my site was penalized, what I did in a vain attempt to recover, and why other sites that I know about were hit. I have also detailed how I believe the Panda algorithm works, based on how it has affected my site and others that I know about. I have not seen this information anywhere else: http://goo.gl/b3bv5

    • Eric Enge says

      Hi Bill – I think there are some points of confusion in your post. The most important one is that, given the time between when you implemented your fixes and the current date, it really means that you have not successfully fixed the problems that triggered the Panda issue in the first place. I.e., you have not identified the problem (or all of the problems).

      Another point is that there is no published information on a specific Panda set of calculations and how that formula might work, and it is most certainly not used as an adjustment to PageRank – PageRank has little direct bearing on ranking today. I’d urge you to try and figure out what other problems Google (Panda) may be having with your site.

      • says

        Well, Eric, I have looked at all of the information that I can find on the Panda penalty, so I am certain that it was the IRS articles that caused my penalty. Matt Cutts even cited the IRS as being one of the trusted sites used by the Panda algorithm, so I am fairly certain that I was penalized for the IRS pages. Even if there were some other problem with my site, I would expect at least some recovery, because I deleted all of the IRS pages. Furthermore, Amit Singhal even said, in his article about what websites can do to recover from the Panda penalty, that if they make the appropriate changes, then they will “eventually recover”. This indicates that the Panda penalty is implemented for a specific amount of time. On the other hand, if you disagree with this, then what is your evidence that it is not implemented in this way?
        Now, I believe that Google is changing this policy. Matt Cutts has just published a video on July 9, 2012 stating that a website can recover 100% if the appropriate changes are made. This is the first time that I have seen Matt Cutts saying that a 100% recovery is possible, so I believe they are changing this policy, since any new sites that violate the Panda guidelines will immediately suffer the penalty, whereas previously, the websites that were initially penalized had been violating the guidelines for years, in some cases. Here is the portion of the video where Matt Cutts says this: http://www.youtube.com/watch?v=8IzUuhTyvJk&feature=player_detailpage#t=952s, so hopefully, I will recover with the next running of the Panda algorithm.
        Regarding how the Panda algorithm is calculated, it is true that Google has never published how the algorithm works, but I am fairly certain that it works as I have said, because although my website has lost ranking compared to other websites, the ranking of my pages within my own site has remained the same. In other words, my top ranking pages before the penalty were the same top ranking pages after the penalty. You say that “PageRank has little direct bearing on ranking today.” I don’t recall ever reading that anywhere else, but I am sure that Google still uses some kind of score to rank the pages, and I am certain that there is a PageRank score — it needs some kind of score to quickly display pages in a calculated order. My point is that whatever the score is, if the website is flagged by the Panda algorithm, then the score for each page will be reduced by a constant fractional amount. This explains why the relative ranking of pages within a website remains the same, even though it loses ranking compared to other sites. I think I provide pretty good evidence for what I say in my article, so I will leave it to your readers to judge for themselves: http://goo.gl/b3bv5