Matt Cutts tweeted: A pretty good overview of the stages of Penguin recovery:
Bring out the machete.
Sounds violent, right? It is, sort of. Let me explain.
Once your site has suffered the damage of unscrupulous SEO practices, it's in serious need of emergency services. It's time to scrutinize your link profile and do your best to remove any links that could be causing your site to be affected by Penguin 2.0. Here's how the process works, described simply:
(1) The file you upload should be a plain text file, with no markup or special formatting. People often upload Word docs, Excel spreadsheets, etc. Just upload a text file.
(2) Typically, users' first attempts are very specific and fine-tuned, listing individual URLs. Instead, use the domain: operator and disavow the whole site; that is often better. See our machete story.
(3) Wrong syntax is another common issue; use the correct syntax.
(4) Do not write the story of why you are disavowing inside the disavow text file. Put that in the reconsideration request, not in the text file.
(5) Along the same lines, people often use comment markers in the file. Don't add comments, many or any; they increase the chance of errors in Google's parser.
(6) The disavow tool is not the be-all and end-all. It will not cure all your URLs. Clean up your links outside of the disavow tool as well; don't rely on this route alone.
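The file-format tips above can be sketched as a quick pre-upload check. This is a hypothetical helper, not an official Google validator; the rules it enforces (bare URLs or `domain:` lines only, no comments) are assumptions drawn from this thread.

```python
# Hypothetical pre-upload check for a disavow file, based on the
# tips in this thread (plain text, domain: or URL lines, no comments).
import re

DOMAIN_RE = re.compile(r"^domain:[a-z0-9.-]+$", re.IGNORECASE)
URL_RE = re.compile(r"^https?://\S+$")

def check_disavow_lines(lines):
    """Return a list of (line_number, warning) tuples."""
    warnings = []
    for i, raw in enumerate(lines, start=1):
        line = raw.strip()
        if not line:
            continue  # blank lines are harmless
        if line.startswith("#"):
            # Tips 4-5: keep commentary out of the file itself.
            warnings.append((i, "comment found; explain in the reconsideration request instead"))
        elif DOMAIN_RE.match(line):
            continue  # tip 2: domain-wide disavow, often the better choice
        elif URL_RE.match(line):
            continue  # individual URL; valid, but consider domain: instead
        else:
            # Tip 3: anything else is a syntax error the parser may reject.
            warnings.append((i, "unrecognized syntax: %r" % line))
    return warnings
```

For example, a file containing `domain:spam.example`, a `# comment` line, and a stray non-URL line would come back with warnings on the latter two.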
Kaspar Szymanski, a member of the Google web spam team said on Google+:
As I'm involved in the reconsideration request process, let me tell you this is a very important video to watch for any webmaster who has experienced spammy-backlink issues. [seroundtable.com...]
Several webmasters posted their recovery experiences in the discussions that followed on Kaspar Szymanski's page.
@Whitey - Iceman88: Saw improvements within 2 weeks.
After 4 weeks we noticed traffic starting to return to previous levels, until penguin 2.0 hit of course. [webmasterworld.com...]
For those who reported no recovery in this thread [webmasterworld.com...], it would be interesting to hear with some accuracy what they did and didn't do.
Webmasters and site owners can recover from Penguin 1.0 and 2.0, and I'm looking forward to seeing more recovery success stories shortly.
I successfully did a recon last year.
1st attempt - with a scalpel.
2nd attempt - used a machete.
[plus.google.com...] If you have had a manual penalty (an unnatural-link warning), you need to file a recon request to have your site removed from the manual penalty.
If your site has not been affected by a manual penalty, then you do not need to file a recon request, because this is an algorithmic penalty.
If you have cleaned up your site, removed links, and added the ones you could not remove to the disavow tool, then the site will recover algorithmically on the next refresh of the algo. [plus.google.com...]
There are plenty of others, if you dig around, that correlate with the above inputs. I've personally seen examples of Penguin 1.0 recoveries that took longer due to the amount of work involved. It's too early for Penguin 2.0 recovery reports, but the directives are clear enough.
This is a watershed moment for Google's communication with webmasters, breaking with its previous policy of not commenting on "penalties". There are, of course, many whose income depends on their site's visibility. It deserves special thanks and delivers hope to those affected.
Hopefully this will serve as a communication benchmark for the future.
DS: We've had a back-and-forth on links. I've characterized them as the fossil fuel of signals. I thought my head was going to explode with the disavow links tool; I got a link removal request on Tuesday for a link on Search Engine Land.
Why donít you just disavow all the bad links yourself?
MC: An SEO asked me why he had to clean up all these messy links that someone else built. I told him it's like a one-time market correction. Everyone should look up the rant Danny did at SMX Advanced last year about why you should want quality links, not easy links.
We are going through a transition, but we're moving to a healthier world where it gets harder to spam every year. The disavow tool is there to help people clean up that mess when they can't get bad links removed.
DS: Now links have been devalued.
MC: It's definitely the case that now, compared to 6-7 years ago, fewer people think they have to buy links to succeed. It costs effort to make a great site. That's always been the intent.
I've had black-hat SEOs write me and say they can't do it anymore because it's not a sustainable income. Our guidelines have always been fairly steady: make a great site so that people want to link to it. Now we're bringing tools to bear that make that real.
I guess the transition is also a bridge to communicate compliant intentions back to Google, i.e. an improved level of "trust" between webmaster/site owner and Google.
Martin Macdonald @searchmartin
any plans to allow wildcard disavows? Along the lines of robots exclusion would be great..
Matt Cutts @mattcutts
@searchmartin not that I know of. Most people need to use the domain: operator more though.