Forum Moderators: Robert Charlton & goodroi
I was wondering if some of you could share your thoughts on how to tell the difference between a site penalty that requires a Google reconsideration request and a filter that may be lifted simply by cleaning up the errors.
A good example is the dust storm Google kicked up about paid links. In the beginning the penalties were all manual, but before long a good portion became automated.
Whether the issues tripped a filter, a manual penalty, or an automated one, addressing them only gives you a better website. And there's no reason to fear submitting a reconsideration request if you've made a good faith effort to fix the problems.
And there's no reason to fear submitting a reconsideration request
The sheer stupidities that led to my issue(s) are hard to believe. :)
Could/would you believe that I accidentally created and got indexed 6 million spammy, thin-content duplicate pages in the month of December because I forgot an "=" in some PHP code?
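For anyone wondering how one missing character can do that much damage: the post doesn't show the actual PHP, but a plausible class of bug is dropping the "=" when building a query string, so every page number yields a distinct malformed URL that the server still answers with near-duplicate content. A minimal Python sketch of that idea (all names here are illustrative, not from the original code):

```python
# Hypothetical sketch: how a one-character slip (a missing "=") in
# URL-building code can spawn huge numbers of "new" thin pages.

def intended_link(product_id, page):
    # Intended URL: /widgets?id=123&page=2
    return "/widgets?id=" + str(product_id) + "&page=" + str(page)

def buggy_link(product_id, page):
    # The "=" after "page" is missing, so every page number produces
    # a distinct malformed URL (e.g. /widgets?id=123&page2). A lenient
    # server may still serve each one, creating one indexable thin
    # page per combination.
    return "/widgets?id=" + str(product_id) + "&page" + str(page)

# Every (product, page) combination becomes a unique crawlable URL;
# scale the ranges up and millions of duplicates follow.
urls = {buggy_link(i, p) for i in range(3) for p in range(4)}
print(len(urls))  # 12 distinct URLs from just 3 products x 4 pages
```

Scaled to a real catalog (say thousands of products times thousands of page numbers), that single missing "=" easily accounts for millions of indexed junk URLs.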
Other similar search terms, and other big terms, held their rankings while this was going on and didn't experience any drops. Has anyone else experienced anything similar after a penalty, or have any ideas on this?