Forum Moderators: Robert Charlton & goodroi
It's been repeatedly mentioned in these forums that the overall site quality score Google allegedly keeps for a site has some effect on rankings, including the notorious "minus 30" penalty.
Several of my sites have now been hit with this penalty, so I have the unfortunate "luxury" of looking at some of the commonalities using my own stats and observations.
Long story short - one common feature of the sites hit by "minus 30" is that they are pretty busy forums, and I delete quite a few articles to keep them free of spam. A couple of months ago I implemented a script that returns a 410 (Gone) HTTP status when a deleted page is requested. I guess 404 (Not Found) is the more common implementation, but my line of thought was that these pages are gone forever and there is no need for crawlers to come back and check on them later. Yahoo Slurp, for example, is notorious for coming back for pages that have been missing for months and sometimes years. Googlebot exhibits similar behavior.
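The idea described above can be sketched as a tiny request handler that answers 410 for deleted threads and 404 otherwise. This is only an illustration, not the poster's actual script; the path names and the `DELETED_PATHS` set are hypothetical, shown here as a minimal WSGI app:

```python
# Hypothetical set of paths for forum threads that were deleted as spam.
DELETED_PATHS = {"/forum/thread-123", "/forum/thread-456"}

def app(environ, start_response):
    """Minimal WSGI app: 410 (Gone) for permanently deleted pages,
    404 (Not Found) for everything else that doesn't exist."""
    path = environ.get("PATH_INFO", "/")
    if path in DELETED_PATHS:
        # 410 signals the page is gone for good, so a well-behaved
        # crawler has no reason to keep re-requesting it.
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This thread was removed permanently."]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found."]
```

Per the HTTP spec, 410 means the resource is gone permanently and intentionally, while 404 makes no claim about whether the absence is temporary, which is why crawlers tend to retry 404s.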
Anyway, I have just noticed that in the Sitemaps (Webmaster Tools) account Google separates 404s from the rest of the HTTP errors. So, with these 410s that I send out, my sites show ridiculous numbers of HTTP errors (in the 5,000 to 10,000 range) and something like two 404 errors.
Do you think this many HTTP errors could lower the quality score for the site and "help" it trip some extra filters on the way to the "minus 30", or cause any other rating degradation for that matter?
I would be astounded if too many 410 URLs alone could generate a minus 30 penalty. But then again, I obviously don't know for certain.
I know that Vanessa Fox confirmed months ago that 410 and 404 were handled identically - at that time, anyway.
Also, I note that we had a similar report about 410 issues:
[webmasterworld.com...]