Forum Moderators: Robert Charlton & goodroi
Why make us guess whether we have been penalised? Surely it would benefit the SERPs if good sites could be notified when problems arise that affect their rankings? Surely it would benefit Google if webmasters were made aware of duplicate content issues?
Only possible downside I could think of:
1. Black Hats could use it to test individual bad practices to see which Google will/will not notice.
BUT, I can't see that as a real issue. Testing black hat theories one by one would take so long, with so many other variables in play, and with some techniques de-ranking a site on merit rather than tripping an actual filter, that it would not be an effective spam-testing method, would it?
Black Hats could use it to test individual bad practices to see which Google will/will not notice.
..........................
There's another issue here, too. Webmasters tend to use the word "penalty" rather loosely, compared to the way Google uses it. For example, your page can be filtered out because it looks like a duplicate, and not have that be a "duplicate content penalty". It's just that another site with almost the same content outranks you, and the lower ranked url gets filtered out to keep their end users happy.
Why scraped content can outrank the original site that held the top ranking for it for a long time - that's a different question.
One duplicate problem that affected many sites in the past was improper handling of a "custom 404" page. Today, you cannot verify your site in GWT until that's handled - you do get a message.
You may find it better to create a true custom 404 page, rather than just re-using the Home Page. It's better for the visitor, who then clearly knows that the URL didn't work, and there's no risk that a server error would start creating duplicate content from your home page.
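For anyone setting this up on Apache, a minimal sketch of the idea (the filename and domain are illustrative, not from this thread):

```apache
# .htaccess - serve a dedicated error page for missing URLs.
# Use a local path, not a full URL: something like
#   ErrorDocument 404 http://www.example.com/404.html
# makes Apache issue a redirect instead, so the error page can return
# "200 OK" and get indexed as duplicate content under every bad URL.
ErrorDocument 404 /404.html
```

You can then confirm the server is sending a real 404 status, rather than a soft "200 OK", by fetching a nonsense URL with `curl -I http://www.example.com/no-such-page` and checking the status line of the response.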