...okay, I'll ask again then. Can I quote Adam? I'm not sure everyone read the post ;) if not, please edit it out.
This filtering means, for instance, that if your site has articles in "regular" and "printer" versions and neither set is blocked in robots.txt or via a noindex meta tag, we'll choose one version to list. In the rare cases in which we perceive that duplicate content may be shown with intent to manipulate our rankings and deceive our users, we'll also make appropriate adjustments in the indexing and ranking of the sites involved. However, we prefer to focus on filtering rather than ranking adjustments ... so in the vast majority of cases, the worst thing that'll befall webmasters is to see the "less desired" version of a page shown in our index.
Doesn't this mean the following:
- If there's a duplicate page on your site (and nowhere else on the net), no problem: they'll list one of the URLs and either make the rest supplemental or drop them from the index.
- If the site gets a manual review because of its duplication pattern, and is seen as using duplicates to climb the SERPs with no real content, they apply a manual penalty.
- There are no automatic sitewide penalties for duplicate content within the same domain.
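
For the record, since the quote mentions blocking via robots.txt or a noindex meta tag: if you'd rather pick the winning version yourself instead of letting Google choose, here's roughly what that looks like. (The /print/ directory is just a hypothetical; substitute wherever your printer versions actually live.)

In robots.txt, to keep the printer versions from being crawled at all:

    User-agent: *
    Disallow: /print/

Or, on each printer page itself, to let it be crawled but kept out of the index:

    <meta name="robots" content="noindex">

Either one should make the "regular" version the one that gets listed, rather than leaving the choice to the filter.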