Whitey - 12:41 am on Jun 22, 2011 (gmt 0)
Seems to me duplication from category pages down to product pages is tipping the threshold, especially on aggregated or machine-driven content. Taxonomy may be playing into this as well.
If this is true, and these early reports suggest it's at least highly plausible, I don't understand the rationale behind eliminating such sites in the name of conquering scrapers. Maybe it's collateral damage, or is it just a case of "let's remove 50% of aggregated content and not discriminate"?
That could be why some of the better sites using aggregated content have been caught while some of the lowest-quality ones have not.