rish3 - 3:40 pm on Apr 30, 2013 (gmt 0)
The solution, in the case of duplication, is Google not indexing/showing duplicates.
I agree with this, but they should also honor DMCA requests when they end up showing duplicates. They turned me down because the offending site was "a proxy". They didn't seem to get the issue at all: the site scraped the content and altered all of the URLs in a way that made it look like it was hosting the site in its entirety.
From my perspective, Google should have more "smarts" on the team that approves DMCA requests. I've had DMCA requests initially declined for similar reasons before. For example, one site put all of the scraped content into a hidden <div>, and the DMCA request was denied because they "couldn't find" any copied content.
Both cases are straightforward enough that a simple tool, similar to Copyscape, would help them make these decisions.
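To show how mechanically simple that check is, here's a rough sketch in Python. This is not Google's actual tooling or Copyscape's method, just an illustration; the library choices (requests, BeautifulSoup) and the example URLs are my own assumptions. The point is that parsing the raw HTML, instead of looking at the rendered page, catches text stashed in a hidden <div> too.

```python
# Rough sketch of a duplicate-content check, NOT Google's actual tooling.
# Assumes the third-party libraries `requests` and `beautifulsoup4`.
import requests
from bs4 import BeautifulSoup

def page_text(url):
    """Fetch a page and return its text content.

    Parsing the raw HTML means text hidden via CSS (e.g. a hidden <div>)
    is still extracted and compared.
    """
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style"]):
        tag.decompose()  # drop scripts/CSS, keep every text node
    return " ".join(soup.get_text(separator=" ").split()).lower()

def shingles(text, n=8):
    """Return the set of n-word shingles, a standard unit for near-dup detection."""
    words = text.split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(original_url, suspect_url):
    """Fraction of the original's shingles that also appear on the suspect page."""
    a = shingles(page_text(original_url))
    b = shingles(page_text(suspect_url))
    return len(a & b) / len(a) if a else 0.0

if __name__ == "__main__":
    # Hypothetical URLs, purely for illustration.
    score = overlap("http://example.com/my-article", "http://scraper.example/copy")
    print(f"{score:.0%} of the original's text appears on the suspect page")
```

A check like this would have flagged both of my cases: the "proxy" site would score near 100% because the copied text is all there regardless of the rewritten URLs, and the hidden-<div> site would score just as high because the text is in the markup even if a reviewer can't see it on screen.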
There's a really big hole right now in the algorithm as well. Scrapers look for sites with good content that are currently suffering from Panda and/or Penguin, then scrape them. They instantly outrank the source and get free traffic with little effort.