Planet13 - 7:40 pm on Aug 9, 2010 (gmt 0)
If Google continues to keep scrapers in their SERPs after they KNOW the content is copied — and let's face it, they DO know it — well, then they suck! There, I said it! :-)
A poster on this board mentioned that Bing's share of the search market has been growing slowly, and he interpreted the data to mean that people are searching on Google first, becoming dissatisfied with the results, and then turning to Bing as a backup.
I am not sure what the best course might be. Legal action would be costly and difficult, so maybe we just need to spam Google into action: I am thinking that all legitimate webmasters should buy up a bunch of cheap keyword-rich domains and scrape as much content as possible. Only when Google sees its user base plunge, along with its AdWords revenue, will it actually change how it ranks duplicate content.