incrediBILL - 7:37 pm on Aug 9, 2010 (gmt 0)
The REAL ISSUE with scraper sites is sites that are 90+ days old and still outrank you, or are even in the engine at all for that matter!
No, the real issue is the technology exists to stop most scrapers and instead of installing something that will solve the problem, people just keep getting scraped and complain about it.
Google can only do so much, and Google isn't the only search engine in town; the same problems exist in Yahoo and Bing (soon to be the same).
At some point webmasters have to protect themselves, and claiming they "don't care about scrapers" yet "care scrapers outrank them" is idiotic, because technically they CARE ABOUT SCRAPERS!
We used to have 302 hijackings back in 2006, and we raised hell about validating spiders until it culminated with Dan Thies and myself raking Google over the coals in public at SES '06 in San Jose to get them to agree to give us the tools to fix the problem. Both Microsoft and Ask jumped in at that session and promised to do it as well, and they all did, and it became an easy fix.
However, webmasters still didn't install the simple fix of validating spiders, which stopped 302 hijackings; they just kept complaining.
If Google gave everyone the technology to make sure scrapers didn't outrank their content tomorrow, based on past history, my suspicion is most webmasters still wouldn't install it.
Many in the industry, including myself, don't mind fighting the fight to get things fixed but when the fight is over and people ignore the solution, why bother?