oddsod - 6:05 pm on Jun 8, 2005 (gmt 0) There will always be small time bad guys. An SE that can't protect against the commonest brand is faulty as far as I'm concerned.
Atticus, it's deflecting attention from the source of the problem. Scrapers found a hole and they are exploiting it. People will always exploit holes. As webmasters we should focus on the cause of those holes and concentrate our efforts on the people who can fix them. The more you say that scrapers are bad, the more you entrench yourself in a position of "this is not Google's fault". We don't like scrapers, but they are not bad - they're just a variation on SEO (which almost all webmasters practice). Strictly speaking, your sites probably violate more Google guidelines than the scraper sites do (if you really, really read Google's small print). That's from a purely technical point of view, not a usefulness test.