rlange - 1:49 pm on Aug 29, 2011 (gmt 0) [edited by: rlange at 2:36 pm (utc) on Aug 29, 2011]
This is quite worrying. Google doesn't have the mindpower to deal effectively with scraping, so now it is, in effect, socialising the problem by getting the public and users to submit the details of scrapers.
Google's never really had the "mindpower" to properly rank pages and sites on its own. Wasn't one of the things that made Google "better" back in the day the fact that they used backlinks as a major ranking factor? That's just another form of "socializing" a more general problem.
It is a positive development in that it will solve a percentage of the problem; however, until Google manages to automate the process of detection, analysis, and removal, it is still going to have a massive problem.
That's the point of this form. They're not feeding the submissions into an algorithm. They're simply using them to build a data set large enough to analyze, then using the results of that analysis to modify the existing algorithm(s).
Edit: Actually, that's not quite correct. It's pretty clear (right there in the OP, heh) that they already have changes to the algorithm and they're looking for user-submitted examples to test those changes against. It's too early...