tedster - 8:58 pm on Sep 7, 2011 (gmt 0)
Getting back to the opening post - the idea that a major shake-up in the algorithm is headed our way - I think it's clear that something is needed. There are too many cases where really weak pages that don't address the query terms still outrank good-quality, well-backlinked pages.
Only Google can see for sure what's going wrong, and the fact that they're looking for webmaster input about high-ranking scrapers suggests they have at least that area in their sights.
I've been thinking that the spam control area of the algorithm has become quite convoluted, and its side effects are running amok to a degree. It's like the kind of spaghetti code that evolves over years of revision until at some point you've got to redevelop from scratch.
In the early days, Page and Brin thought PageRank couldn't be spammed (imagine that) until Matt Cutts, through his work on the child-safe filter, learned that it was being spammed already. So then, once the adult filter was created, he started working on spam control. By now, it seems to me that the spam controls themselves have become part of the problem - it's all tangled up.
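To see why link-based scoring was spammable in the first place, here's a toy sketch of PageRank's basic power iteration in Python. Everything in it is illustrative - the graph, the 0.85 damping factor, the iteration count - none of it reflects Google's actual implementation. The point is just that a handful of spammer-controlled pages all pointing at one target is enough to inflate that target's score.

    # Toy PageRank power iteration - a sketch, not Google's algorithm.
    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            # Every page gets a base share, plus shares from its inlinks.
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outlinks in links.items():
                if outlinks:
                    share = damping * rank[page] / len(outlinks)
                    for target in outlinks:
                        new_rank[target] += share
                else:
                    # Dangling page: spread its rank evenly across all pages.
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
            rank = new_rank
        return rank

    # Hypothetical graph: "money" earns one honest link, but three
    # spammer-controlled pages (s1..s3) also point at it, inflating
    # its score far above the honest page's.
    graph = {
        "honest": ["money"],
        "money": [],
        "s1": ["money"], "s2": ["money"], "s3": ["money"],
    }
    print(pagerank(graph))

Run it with and without s1..s3 in the graph and you can watch "money" climb - which is essentially the link-farm problem the spam controls were built to counter, layer by layer.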
And so I could see a major update, almost like a new codebase being written. And at that point, we're not in Kansas anymore.