Elsmarc - 3:44 pm on Sep 29, 2012 (gmt 0)
Remember, in all of this, we are dealing with mathematical algorithms: the same type of thing that has caused the occasional stock market "flash crash" (and played a significant role in the 2007-2008 crash). There are now over 360,000,000 web sites online, of which an estimated 20% to 30% are "active". Google's algorithms are failing to produce the desired results because there are so many web sites, each with lots of pages, that the sheer volume of data is overwhelming what a mathematical algorithm can deal with effectively. Now bring in the "social" site aspects and things really begin to blur.
There is also the issue of Matt Cutts saying Google's algorithm is meant to identify "quality" sites. They think they can do this with a mathematical algorithm, which, when you think about it, is really impossible. It takes a human with knowledge of the site's topics to evaluate (by actually visiting and browsing the site) whether a site is a "quality" site, and even then subjectivity will enter the picture.
In my opinion, Google is hitting the "...can't do that, Dave" wall in its expectations of what its algorithms can do.