ColourOfSpring - 7:42 pm on Apr 12, 2013 (gmt 0)
Penguin and Panda may also be an attempt to wipe out large numbers of sites deemed untrustworthy, just to whittle the results down to something manageable. But I suspect that if it's humanly possible, Google would prefer the algo to be able to crunch all the data and deliver the most relevant results.
100% agree, diberry. The index is growing bigger every single day. Google's job was a lot easier 5 years ago than it is today. Something had to give. Panda and Penguin ARE broad filters - many, many false positives caught in these nets. Google KNOW that, of course - they're well aware that such filters will catch a lot of the good as well as the bad. But the granular filters that we all want simply don't scale to today's index - impossible to measure from Google's point of view. Even Panda and Penguin have been manual "runs" - not part of the algo (though apparently they've only recently been folded into it). The bigger the index, the broader the filter (or at least, the broader the filter's effects).