Ralph_Slate - 1:01 am on Aug 22, 2012 (gmt 0)
At the scale of today's (and tomorrow's) web, it looks to me like algorithms are a necessity. Penguin is an early attempt at a new kind of algorithm (the ground was first broken by Panda) and it's only a beginning, I'm sure.
Yes, they need algorithms - but they also need people to step in where the algorithms aren't working properly. Google has profits in the BILLIONS. They could afford to staff a team that, upon request, reviews the decisions their algorithms have made to see whether they make sense.
There is no way a generic algorithm can determine which page, among several, is the "best". In the past this didn't matter as much, because Google showed all of the results and users ultimately picked the ones they liked, by both clicking on them and linking to them. But Google is moving toward suppressing pages when it decides that several present "essentially the same information", and serving only the "best" one. In other words, instead of degrees of winning, there will be a single winner, and everyone else loses.
Read Matt Cutts' "frog" example: [stonetemple.com]. Cutts puts it this way: "While they’re not duplicates they bring nothing new to the table. It’s not that there’s anything wrong with what these people have done, but they should not expect this type of content to rank."
Google thinks its generic algorithms can determine the nuanced differences in content between pages. They can't.
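Google's actual dedup logic isn't public, but here's a toy sketch of why this is hard: a near-duplicate check in Python using word-shingle Jaccard similarity. Everything in it is invented for illustration - the sample "frog" pages, the shingle size, and the threshold.

# A deliberately crude near-duplicate check using word-shingle Jaccard
# similarity. This is NOT Google's method (which isn't public) - just an
# illustration of how a single similarity score collapses nuance. The
# shingle size, threshold, and sample texts are arbitrary.

def shingles(text: str, k: int = 3) -> set:
    """Return the set of k-word shingles from lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity of the two texts' shingle sets."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if (sa | sb) else 0.0

# Two hypothetical frog pages: same boilerplate facts, but the second
# adds a genuinely new detail that a reader might actually want.
page_a = ("Frogs are amphibians. Frogs eat insects. "
          "Frogs live near ponds and lay eggs in water.")
page_b = ("Frogs are amphibians. Frogs eat insects. "
          "Frogs live near ponds and lay eggs in water. "
          "Some frogs secrete toxins used in medical research.")

print(f"similarity: {jaccard(page_a, page_b):.2f}")  # ~0.62
# Above some arbitrary threshold (say 0.6), a filter would call these
# "essentially the same" and show only one of them.

A score like this can tell you that two pages overlap heavily. It cannot tell you whether the part that doesn't overlap is filler or the one fact a reader actually needed - and that's exactly the nuance a generic algorithm misses.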