diberry - 4:10 pm on Jun 12, 2013 (gmt 0)
Great thread, hitchhiker!
I agree with Whitey that there are a lot of indicators that they are trying to be more transparent and open to feedback. It's my belief that attempting to parse the entire web in any meaningful way is beyond the computing capabilities of Google or anybody else these days. So if Google simply can't thoroughly "read" every page and all its ranking factors and make sure it's exactly where it should be, what are they to do?
I think they're resorting to very broad strokes, not because it represents the SERPs they really want, but out of desperation:
--Relying more than ever on "authority/trust" signals (which is probably where the brands are getting their advantage) to separate the wheat from the chaff.
--Penguin 1.0 seemed to just sweep a lot of sites out of Google's way for a year. It looks like with Penguin 2.0 they're trying to be a little more precise - maybe that year gave them time to find more efficient ways of doing some of their data processing?
What puzzles me is why some spammers are still ranking. If Google has worked SO HARD to end spam and we're still finding a lot of it at the top, they might want to reconsider their entire strategy. But the only alternative I can think of is to dump links altogether and focus on user metrics, and I have a feeling they're not able to do that comprehensively enough yet.