frankleeceo - 10:25 pm on Mar 7, 2013 (gmt 0)
I think if we really look at the long-term picture, Google is in the process of "scaling" to the web. They are testing how to "explode" or "focus" on the sites that are bad, so bad that they can get a consistent sample of these "bad" sites. The intention is not to bury the bad samples, but rather to let them float to the surface and let the crowd or userbase teach Google. But because it is so cheap and fast to create sites, there is a constant flux of bad results.
Many sites may and do get reshuffled based on some "random" or hidden metrics. I think signals are no longer signals; they are more a mixed goo of signals with loops and tons of if-then statements. For example: if you are an informational site, your logo should be X size and you should have comments enabled, and if you don't, your site is "bad". It can be something completely random, based purely on "machine observations".
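To picture what that kind of if-then signal soup could look like, here is a toy sketch. Every signal name, rule, and threshold below is invented for illustration; this is not Google's algorithm, just the shape of the idea that arbitrary layered rules can flip a site to "bad":

```python
# Toy illustration of an "if-then signal soup" site classifier.
# All signals, thresholds, and rules are made up for this example.

def classify_site(signals: dict) -> str:
    score = 0
    # Arbitrary, opaque rules layered on top of each other:
    if signals.get("type") == "informational":
        if not signals.get("comments_enabled", False):
            score -= 1  # informational sites "should" have comments
        if signals.get("logo_width_px", 0) < 100:
            score -= 1  # logo is not the "expected" size
    if signals.get("resembles_bad_sample", False):
        score -= 3      # looks like a known "bad" sample: heavy penalty
    if signals.get("user_engagement", 0.0) > 0.5:
        score += 2      # the crowd seems to like it
    return "bad" if score < 0 else "ok"

site = {"type": "informational", "comments_enabled": False,
        "logo_width_px": 80, "user_engagement": 0.3}
print(classify_site(site))  # -> bad
```

The point of the sketch is that a site can end up "bad" for reasons that have nothing to do with its actual content, just from accumulating a few machine-observed demerits.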
I foresee more shuffling and pain in the future. Many sites will die, and only a few will flourish. The algorithm is certainly going in that direction. If your site somehow exhibits signals, accidentally or intentionally, similar to those of other "bad" samples, your site will die. And no one can help you.
Feels like the Terminator is gonna take over the world, but in this case it is Google's machine learning capabilities. One day "Google" is going to scrape all the knowledge in the world, and we will all need just one single website. I look forward to the day we can order pizza via Google. (being sarcastic)
"1 large pizza, 3 toppings, cheapest price, under 15 minute". "I am feeling lucky"
Google: "Please give me your credit card."
Done ordering, pizza gets delivered. Google WILL decide which pizza you like the most based on your browsing behavior and what pizza you recommended via G+.
Result: all pizza websites dead. Only one pizza left. Hell, Google should become a pizza restaurant too. They can cache one in their databases and use unmanned vehicles for delivery. No tipping necessary. More win.