Robert_Charlton - 5:20 am on Apr 9, 2012 (gmt 0)
My initial instinct is that Google has the same kind of historical tracking on review and rating acquisition as it does on backlinks. I feel it's only a matter of time before there's enough data to make a statistical evaluation, which I think is basically how much of the Google algo currently works.
Over the years I've observed companies that have tried to sell fake reviews to clients drop off the map... and I've seen sites that bought the reviews disappear too.
Right now, I'm seeing lots of variation in the results... stuff that looks like crap is in the top 10 for a few days; then it vanishes; then it comes back, sometimes in a different position.
Results before most major changes can look extremely bad. It's been described as letting the junk rise to the surface so they can skim it off. I think that to get a large enough sample of what they're trying to get rid of, Google needs to err on the side of allowing it for a while.
It's a mistake to regard Google as amateurs over the long term. Lots of folks have made money, though, exploiting vulnerabilities for a short time. I gather that's getting harder and harder to do.
And a PS on this one... I'll bet that they're correlating ratings data with social and traffic data and with user behavior, but that the correlations will take time to be collected and to kick in. Just a hunch.