superclown2 - 7:18 am on Jun 22, 2012 (gmt 0) [edited by: superclown2 at 7:34 am (utc) on Jun 22, 2012]
I guess Google wants to be able to claim their results are improved because there are fewer cases of terrible spammy sites ranking (which I find to be the case). Is this worth losing very good smaller sites and opening the door for negative SEO? So far Google seems to be very happy with the results.
This may look great to a Google engineer, but to someone trying to find information it's a waste of time when the first sites in the listings have the same plain vanilla, superficial data. It's interesting to note, though, that the sites which are ranking are all well established ones, even though the pages in question are fairly new. This suggests Google places more weight on how much it trusts a site than on how valuable or unique the content is.
Trying to build a better website with better content than the competition is therefore of secondary importance. Establishing a brand and trying to rank for everything even remotely connected to it is a more successful strategy at the moment.
Should Google concentrate more on the quality and trust of a page, rather than of a whole website? Perhaps that will come; their machine learning is still at a very early stage.