claaarky - 6:57 pm on Jan 13, 2013 (gmt 0)
TMS, just to address your question about the size of the web and discovery: as you know, Google tests the pages of new sites with new traffic all the time. It knows about these pages through links etc., so it throws a little bit of traffic at a new site, sees how people respond, and determines relative quality from that. If the test suggests the new site is good, more traffic is sent and tested, and so on.
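For what it's worth, here's a rough sketch of the kind of feedback loop I'm describing. This is purely my own illustration - the function names, thresholds and growth factor are assumptions of mine, not anything Google has confirmed:

import random

# Illustrative sketch of the traffic-testing loop described above: send a
# small trickle of visitors to a new site, observe how they respond, and
# scale traffic up while the response stays positive. All numbers and
# names here are hypothetical.

def simulate_visit(true_quality):
    """Simulate one visitor's reaction: True = positive signal
    (e.g. long dwell time, no quick bounce back to the results page)."""
    return random.random() < true_quality

def test_new_site(true_quality, initial_visits=10, rounds=6,
                  good_threshold=0.5, growth=2.0):
    """Allocate traffic in rounds; grow the allocation while the
    observed positive rate stays above the threshold, else stop."""
    visits = initial_visits
    for r in range(1, rounds + 1):
        positives = sum(simulate_visit(true_quality) for _ in range(visits))
        rate = positives / visits
        print(f"round {r}: {visits} visits, positive rate {rate:.2f}")
        if rate >= good_threshold:
            # Response looks good: test again with more traffic.
            visits = int(visits * growth)
        else:
            print("response too weak, holding traffic back")
            break

if __name__ == "__main__":
    random.seed(42)
    test_new_site(true_quality=0.7)  # a genuinely decent site

The point is that the search engine never has to "understand" the site directly - it just has to measure how real visitors react at each step and adjust the next allocation accordingly.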
I'm open-minded to other ideas, such as user metrics coming from another source, but I don't buy the argument that it's possible to programmatically replicate the million decisions a human makes in a second about a site. Even if a third of those stats are flawed, I think it's unlikely we could generate anything even close by crawling a site and making decisions that way.
In my mind, it's not a question of whether it's perfect; it's a question of what better options realistically exist. Google is very keen to have everyone using Chrome (on mobile as well) - I think that's significant.