I have a number of sites with Google XML-format sitemaps verified at root level. I noticed a short while ago that some have the "Crawl Rate" button enabled on the Diagnostic page. Well, show me a button and I'll click on it.
Three major metrics: pages crawled per day, kilobytes downloaded per day, and time spent downloading a page. Four-month rolling average. Nice little graphs.
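For anyone curious what a rolling average of that kind looks like mechanically, here's a minimal sketch. The window size and all the numbers below are invented for illustration; this is just the general technique, not anything Google has documented.

```python
# Hypothetical illustration of a rolling (moving) average over
# periodic crawl stats. All figures below are made up.
from collections import deque

def rolling_average(values, window):
    """Yield the mean of the last `window` values at each point."""
    buf = deque(maxlen=window)  # drops the oldest value automatically
    for v in values:
        buf.append(v)
        yield sum(buf) / len(buf)

# e.g. monthly "time to download a page" figures in ms (invented)
monthly_ms = [180, 210, 195, 240, 220, 205]
smoothed = list(rolling_average(monthly_ms, window=4))
```

Each point of the smoothed series is just the mean of the most recent four readings, which is why the graphs look so much flatter than the raw daily figures would.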
I haven't seen any discussion of this.
Anyway, here's time to download a page for site "A", in milliseconds:
Ditto for site "B":
As a past chairman of the UK Computer Measurement Group, I discount the max numbers because they're limiting factors. But the ave and min numbers interest me, because site A quite clearly outperforms site B both in currency in the index and cache and in search positions for medium-frequency keywords.
Site A's server is clearly 3.2 to 3.9 times more responsive than site B's.
Thoughts? Is it possible that Google is presenting faster sites further up the SERPs, all else being equal?