Welcome to WebmasterWorld
Three major metrics - pages crawled per day, kilobytes per day, and time spent downloading a page - shown as a four-month rolling average. Nice little graphs.
Not seen any discussion of this.
Anyway, time to download a page for site "A", in milliseconds:
Ditto for site "B":
As a past chairman of the UK Computer Measurement Group, I discount the max numbers because they're limiting factors. But the ave and min numbers interest me: site A quite clearly outperforms site B in terms of currency in the index and cache, and also in search positions for medium-frequency keywords.
Site A's server is clearly 3.2 to 3.9 times more responsive than site B's.
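The ratio is just one site's download time divided by the other's. The actual figures from the graphs aren't reproduced above, so the values below are hypothetical placeholders chosen only to illustrate the arithmetic behind a "3.2 to 3.9 times" range:

```python
# Hypothetical "time to download a page" values in milliseconds;
# the real figures would come from the Webmaster Tools crawl stats graphs.
site_a = {"avg": 120, "min": 80}
site_b = {"avg": 468, "min": 256}

# How many times slower site B is than site A, on average and at best.
ratio_avg = site_b["avg"] / site_a["avg"]
ratio_min = site_b["min"] / site_a["min"]

print(f"B is {ratio_min:.1f}x to {ratio_avg:.1f}x slower than A")
```

With these placeholder numbers the comparison prints "B is 3.2x to 3.9x slower than A"; plugging in your own avg/min values gives the equivalent range for your sites.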
Thoughts? Is it possible that Google is presenting faster sites further up the SERPs, all else being equal?
After I fixed the problem, I set the "crawl speed" to the faster option.
All this was a couple of weeks ago. Now the graphs show a very high crawl rate with a very low "time to download a page".
This week, my traffic has increased drastically, to the point my server can't handle it. It very well could be that it's now picking up pages it hasn't before; in fact, the site: command shows more pages.
I would attribute the increase in traffic to this, except it's not proportional. The increase in traffic is nowhere near proportional to the number of new pages indexed.
But clearly a very buggy server response would mess with crawling itself, if not ranking....
Needless to say... a slow server response or slow page loads will kill your traffic and sales (whatever they are)...
If you can combine 99.9% uptime minimum with reliable 1-3 second page loads... this is probably an acceptable range for good reliable bot crawls and usability...
Remember... if people are finding you via Google... the very first thing they get is lightning-fast responses... if they click on your link and it takes 10 seconds to load... "Houston, we have a problem!"
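If you want to see where your own pages sit in that 1-3 second range, a quick timing loop is enough. This is only a rough sketch using the standard library (the URL below is a placeholder): it measures wall-clock time for a full GET, which is closer to what a bot sees than what a browser renders:

```python
import time
import urllib.request

def time_page_load(url, attempts=3):
    """Fetch url a few times; return (fastest, average) load time in seconds."""
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url, timeout=10) as resp:
            resp.read()  # read the full body, like a crawler would
        timings.append(time.perf_counter() - start)
    return min(timings), sum(timings) / len(timings)

# Example with a placeholder URL:
# fastest, average = time_page_load("https://www.example.com/")
# print(f"fastest {fastest:.2f}s, average {average:.2f}s")
```

Run it a few times at different hours of the day; the min/avg spread tells you whether slow loads are constant or load-dependent.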