

Is server response time a factor?

One site three times slower



4:33 pm on Nov 9, 2006 (gmt 0)

10+ Year Member

I have a number of sites with Google XML-format sitemaps verified at root level. I noticed a short while ago that some have the "Crawl Rate" button enabled on the Diagnostic page. Well, show me a button and I'll click on it.

Three major metrics - pages crawled per day, kilobytes per day, time spent downloading a page. Four month rolling average. Nice little graphs.

Not seen any discussion of this.

Anyway time to download a page - site "A" - milliseconds:
Max 2326
Ave 113
Min 23

Ditto site "B"
Max 2473
Ave 372
Min 90

As a past chairman of the UK Computer Measurement Group, I discount the max numbers because they're outliers. But the ave and min numbers interest me, because site A quite clearly outperforms site B in terms of currency in the index and cache, and also in search positions for medium-frequency keywords.

Site A's server is clearly some 3.3 to 3.9 times more responsive than site B's.
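For what it's worth, that range is just the ratio of the two sites' ave and min figures; a quick check in Python (using the numbers quoted above):

```python
# Time-to-download figures (milliseconds) quoted for the two sites.
site_a = {"max": 2326, "ave": 113, "min": 23}
site_b = {"max": 2473, "ave": 372, "min": 90}

ratio_ave = site_b["ave"] / site_a["ave"]  # ratio of average download times
ratio_min = site_b["min"] / site_a["min"]  # ratio of best-case download times

print(f"Site B is {ratio_ave:.1f}x slower on average, {ratio_min:.1f}x slower at best")
```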

Thoughts? Is it possible that Google is presenting faster sites further up the SERPs, all else being equal?


4:56 am on Nov 11, 2006 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member

I've heard that Google does keep response time records. If the "rumor" is true and they use that data in the ranking algo at all, it would be a very minor factor tucked away in a trust calculation, I think. But clearly a very buggy server response would mess with crawling itself, if not ranking.


5:00 am on Nov 11, 2006 (gmt 0)

5+ Year Member

But if they used server response times, that would put sites hosted outside the US at a huge disadvantage.


6:43 am on Nov 11, 2006 (gmt 0)

5+ Year Member

Not if Googlebot crawls from within the country.

Jordo needs a drink

6:45 am on Nov 11, 2006 (gmt 0)

5+ Year Member

When they added the "crawl rate" feature to webmaster tools, I noticed on the graphs that my "crawl rate" dipped very low while the "time to download a page" increased drastically starting in October. It actually alerted me to a problem that I fixed.

After I fixed the problem, I set the "crawl speed" to the faster option.

All this was a couple of weeks ago. Now the graphs show a very high crawl rate with a very low "time to download a page".

This week, my traffic has increased drastically, to the point my server can't handle it. It very well could be that now it's picking up pages that it hasn't before, in fact, the site command shows more pages.

I would attribute the increase in traffic to this, except it's not proportional. The increase in traffic is nowhere near proportional to the number of new pages indexed.
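If you want to spot this kind of "time to download a page" spike before the webmaster tools graphs do, a minimal sketch using only the Python standard library (the URL shown is a placeholder):

```python
import time
import urllib.request

def time_fetch(url, attempts=5):
    """Fetch a URL several times; return (min, average) download time in ms."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # time the full body download, not just the headers
        samples.append((time.perf_counter() - start) * 1000)
    return min(samples), sum(samples) / len(samples)

# Hypothetical usage:
# fastest, average = time_fetch("http://www.example.com/")
```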

Jordo needs a drink

6:51 am on Nov 11, 2006 (gmt 0)

5+ Year Member

And thinking...

I guess it would make sense for Google to include retrieval times in their algos.

They wouldn't want their users clicking their search results links and waiting forever to get them. It could be considered the same as providing bad results.


8:53 am on Nov 11, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

But clearly a very buggy server response would mess with crawling itself, if not ranking....

Needless to say... a slow server response or slow page loads will kill your traffic and sales (whatever they are)...

If you can combine a 99.9% uptime minimum with reliable 1-3 second page loads... this is probably an acceptable range for good, reliable bot crawls and usability...
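As a rough sanity check on what a "99.9% uptime minimum" actually permits (assuming a 30-day month for illustration):

```python
# Downtime budget implied by 99.9% uptime over a 30-day month.
minutes_per_month = 30 * 24 * 60               # 43,200 minutes
allowed_downtime = minutes_per_month * (1 - 0.999)
print(f"{allowed_downtime:.0f} minutes of downtime per month")
```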

Remember... if people are finding you via Google, the very first thing they get is lightning-fast responses... if they then click on your link and it takes 10 seconds to load... "Houston, we have a problem!"


3:34 pm on Nov 12, 2006 (gmt 0)

10+ Year Member

I wasn't thinking about human perceptions, though that's been covered this week as well.



