| 4:56 am on Nov 11, 2006 (gmt 0)|
I've heard that Google does keep response time records. If the "rumor" is true and they use that data in the ranking algo at all, it would be a very minor factor tucked away in a trust calculation, I think. But clearly a very buggy server response would mess with crawling itself, if not ranking.
| 5:00 am on Nov 11, 2006 (gmt 0)|
But if they used server response times, that would put sites hosted outside the US at a huge disadvantage.
| 6:43 am on Nov 11, 2006 (gmt 0)|
Not if Googlebot is in the country.
|Jordo needs a drink|
| 6:45 am on Nov 11, 2006 (gmt 0)|
When they added the "crawl rate" feature to webmaster tools, I noticed on the graphs that my "crawl rate" dipped very low while the "time to download a page" increased drastically starting in October. It actually alerted me to a problem that I fixed.
After I fixed the problem, I set the "crawl speed" to the faster option.
All this was a couple of weeks ago. Now the graphs show a very high crawl rate with a very low "time to download a page".
This week, my traffic has increased drastically, to the point where my server can't handle it. It could well be that it's now picking up pages it hasn't before; in fact, the site: command shows more pages.
I would attribute the increase in traffic to this, except it's not proportional. The increase in traffic is nowhere near proportional to the number of new pages indexed.
|Jordo needs a drink|
| 6:51 am on Nov 11, 2006 (gmt 0)|
I guess it would make sense for Google to include retrieval times in their algos.
They wouldn't want their users clicking their search results links and waiting forever to get them. It could be considered the same as providing bad results.
| 8:53 am on Nov 11, 2006 (gmt 0)|
|But clearly a very buggy server response would mess with crawling itself, if not ranking.... |
Needless to say...a slow server response or slow page loads will kill your traffic and sales (whatever they are)...
If you can combine 99.9% minimum uptime with reliable 1-3 second page loads...that's probably an acceptable range for good, reliable bot crawls and usability...
Remember...if people are finding you via Google...the very first thing they get is lightning-fast responses...if they click on your link and it takes 10 seconds to load..."Houston, we have a problem!"
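For anyone who wants to sanity-check their own load times against a budget like the one above, a minimal timing sketch (the 3-second threshold is this poster's rule of thumb, not anything Google has published; the URL is a placeholder):

```python
import time
import urllib.request

def time_page_load(url, timeout=10):
    """Fetch a URL and return (elapsed_seconds, status_code).

    Reads the full response body so transfer time is included,
    not just time-to-first-byte.
    """
    start = time.time()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # include body transfer in the measurement
        status = resp.status
    return time.time() - start, status

if __name__ == "__main__":
    # Flag anything slower than the ~3-second budget discussed above
    elapsed, status = time_page_load("http://example.com/")
    warning = " -- too slow!" if elapsed > 3 else ""
    print(f"{status} in {elapsed:.2f}s{warning}")
```

Running it a few times at different hours gives a rough picture of whether your host stays inside that 1-3 second range consistently, which is what matters for both visitors and crawlers.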
| 3:34 pm on Nov 12, 2006 (gmt 0)|
I wasn't thinking about human perceptions, though that's been covered this week as well:
[edited by: Phil_Payne at 3:34 pm (utc) on Nov. 12, 2006]