
Googlebot behaviour on timeout errors

Our server was having problems during a Googlebot crawl

3:46 pm on Sep 14, 2004 (gmt 0)

Junior Member

10+ Year Member

joined:Nov 20, 2003
votes: 0

Last week we had a problem with our server. For a day or two the server ran at a snail's pace and was almost inaccessible.

Today we found out that almost all the pages of our site are indexed at Google as URL-only, with no content. Normally that would mean a link waiting to be indexed. But in this case, having all (or 99%) of our pages indexed as URL-only makes me think that last week Googlebot tried to access our pages and received a timeout error or something similar, so the content was not refreshed or indexed.

Does anybody know Google's policy on timeout errors or inaccessible pages? I don't mean 404 errors, but timeout errors or other server responses that indicate the page exists but is inaccessible at that moment.

Thanks in advance,


4:21 pm on Sep 14, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 26, 2000
votes: 0

G will try several times within a crawl cycle to fetch your pages. If it gave up this time around and only shows URL-only listings, it will try again next time. As long as the problem is fixed, you will get back in.
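For reference, the conventional way to handle this on the server side is to answer crawler requests with HTTP 503 (Service Unavailable) plus a Retry-After header during an outage, rather than timing out or serving an error page, so the crawler keeps the old copy and retries later. A minimal sketch of that decision logic; the `overloaded` flag and the one-hour retry hint are hypothetical inputs that a real server would derive from its own load monitoring:

```python
def crawler_response(overloaded, retry_after_seconds=3600):
    """Choose an HTTP status and headers for an incoming crawler request.

    `overloaded` is a hypothetical health-check result; `retry_after_seconds`
    is an illustrative retry hint, not a value any crawler mandates.
    """
    if overloaded:
        # 503 means "the page exists but is temporarily unavailable":
        # a well-behaved crawler keeps its cached copy and retries later
        # instead of indexing an empty page or treating it as gone (404).
        return 503, {"Retry-After": str(retry_after_seconds)}
    # Normal operation: serve the page as usual.
    return 200, {}
```

A framework handler would translate this tuple into an actual response; the key point is returning 503 (a temporary condition) rather than letting connections hang until they time out.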