
Google News Archive Forum

    
Googlebot behaviour on timeout errors
Our server was having problems while a googlebot crawl
Eugenios · msg:192753 · 3:46 pm on Sep 14, 2004 (gmt 0)

Last week we had a problem with our server. For one or two days it ran at a snail's pace and was almost inaccessible.

Today we found out that almost all the pages of our site are indexed at Google as link only, with no content. Normally that would mean a link waiting to be crawled. But with all (or 99%) of our pages indexed as link only, I suspect that last week Googlebot tried to access our pages and received a timeout error or something similar, so the content was not refreshed or indexed.

Does anybody know Google's policy on timeout errors or inaccessible pages? I don't mean 404 errors, but timeouts or other server responses that indicate the page exists but is inaccessible at that moment.

Thanks in advance,

Enrique

 

WebGuerrilla · msg:192754 · 4:21 pm on Sep 14, 2004 (gmt 0)
G will try several times within a crawl cycle to fetch your pages. If it gave up this time around and only shows links, it will try again next time. As long as the problem is fixed, you will get back in.
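As a side note on signalling this cleanly: the thread isn't about documented Google policy, but the conventional way to tell a crawler "this page exists, the server is just struggling right now" is an HTTP 503 with a Retry-After header, rather than letting requests time out or returning a 404. A minimal sketch of that decision logic (the helper name and flags are illustrative, not any real API):

```python
def status_for_crawler(page_exists: bool, server_overloaded: bool):
    """Pick the HTTP status and headers a front-end could return to a bot.

    Illustrative only: shows the 404 / 503 / 200 distinction discussed
    in the thread, not any documented crawler policy.
    """
    if not page_exists:
        return 404, {}  # genuinely gone: the crawler may drop the URL
    if server_overloaded:
        # Temporary failure: the crawler keeps its old copy and retries
        # later, instead of indexing an empty or error page.
        return 503, {"Retry-After": "3600"}
    return 200, {}  # healthy: serve the page content as usual

print(status_for_crawler(page_exists=True, server_overloaded=True))
```

Running this during an outage scenario prints `(503, {'Retry-After': '3600'})`: the page is kept, and the bot is asked to come back later.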


