Forum Moderators: Robert Charlton & goodroi


Sitemaps reports robots.txt timeout


andrewshim

10:34 pm on Aug 13, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've been getting this "URL timeout : robots.txt timeout" error periodically for a couple of weeks now. I wonder what is causing it.

User-agent: googlebot
Disallow: /rss/

User-agent: Googlebot-Mobile
Disallow: /rss/

My robots.txt filename is all lower-case. Should the contents also be all lower-case?

Is it caused by a server setting that can be changed?

Does it have anything to do with my web host?

Does the length of my sitemaps file have anything to do with it?

tedster

2:57 am on Aug 15, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Some people have reported this error as a bug on Google's end, and it cleared up without them doing anything. Can you access the robots.txt file regularly on your own, with no timeout?
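One way to check this yourself is to request robots.txt with a crawler-like timeout and see whether it comes back promptly. Below is a minimal sketch; it spins up a throwaway local server standing in for your own site (substitute your real robots.txt URL when testing for real), and the 5-second timeout is an arbitrary choice, not Google's actual limit.

```python
# Sketch: does robots.txt answer within a crawler-like timeout?
# The local test server here is a stand-in for your own site.
import http.server
import threading
import urllib.request

ROBOTS = b"User-agent: googlebot\nDisallow: /rss/\n"

class RobotsHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/robots.txt":
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(ROBOTS)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the demo quiet
        pass

def fetch_robots(url, timeout=10):
    """Return (status, body); raises URLError/timeout if the server stalls."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.status, resp.read()

# Serve on an ephemeral port in a background thread.
server = http.server.HTTPServer(("127.0.0.1", 0), RobotsHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/robots.txt"
status, body = fetch_robots(url, timeout=5)
print(status)  # 200 when the file is returned promptly
server.shutdown()
```

If your own robots.txt URL ever raises a timeout here, the problem is at the server or host level, not in the file's contents.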

andrewshim

3:46 am on Aug 15, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yep. No problems at all.

tedster

4:02 am on Aug 15, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, a timeout error has nothing to do with the contents of the file - it means the file isn't even being returned in response to Googlebot's request.