Hello there, one of my websites has encountered an error:
http://www.abc.com/: Googlebot can't access your site
The detailed message is as follows:
http://www.abc.com/: Googlebot can't access your site October 20, 2012
Over the last 24 hours, Googlebot encountered 1 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 33.3%.
In fact, I uploaded a robots.txt file just yesterday; it was not there before.
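Before digging further, it's worth confirming the file is now being served consistently. A minimal sketch, assuming Python with the `requests` library (the URL is the placeholder domain from the question; swap in your real one):

```python
import requests

# Quick sanity check that robots.txt is now served reliably.
# URL taken from the question; replace with your real domain.
url = "http://www.abc.com/robots.txt"

# Fetch it a few times, since the reported error rate (33.3%)
# suggests intermittent rather than permanent failures.
for attempt in range(1, 4):
    try:
        status = requests.get(url, timeout=10).status_code
    except requests.RequestException as exc:
        status = f"unreachable ({exc})"
    print(f"Attempt {attempt}: {status}")
```

If all three attempts come back 200, the intermittent failures have likely stopped and the warning should clear on its own.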
Out of three tries, they got it twice and couldn't find it once. Not a problem: they're just waiting for three out of three. If they looked repeatedly and could never find a robots.txt, they would assume you really don't have one, and that would be fine too. It's the mixed answers that make Googlebot uneasy.
But not as uneasy as if it met something other than a 404* or a 200; then it wouldn't dare crawl at all. The sketch after the footnote spells out that decision table.
* Has anyone in history ever served up a 410 when asked for robots.txt? :)
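To make the status-code logic concrete, here is a rough sketch of the decision table described above. This is not Googlebot's actual implementation, just the behavior the answer lays out, again assuming Python with `requests` and the question's placeholder domain:

```python
import requests

def crawl_decision(site: str) -> str:
    """Rough sketch of the robots.txt handling described above.

    Decision table: 200 -> obey the file, 404 -> crawl as if no
    file exists, anything else -> postpone crawling entirely.
    """
    try:
        status = requests.get(f"{site}/robots.txt", timeout=10).status_code
    except requests.RequestException:
        return "postpone crawl (couldn't reach robots.txt at all)"
    if status == 200:
        return "crawl, obeying the rules in robots.txt"
    if status == 404:
        return "crawl everything (no robots.txt means no restrictions)"
    return f"postpone crawl (ambiguous status {status})"

print(crawl_decision("http://www.abc.com"))
```

The key point is the last branch: any ambiguous answer (a 500, a timeout, a 403) reads as "there may be rules I can't see," so the safe move is to not crawl at all rather than risk fetching pages the file might have disallowed.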