| 5:03 pm on Sep 15, 2012 (gmt 0)|
Welcome to WebmasterWorld, poojabhatt.
Try accessing your robots.txt with Fetch as Googlebot from within Webmaster Tools and note the results.
| 8:58 pm on Sep 15, 2012 (gmt 0)|
welcome to WebmasterWorld, poojabhatt!
is that a recent message?
google will typically show this message when the request for robots.txt returns a 4XX status other than 404/410, or any 5XX status.
for example, if your server was temporarily down and replied with a 503 Service Unavailable response.
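To make the point above concrete, here is a minimal sketch (standard library only, and not Google's actual logic): it fetches a site's robots.txt with a Googlebot-like User-Agent and classifies the status code the way this thread describes Google treating it. The domain, function names, and the verdict labels are all illustrative assumptions.

```python
# Sketch only: fetch /robots.txt and classify the HTTP status code
# the way this thread describes Google reacting to it.
import urllib.request
import urllib.error

def robots_status(domain):
    """Return the HTTP status code the server gives for /robots.txt."""
    req = urllib.request.Request(
        "http://%s/robots.txt" % domain,
        headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses raise, but still carry a status code

def robots_verdict(status):
    """How the thread says Google reacts to a given robots.txt status."""
    if 200 <= status < 300:
        return "parsed"         # file fetched and obeyed
    if status in (404, 410):
        return "no robots.txt"  # treated as "no restrictions, crawl everything"
    return "unreachable"        # other 4xx or any 5xx (e.g. 503): crawling paused
```

So a temporary 503 during an outage lands in the "unreachable" bucket, which is exactly the situation that triggers the message in Webmaster Tools.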
| 9:14 pm on Sep 15, 2012 (gmt 0)|
It's a nightmare when that happens. That message sticks around for weeks or months, long after it ceases to be true.
If you're very unlucky, the crawl-errors numbers box and the crawl-errors graph below it disappear completely, and you get no crawl-error data for the next few months either.
| 2:39 am on Sep 16, 2012 (gmt 0)|
Hi jimbeetle, how do I try accessing robots.txt with Fetch as Googlebot? I don't know how to do this. Can you tell me the process?
| 3:07 am on Sep 16, 2012 (gmt 0)|
If you're in Google Webmaster Tools you're most of the way there. It's
:: shuffling papers ::
"Fetch as Google" under the "Health" tab. You can also try "Blocked URLs" on the same tab. This shows you the robots.txt that g is currently using. (They took away the "make a robots.txt" part, but the rest of the page is unchanged.) If there's a robots.txt displayed at all, then you know they are lying in their teeth and they had no trouble finding the file.
| 4:21 pm on Sep 16, 2012 (gmt 0)|
Hi, please check out my Webmaster Tools status. I have already done the Google fetch before, but I don't know about Blocked URLs. I have a print preview; how can I show the attachments?
Also, let me know what to do with :: shuffling papers :: and where I have to put that.