Forum Moderators: Robert Charlton & goodroi
We can't currently access your home page because of a robots.txt restriction.
The robots.txt I use is:
User-agent: *
Disallow:
Would this in any way stop the Google spiders from indexing my homepage?
There seem to be other people having the same problem, according to the Google Sitemaps forum.
Has anyone else seen this in their Google sitemaps account?
Thanks
# Disallow all spiders from *no* pages (i.e. "Allow all")
User-agent: *
Disallow:
# Disallow all spiders from *all* pages (i.e. "Allow none")
User-agent: *
Disallow: /
Note that you could serve a blank robots.txt or delete it entirely to achieve the same "Allow all" result as the one you're using now.
Ref: [robotstxt.org...]
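The difference between the two variants above can be checked programmatically. This is a small sketch using Python's standard-library robots.txt parser; the `Googlebot` user agent and `example.com` URL are just placeholders for illustration:

```python
# Sketch: show how a crawler interprets the two robots.txt variants
# discussed above, using Python's built-in robots.txt parser.
from urllib.robotparser import RobotFileParser

def can_fetch(robots_txt: str, url: str) -> bool:
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    # "Googlebot" is an example user agent; any name matches "User-agent: *"
    return parser.can_fetch("Googlebot", url)

allow_all = "User-agent: *\nDisallow:"     # empty Disallow = allow everything
allow_none = "User-agent: *\nDisallow: /"  # "/" = block everything

print(can_fetch(allow_all, "http://example.com/"))   # True
print(can_fetch(allow_none, "http://example.com/"))  # False
```

An empty `Disallow:` line blocks nothing, so a crawler reading that file should never report the homepage as restricted.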
Jim
Nothing changed. Everything with the sitemaps was OK, and then last night I noticed that in Google Sitemaps both sites had the red ERROR, described as "URL restricted by robots.txt". My robots.txt file allows all robots and all pages, as written above, and this exact same file worked fine for a long time.
I saw in another post that someone recommended removing the .htaccess file in case something quirky was happening, so I did that and re-submitted the sitemap; an hour later it was crawled and came back with the same error.
Just in case, I re-uploaded the robots.txt file (in ASCII text mode, as always), but that did not correct the problem...
This is odd...
"Several of you have noticed an issue with robots.txt reporting. Thanks for letting us know about this. We have fixed this issue and you should see updated status in your Sitemap account shortly. Our latest blog post contains more information.
by Google Employee - 5:41pm"