Hello, and apologies if there is a similar question posted, I was unable to find the answer otherwise.
I attempted to add a sitemap for my new URL. When I submitted it, I received a message stating that the file was blocked by robots.txt. I found this odd, as I wasn't aware of having a robots.txt at all. So I generated one allowing all robots, uploaded it to the root directory, and resubmitted. Again, I received the exact same error message. I went to the 'crawler access' tab and was surprised to see that my robots.txt file, as displayed in the Google Webmaster Tools...tool, had a line added: "User-agent: * Disallow: /"
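If I understand robots.txt correctly, the file Webmaster Tools is showing me and the allow-all file I thought I had uploaded differ only in the Disallow line:

```
# What Webmaster Tools is displaying (blocks all crawlers from everything):
User-agent: *
Disallow: /

# What an allow-all robots.txt should contain (an empty Disallow permits everything):
User-agent: *
Disallow:
```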
I did not add "Sitemap:....." to the .txt file; it just appears on its own. Where do you suppose this is being generated from? I have other domains hosted through my web host, and this is the only one having this problem. I have removed the sitemap.xml.gz so only the sitemap.xml remains. (I don't even know what the .gz is.) I am using WordPress and the Google XML Sitemaps generator plugin, which I've used with other domains with no problems. Thank you again for your much appreciated help!
[edited by: tedster at 5:40 pm (utc) on Aug. 19, 2009] [edit reason] switch to example.net [/edit]