Hi, I have a weird dilemma. Google Webmaster Tools (both the Fetch as Google tool and the sitemap retrieval tool) cannot see 2 of my websites.
I consistently get an "Unreachable robots.txt" error from both, and I received a Google alert that 2 of my 7 GoDaddy websites have been unable to serve this file for over 24 hours. The WMT Fetch tool returns this error regardless of which page on these two sites I try to retrieve. All 7 websites are hosted on the same server under a GoDaddy shared plan, and 5 of them are not exhibiting this problem at all.
I can see the websites, robots.txt, and all associated pages fine from my browser and over FTP, and GoDaddy can as well. Moreover, all my other websites on the same shared server can be found by Google without a hitch. The affected domains have existed for many years, and this problem only started on Sunday.
To be safe, I re-uploaded robots.txt in case the file had somehow become corrupted; that didn't remedy the problem at all. Neither did removing robots.txt entirely, nor rewriting it to name Googlebot explicitly as the User-agent. My robots.txt is a simple 2-line file: the first line sets User-agent to "*", and the second line lists the URL of my sitemap. Again, these files had been working for years without a hitch and hadn't been changed.
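For reference, here's roughly what the file looks like (the sitemap URL below is just a placeholder, not my real one), along with a quick check using Python's standard-library robots.txt parser to confirm the two lines are read the way I intend:

```python
from urllib.robotparser import RobotFileParser

# Stand-in for my two-line robots.txt; the sitemap URL is a placeholder.
ROBOTS_TXT = """\
User-agent: *
Sitemap: http://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# With no Disallow rules, every crawler is allowed everywhere.
print(parser.can_fetch("Googlebot", "http://www.example.com/"))  # True
print(parser.site_maps())  # ['http://www.example.com/sitemap.xml']
```

The parser accepts the file with no complaints, so the syntax itself shouldn't be the issue.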
I have talked with GoDaddy and asked them about permissions, firewalls, and the like, and they say that everything is fine on their end.
Does anyone have an idea as to what this problem might be?
Thank you in advance.