|Webmaster Tools - "Error while trying to access your Sitemap"|
I'm seeing a lot of red crosses lately in GWT for my sitemaps.
Upon investigating, the only error message displayed is this rather vague one:
|We encountered an error while trying to access your Sitemap. Please ensure your Sitemap follows our guidelines and can be accessed at the location you provided and then resubmit. |
The strange thing is that the error usually goes away and then comes back; sometimes it's one XML file, other times another. The XML files validate just fine when I check them, and they never gave me any errors until recently.
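For reference, this is roughly how I check them locally (a minimal Python sketch using only the standard library; the namespace is the standard sitemaps.org one, and the function name is just mine):

```python
# Minimal sitemap sanity check: confirm the XML parses and pull out
# the <loc> URLs. A ParseError here would point to a real validation
# problem rather than an access problem.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def sitemap_urls(xml_text):
    """Parse sitemap XML and return the list of <loc> URLs."""
    root = ET.fromstring(xml_text)  # raises ET.ParseError on bad XML
    return [loc.text.strip() for loc in root.iter(f"{{{SITEMAP_NS}}}loc")]
```

If that parses cleanly and returns the URLs you expect, the file itself is almost certainly fine.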
Is anybody else experiencing the same problem?
|the error usually goes away, and then comes back |
That says to me that it's an access problem rather than a file validation problem. The technical cause might be on your server, or it might be in Google's infrastructure.
I'd check the server logs to see what is happening on your side of the equation when Googlebot attempts access. If you can rule out intermittent problems on your server, then it's a Google-side issue.
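Something along these lines would do it (a rough sketch, assuming an Apache/Nginx-style combined log format; the sitemap filename pattern is an assumption, so adjust it for your setup):

```python
# Rough sketch: pull Googlebot requests for sitemap files out of an
# access log in common/combined format and report the status codes.
# The "sitemap*.xml" path pattern is an assumption; adjust as needed.
import re

LINE_RE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S*sitemap\S*\.xml)[^"]*" (?P<status>\d{3})'
)

def sitemap_hits(log_lines):
    """Return (path, status) pairs for Googlebot sitemap requests."""
    hits = []
    for line in log_lines:
        if "Googlebot" not in line:  # crude user-agent filter
            continue
        m = LINE_RE.search(line)
        if m:
            hits.append((m.group("path"), int(m.group("status"))))
    return hits
```

A run of 200s and 304s would mean your server is answering correctly whenever Googlebot actually gets through.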
Thanks, I'll check it out
Having checked the logs, I'm seeing Googlebot successfully getting the sitemap files (status 200 or 304), but I'm still getting the error.
So I'm thinking it may be a Google issue, although so far it has been limited to the same domain (www.example.com and blog.example.com), albeit on different servers, which makes me wonder if DNS is playing a role here (this domain changed DNS servers about 2 months ago).
To be precise, you can see that your server receives those requests and sends Googlebot the right response. But if Google is having technical troubles of its own, it still might not receive that response.
As for DNS troubles on some requests, that can happen. Can you match up your server logs with the errors Google is showing you?
I got one of the red crosses too. Old site (been in WMT for a long time) with the setting 'display URL as www.domain.com', but when submitting an XML sitemap for our new blog, it only lets me add one without the www prefix, and therefore throws an error!
Google reports how many URLs in the sitemap, but shows indexed=0.
Anyone know a fix for this?
Domain was originally added to WMT as domain.com.
Settings were set to display URLs as www.domain.com.
Submitting a sitemap didn't pick up on the display setting but defaulted to the domain originally added to WMT (without the www.).
Bit of a flaw, I think.
So I removed the domain from WMT and re-added it as www.domain.com.
Now I can submit sitemaps with the www. prefix (which ties up with the URLs in the sitemap, so it doesn't throw an error).
|Can you match up your server logs with the errors Google is showing you? |
Not sure what you mean?
My server logs show successful 200/304 responses for the sitemap files in question, but all Google tells me is the following:
"Error: The last attempt at downloading the Sitemap failed. The details below are representative of the last successful download."
And below that is the error message from my original post. There's no mention of an error in the XML file, no line number, nothing about a DNS-related issue.
For the affected sitemap file, Google seems to be reading it four times a day, and it managed to do so all four times on the day the error was reported.
By tomorrow, the error will probably be gone again, and then it will come back (probably another sitemap file) in a day or two.
I had this happen to a client's site once. The sitemap had been fine in WMT - then the red X came. I validated the sitemap, resubmitted and still an X.
The site had a WordPress blog in it. Eventually, I discovered that some malicious code was added as a directory within the WP theme.
Once I removed that the Red X went away.
I initially discovered this by looking at a cached version of the site in Google and saw a link for an online casino.
Just putting it out there, because in my case the red X was really coming from malicious code in WordPress creating a spam link.
The same error has now popped up on more sites (different domains, different servers, though the same DNS server), again with a seemingly random XML file each time. I'm going to put one site on a different DNS server and see if the problem persists. If it does, then it's probably a Google thing.
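In the meantime I'm keeping an eye on whether resolution is even stable. A quick sketch like this only exercises the local stdlib resolver (querying a *specific* DNS server would need a tool like dig or the dnspython library), and the hostname is a placeholder:

```python
# Quick sketch: resolve a hostname several times and collect the
# answers, to spot flaky or inconsistent DNS responses. Uses only
# the stdlib resolver, so it reflects local resolution, not any
# one nameserver.
import socket

def resolved_addresses(hostname, attempts=5):
    """Resolve `hostname` repeatedly; return the set of IPv4 answers."""
    answers = set()
    for _ in range(attempts):
        try:
            _, _, addrs = socket.gethostbyname_ex(hostname)
            answers.update(addrs)
        except socket.gaierror as exc:
            answers.add(f"lookup failed: {exc}")
    return answers
```

If the set ever contains a "lookup failed" entry, or flips between old and new server IPs, that would line up with the intermittent red crosses.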