This surprises me, because I've heard reports of Googlebot returning even after a 403. The HTTP/1.1 spec [w3.org] defines a 403 Forbidden error as:
The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.
So it sounds like Googlebot is taking that quite literally in your cases, even though the spec does not say "must not be repeated."
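If you want to see for yourself what your server sends back when the request looks like Googlebot, here's a rough sketch in Python 3 (the URL is just a placeholder, and the user-agent string is the one Google publishes for Googlebot, so check their documentation for the current value):

    # Rough sketch: compare the status code the server returns to a
    # browser-style request versus a Googlebot-style request.
    # The URL is a placeholder; the Googlebot user-agent string below is
    # the commonly published one and may not be current.
    import urllib.request
    import urllib.error

    URL = "http://www.example.com/"  # a page Webmaster Tools reported as 403

    def status_for(user_agent):
        req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.getcode()
        except urllib.error.HTTPError as err:
            return err.code

    print("Browser-style UA: ", status_for("Mozilla/5.0"))
    print("Googlebot-style UA:", status_for(
        "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))

If the Googlebot-style request comes back 403 while the browser-style one comes back 200, the block is almost certainly in the server configuration (user-agent or IP based) rather than anything on Google's side.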
Sitemaps have been known to get locked into 403 errors inside Webmaster Tools for some reason, although it's been a while since I heard a new report of this. Last year in this thread [webmasterworld.com] one member reported success with basic site verification by completely logging out of Webmaster Tools, clearing the browser cache, and then logging back in to request site verification.
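Before trying that, it can be worth confirming whether the server is actually still refusing the sitemap or whether Webmaster Tools is just showing a stale error. Something along these lines (again only a sketch, with a placeholder URL) prints the status code the sitemap returns right now:

    # Rough sketch: fetch the sitemap directly and print the HTTP status.
    # SITEMAP_URL is a placeholder - use the sitemap Webmaster Tools flags as 403.
    import urllib.request
    import urllib.error

    SITEMAP_URL = "http://www.example.com/sitemap.xml"

    try:
        with urllib.request.urlopen(SITEMAP_URL) as resp:
            print("Server returned:", resp.getcode())
    except urllib.error.HTTPError as err:
        print("Server returned:", err.code)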
If the site itself is already showing as verified inside Webmaster Tools and the 403 problem is only with the sitemap, I'm thinking a "Reconsideration request" explaining the situation would help.
Anyone else have experience fixing an issue like this?
So, pretty much every time I've set up a new Webmaster Tools account I've gotten several 403 errors, and I've never, ever had a problem after I let Googlebot in to validate the site.
Certain other factors may be at play here: the sites have generally already been indexed, are of high to very high quality, and already have strong, on-topic inbound links. This is just one data point.
Jim