Forum Moderators: Robert Charlton & goodroi


Getting googlebot to crawl again after a 403 forbidden error


jackc1

9:26 am on Mar 9, 2008 (gmt 0)

10+ Year Member



If Googlebot visits your site and your site returns a 403 Forbidden error, Google will not come back, even if the website is running fine afterwards. Some of my clients have had this problem with Google for a very long time. Even after a year, Googlebot still won't visit again, and submitting sitemaps just reports "403 Forbidden" even though there is no problem at all. The only solution I have found is to change the server, or change the IP address of the server; then Google will like the website again. Why is Googlebot so stupid? If anyone has another solution, please share. Thanks!
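Before blaming Googlebot, it is worth confirming what status code the server actually sends when the request carries Googlebot's User-Agent, since a firewall or anti-bot rule can 403 crawlers while browsers see the site as fine. Here is a minimal, self-contained sketch using only Python's standard library; the `BotBlocker` server is a hypothetical stand-in imitating such a misconfigured rule, so you can see how the check behaves for both a browser UA and the crawler UA.

```python
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen
from urllib.error import HTTPError

class BotBlocker(BaseHTTPRequestHandler):
    """Hypothetical stand-in for a server whose firewall 403s crawler UAs."""
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        self.send_response(403 if "Googlebot" in ua else 200)
        self.end_headers()

    def log_message(self, *args):  # keep request logging quiet
        pass

def fetch_status(url, user_agent):
    """Return the HTTP status code the server sends for a given User-Agent."""
    req = Request(url, headers={"User-Agent": user_agent})
    try:
        with urlopen(req) as resp:
            return resp.status
    except HTTPError as e:  # urlopen raises on 4xx/5xx; the code is on the error
        return e.code

# Spin up the stand-in server on a random free local port.
server = HTTPServer(("127.0.0.1", 0), BotBlocker)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/"

googlebot_ua = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

print(fetch_status(url, "Mozilla/5.0"))  # 200 - a browser UA gets through
print(fetch_status(url, googlebot_ua))   # 403 - the crawler UA is blocked
```

Run the same `fetch_status` check against your real site URL: if a browser UA gets 200 but the Googlebot UA gets 403, the block is on your side, not Google's.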

[edited by: tedster at 5:59 pm (utc) on Mar. 9, 2008]

tedster

6:45 pm on Mar 9, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Welcome to the forums, jackc1,

This surprises me because I've heard reports of googlebot returning, even after a 403. The W3C defines a 403 Forbidden error [w3.org] as:

The server understood the request, but is refusing to fulfill it. Authorization will not help and the request SHOULD NOT be repeated.

So it sounds like googlebot is taking that quite literally in your cases - even though the spec does not say "must not be repeated."

Sitemaps have been known to get locked into 403 errors inside Webmaster Tools for some reason, although it's been a while since I heard a new report of this. Last year in this thread [webmasterworld.com] one member reported success with basic site verification by completely logging out of Webmaster Tools, clearing the browser cache, and then logging back in to request site verification.

If the site itself is already showing as verified inside Webmaster Tools and the 403 problem is only with sitemaps, I'm thinking a "Reconsideration request" explaining the situation would help.

Anyone else have experience fixing an issue like this?

WiseWebDude

3:49 pm on Mar 11, 2008 (gmt 0)

10+ Year Member



Hm, what about trying to submit the site to Google [google.com]? Maybe that will kick it back in. Worth a shot at least...

jdMorgan

6:03 pm on Mar 11, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'll go with clear cookies and completely-flush browser cache on this one. I use some very strong anti-scraper logic on my sites, and basically, I almost always got several 403-Forbidden responses when setting up Google, Yahoo, or MSN webmaster tools -- that is, until I relaxed the 'filters' to let them in with their various User-agents and IP address ranges to 'validate' the sites.

So, it's pretty much the case that I got several 403 errors any time I set up a new tools account, and I've never, ever had a problem after I let them in to validate the site.
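The kind of filter relaxation described above might look something like the following in Apache 2.2 mod_rewrite terms. This is a rough sketch of one common pattern, not jdMorgan's actual rules; the blocked user-agent strings (`BadScraper`, `OffendingUA`) are placeholders, and real anti-scraper setups usually also key on IP ranges and request patterns.

```apache
RewriteEngine On
# Assumed anti-scraper rule: 403 requests from blacklisted user-agents...
RewriteCond %{HTTP_USER_AGENT} (BadScraper|OffendingUA) [NC]
# ...but exempt the major crawlers so they can validate and index the site.
RewriteCond %{HTTP_USER_AGENT} !(Googlebot|Slurp|msnbot) [NC]
RewriteRule .* - [F]
```

The second `RewriteCond` is the "let them in" step: without it, any crawler whose UA happened to match the blacklist would keep seeing 403s exactly as described in this thread.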

Certain other factors may be at play here: the sites have generally already been indexed, are of high to very-high quality, and already have strong, on-topic inbound links. This is just one data point.

Jim