
Sent a 403 Forbidden by Mistake

Will it recover?


wernizh

5:42 am on Dec 13, 2006 (gmt 0)

10+ Year Member



Due to a configuration mistake, I sent a 403 Forbidden to Google.

This was 7 hours ago. I have now noticed and repaired it, but within those 7 hours Google tried to spider about 400 of my 4,000 pages. All of them got the 403, of course.
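A quick way to verify the repair from the outside is to request one of the affected URLs with a crawler-style User-Agent and check the status code. A minimal sketch in Python, standard library only (the URL is a placeholder):

```python
# Fetch a page the way a crawler would and print the HTTP status.
import urllib.request
import urllib.error

url = "http://www.example.com/some-page.html"  # placeholder for an affected page
req = urllib.request.Request(
    url,
    headers={"User-Agent": "Googlebot/2.1 (+http://www.google.com/bot.html)"},
)

try:
    with urllib.request.urlopen(req) as resp:
        print(resp.status, resp.reason)    # expect "200 OK" once fixed
except urllib.error.HTTPError as e:
    print(e.code, e.reason)                # was "403 Forbidden" while broken
```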

- What will be the effect? Will these pages be removed from the SERPs?
- Will Google spider these pages again?
- Should I tell Google (e.g. with a reinclusion request)?

Is Google aware that sometimes things go wrong by mistake, by technical failure, etc., and will it try again?
Or does Google take it very hard (as in "...if it's forbidden, I will not touch it anymore...")?

Thanks for your input.

[edited by: wernizh at 6:20 am (utc) on Dec. 13, 2006]

wernizh

6:01 am on Dec 13, 2006 (gmt 0)

10+ Year Member



By the way:

I have now found many errors about this "forbidden" in Google Webmaster Tools. Also, under Sitemaps it says something like "...there was a problem accessing your sitemap [status forbidden]... repair... and then submit the sitemap again..."

Maybe this will help sort everything out?
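One way to double-check before resubmitting is to fetch the sitemap directly and confirm it no longer returns the 403. A small sketch (the sitemap location is a placeholder):

```python
# Confirm the sitemap URL is reachable again before resubmitting it.
import urllib.request
import urllib.error

sitemap_url = "http://www.example.com/sitemap.xml"  # placeholder location

try:
    with urllib.request.urlopen(sitemap_url) as resp:
        body = resp.read()
        print(resp.status, resp.reason, "-", len(body), "bytes")
except urllib.error.HTTPError as e:
    print("still failing:", e.code, e.reason)
```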

tedster

6:32 am on Dec 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, as long as the technical problem is short-lived, your site should suffer no lasting problems. Even if the problem goes on for several days, recovery can happen rather quickly once your server is fixed.

wernizh

6:37 am on Dec 13, 2006 (gmt 0)

10+ Year Member



Thanks, tedster.

If it were a 404 (Not Found), I wouldn't mind so much.
In that case, I would assume that Google just tries again after some time.

But I'm concerned because it was a 403 (Forbidden).
Maybe some people would get angry if they forbid something and Google just kept trying to access it, so perhaps a 403 is treated as more final.
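Nobody outside Google knows exactly how the crawler weighs the two codes, but the worry can be written down as a retry policy. This is purely an illustrative sketch of the concern, not Google's documented behavior:

```python
# Illustrative only: one plausible way a crawler might treat status codes.
def recrawl_policy(status: int) -> str:
    if status == 404:
        return "retry later; the page may come back"
    if status == 503:
        return "retry soon; the server says it is temporarily unavailable"
    if status == 403:
        return "retry a few times, then back off; access may be deliberate"
    return "keep the normal crawl schedule"

for code in (200, 403, 404, 503):
    print(code, "->", recrawl_policy(code))
```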

g1smd

4:41 pm on Dec 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It will likely fix itself after several more crawls of each affected URL.

wernizh

5:18 pm on Dec 13, 2006 (gmt 0)

10+ Year Member



It looks good so far.
Google has already spidered my main page 4 times, but not the /blog/ subdirectory yet.

So it looks like Google did not take the "forbidden" too seriously over such a short period.
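For anyone watching the same thing, a quick tally of which status codes Googlebot has been getting is easy to script from the access log. A rough sketch, assuming a combined-format log (the path is a placeholder):

```python
# Count Googlebot requests per HTTP status in a combined-format log.
from collections import Counter

counts = Counter()
with open("/var/log/apache2/access.log") as log:  # placeholder path
    for line in log:
        if "Googlebot" not in line:
            continue
        # combined format: ... "GET /path HTTP/1.1" 403 1234 "ref" "UA"
        parts = line.split('"')
        if len(parts) >= 3:
            status = parts[2].split()[0]
            counts[status] += 1

for status, n in sorted(counts.items()):
    print(status, n)
```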

koan

7:59 pm on Dec 13, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I just wish Google would obey robots.txt rules. I have bot trap pages that are disallowed in robots.txt, yet I keep seeing Google IP addresses hitting that trap, even though the rules have been there for months.
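One thing worth ruling out is a syntax problem in the rules themselves. Python's standard-library robots.txt parser can confirm whether the trap URL is really disallowed for Googlebot (the URLs here are placeholders):

```python
# Check that robots.txt actually disallows the trap URL for Googlebot.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")  # placeholder site
rp.read()

trap_url = "http://www.example.com/trap/"        # placeholder trap path
print("Googlebot allowed:", rp.can_fetch("Googlebot", trap_url))
```

Also worth remembering that a Googlebot user-agent string can be spoofed; a reverse DNS lookup on the IP is the usual way to confirm the hits really come from Google.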

maccas

8:21 pm on Dec 13, 2006 (gmt 0)

10+ Year Member



Yeah, I have never had the courage to use that "spider trap". If I did, I would make it 2 levels deep: the first level a page with a noindex, nofollow meta tag and a link (rel="nofollow") to a second page containing the trap code, with both files disallowed through robots.txt.
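For the record, that layout would look roughly like this (filenames and paths are hypothetical):

```
# robots.txt -- both trap files disallowed
User-agent: *
Disallow: /trap-outer.html
Disallow: /trap-inner.html

<!-- trap-outer.html: first level, kept out of the index -->
<meta name="robots" content="noindex,nofollow">
<a href="/trap-inner.html" rel="nofollow">link to the trap itself</a>
```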