After about 3 weeks, the 500s stopped and the normal pages in the cache and live results began to appear. We did nothing as we found no solution to the problem. It looks like Google may have "fixed" something.
The problem is that over the three weeks our rankings slowly went down due to all of the 500s which produced the same page. The rankings have not been restored since the "fix".
I kinda feel like this may have been a bug on Google's part and we are being penalized for it.
I'd suggest that anyone using .NET 2.0 install Firefox and the UserAgentSwitcher add-on so they can request a page from their site as Googlebot and see whether they get a 500 error. It's a quick health check that is well worth the little bit of time it takes.
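The same check can also be scripted instead of done by hand in the browser. Here is a minimal sketch (Python and the function name are my choices, not from the thread) that fetches a page while sending Googlebot's user-agent string and reports the HTTP status code:

```python
# Quick health check in the spirit of the Firefox + UserAgentSwitcher test:
# request a page as Googlebot and report the HTTP status code.
import urllib.error
import urllib.request

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def status_as_googlebot(url):
    """Fetch url with Googlebot's user-agent string; return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": GOOGLEBOT_UA})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # urlopen raises on 4xx/5xx responses; the status code is what we want
        return err.code

# Usage (hypothetical URL - point it at your own site):
#   if status_as_googlebot("http://www.yoursite.example/") == 500:
#       print("Googlebot gets a 500 - check your server logs and .browser files")
```

If this prints 500 for pages that work fine in a normal browser, you are likely hitting the bug discussed in this thread.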
I also note that mikey158's problem "fixed itself" with no apparent changes on his server - so possibly the Google crawl team is taking some action here when they see too many 500 HTTP errors. All I can imagine them doing, however, is re-crawling with a different user-agent string, one that does not include "Mozilla".
Does anyone see that in their server logs?
create a "genericmozilla5.browser" file in your "/App_Browsers" folder in the root of your application... This will match generic Mozilla-compatible browsers and spiders with user-agent strings such as Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
There is a simple work-around:
1. In your ASP.NET application's App_Browsers directory, add a file called BrowserFile.browser
2. In BrowserFile.browser, define a browser entry that declares <capability name="cookies" value="true" /> for Mozilla-compatible user-agents such as Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
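A sketch of what such a .browser file can look like - the exact file Microsoft supplied isn't reproduced in this thread, so treat the element names and the match pattern below as an illustration rather than the canonical contents. It declares a browser entry for generic "Mozilla/x.y (compatible; ...)" user-agents, such as Googlebot's, and marks them as cookie-capable:

```xml
<!-- App_Browsers/BrowserFile.browser (illustrative sketch, not the exact
     Microsoft-supplied file): match generic Mozilla-compatible crawlers
     and declare cookie support so ASP.NET's browser detection handles them. -->
<browsers>
  <browser id="GenericMozilla5" parentID="Mozilla">
    <identification>
      <!-- matches e.g. Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html) -->
      <userAgent match="^Mozilla/\d+\.\d+\s*\(compatible;" />
    </identification>
    <capabilities>
      <capability name="cookies" value="true" />
    </capabilities>
  </browser>
</browsers>
```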
Then browse your site as Googlebot (e.g. with the UserAgentSwitcher add-on). If the pages load, you don't have a problem with this bug.
The exact conditions that trigger this bug are still not 100% clear to me, nor is it clear to me whether Google or Microsoft has taken steps to compensate for it. The most I can say is that it seems to be triggered at times by .NET's native URL rewriting, for example the HttpContext.RewritePath method [msdn2.microsoft.com]
Third-party rewrite utilities such as ISAPI Rewrite do not seem to trigger it. But I do want to emphasize that I'm hedging with weasel words like "seems to". I recommend people do the quick check with the Firefox add-on, or check their server error logs to make sure Googlebot is not generating 500 error codes.
":" in google bot's user agent string was breaking it!
The workaround I posted above, adding a .browser file to your asp.net's App_Browsers directory, was recommended by Microsoft after I submitted the bug to them in early 2006. The problem went away completely and I have seen no side effects at all.
I would think that others in this community would have experienced this problem bigtime, considering that many of us use Context.RewritePath for SEO purposes.
That said, I am unable to recreate the problem today with the test case that I submitted to Microsoft. So, my assumption here is that a patch may have come out for ASP.NET that has addressed the problem. (I have since upgraded to .NET 3.0, maybe that's what did it.)
My website runs ASP.NET, however, so could I still create the browser file in App_Browsers? (I have just tried this and the error still occurs - though I can't restart the site manually to test it properly.)
I believe that this bug has been fixed, because I cannot get it to reproduce anymore. This bug was over a year old, and I was running on ASP.NET 2.0 in early 2006. After a few 2.0 patches and the installation of .NET 3.0, I can no longer reproduce the error.
I'd be curious to know if mikey158 has installed .NET 3.0 or at least has patched 2.0. I think that alone might fix his issue instead of the workarounds.