Forum Moderators: Robert Charlton & goodroi
Since about April 11, 2006, Googlebot has been having "issues" with robots.txt files.
I know because Google's "URLs restricted by robots.txt" tool showed 3 of my URLs blocked even though my robots.txt was wide open at the time.
Subsequently, I read in another post here on WebmasterWorld that the tool can only report 3 URLs -- so there could be quite a few more.
From April 12th on, Google Analytics shows my overall Google referrals down by 46% -- and by 96% on important AdSense keywords. Could be this, could be something else. There's a lot going on with Google right now.
You can read the post by Vanessa Fox of Google Engineering at the Inside Google Sitemaps blog. Hopefully the link -- [sitemaps.blogspot.com...] -- will be okay with WebmasterWorld. If not, sticky me and I'll give it to you.
// my top secret bot
if (robots.txt exists and is world readable) {
    parse the file, go ahead and read allow sections now
    later grab disallows from the special bot IP address
    yada yada
    bunch of code
    silly webmasters
} else {
    just rip the site
}