Samizdata - 3:21 pm on Mar 30, 2012 (gmt 0)
Are these accesses you're talking about from the bot, or are they due to manual review or another validation mechanism?
The accesses never identify themselves as Googlebot, but they are not human either.
exactly what any reasonable webmaster would expect them to be doing
Indeed, the difference being that stealth bots do not add the files to the search index.
The question is, why make this "public service announcement", and why now?
Matt Cutts says "please... unblock... if you can... you don't need to do that now".
If he means that Google will no longer use unidentified stealth bots, he doesn't say so.
If he means that Google will penalise sites that restrict access, he doesn't say so.
If he means that Google's ranking system is not as smart as they like people to think, and that it is being gamed far too easily, and that he wants webmasters to help his company out by making changes to their sites, he doesn't say that either.
But anything to do with robots.txt compliance is a charade anyway.
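For context, the "unblocking" being asked for amounts to removing Disallow rules that keep Googlebot away from page resources such as CSS and JavaScript. A minimal sketch of what that change might look like, with hypothetical paths (note that Allow is a Google extension, not part of the original robots.txt convention):

```
# Before: style and script directories blocked for all crawlers
User-agent: *
Disallow: /css/
Disallow: /js/

# After: Googlebot may fetch page resources; others stay blocked
User-agent: Googlebot
Allow: /css/
Allow: /js/

User-agent: *
Disallow: /css/
Disallow: /js/
```

Googlebot uses the most specific matching user-agent group, so the Googlebot block above overrides the wildcard rules for that crawler.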