tedster - 1:51 am on May 29, 2010 (gmt 0)
I never thought about this before, but we certainly know that googlebot works off a cached copy of robots.txt most of the time. Otherwise it would need to fetch robots.txt right before every URL request, and that would get expensive pretty fast.
So apparently 24 hours is the cache lifetime, according to John. Good to know. When Disallowed content gets placed online, this is one precaution I never thought about - the cached robots.txt may not include your new rule yet, so the Disallow needs to be in place before the content goes live.
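To make the precaution concrete, here's a minimal sketch using Python's standard urllib.robotparser, with a hypothetical /private/ path standing in for the new Disallowed content - once the crawler's cached copy refreshes and picks up this rule, the URL is blocked:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rule you'd publish at least a cache-lifetime (24 hours)
# before the private content itself goes live, so no crawler is still
# working off an older robots.txt that lacks the rule.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)  # parse() accepts the file as a list of lines

# Once this version of robots.txt is the one in the crawler's cache:
print(rp.can_fetch("Googlebot", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/public/page.html"))     # True
```

The point of the 24-hour window: between publishing the rule and the cache refreshing, a crawler may still be answering "allowed" from the old file.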