lucy24 - 12:03 am on Dec 23, 2011 (gmt 0)
Google may have already indexed your robots.txt -- this is common, though not universal -- and in that case there's not much point to blocking humans. If you wanted to keep humans from knowing what's there, you could set a timer on robots.txt so the file only stays open for, say, half a second. Or a millisecond or whatever. Loads of time for a robot to assimilate it, but not enough for human eyeballs and brains.
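A literal timer can't really work over HTTP, since the whole response is delivered at once and the client can keep it as long as it likes. The closest practical approximation of the idea is to serve different content depending on who is asking: the real rules to user agents that look like crawlers, a harmless decoy to everyone else. A minimal sketch, assuming a handler where the request's User-Agent string is available (the crawler tokens, rules, and function name here are all illustrative, not any particular server's API):

```python
# Sketch only: approximates the "hide robots.txt from humans" idea with
# user-agent sniffing, since a timed-access file isn't possible over HTTP.
# The UA substrings and rule bodies below are illustrative assumptions.

KNOWN_CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp", "duckduckbot")

REAL_RULES = "User-agent: *\nDisallow: /private/\n"
DECOY_RULES = "User-agent: *\nDisallow:\n"  # allow-all decoy for human eyeballs

def robots_body(user_agent: str) -> str:
    """Return the real robots.txt for known crawlers, a decoy otherwise."""
    ua = user_agent.lower()
    if any(token in ua for token in KNOWN_CRAWLER_TOKENS):
        return REAL_RULES
    return DECOY_RULES
```

Note the usual caveat: anyone can fake a crawler's User-Agent, so this only deters casual browsing, not a determined human.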