As for blocking msn, it has been known to spider excessively for no obvious benefit to the webmaster (while consuming the webmaster's bandwidth allocation), so some sites block it.
By "the one that you see" did you mean "the one that your browser might get"? Because browsers don't make requests for robots.txt at all.
What type of server gives up different versions of the file for different requests/user-agents/spiders?
Should all of them be listed in the robots.txt file, or is it a moot point?
> What type of server gives up different versions of the file for different requests/user-agents/spiders?
Mine do. It's one way to cut bandwidth consumed by robots that don't understand multiple-user-agent records. Detect those UAs and serve them a simplified robots.txt with their UA string inserted. A combination of mod_rewrite and some simple cgi scripting on Apache can be used to do this easily.
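For anyone who wants to try this, here's a rough sketch of the approach. The rewrite rules and the bot names are placeholders, not real configuration from any particular site; the idea is just mod_rewrite matching the UA and handing the request to a small CGI script that emits a one-record robots.txt with that robot's name in it:

```python
#!/usr/bin/env python3
# Hypothetical CGI script (robots.py) that serves a simplified,
# single-record robots.txt to robots known not to understand
# multiple-user-agent records. It would be paired with mod_rewrite
# rules in httpd.conf or .htaccess along these lines:
#
#   RewriteEngine On
#   RewriteCond %{HTTP_USER_AGENT} (SimpleBot|OldCrawler) [NC]
#   RewriteRule ^robots\.txt$ /cgi-bin/robots.py [L]
#
# (SimpleBot/OldCrawler are made-up names for illustration.)
import os

def simplified_robots(user_agent: str) -> str:
    """Return a one-record robots.txt naming the requesting robot."""
    # Use only the product token (text before the first '/' or space).
    token = user_agent.split("/")[0].split(" ")[0] or "*"
    return (
        f"User-agent: {token}\n"
        "Disallow: /cgi-bin/\n"
        "Disallow: /images/\n"
    )

if __name__ == "__main__":
    ua = os.environ.get("HTTP_USER_AGENT", "*")
    print("Content-Type: text/plain")
    print()
    print(simplified_robots(ua))
```

The point of inserting the robot's own UA string as the User-agent line is that even a crawler that only looks for a record naming itself (and ignores `User-agent: *` or grouped records) will still honor the Disallows.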
Some "bad" robots are in fact spoofs of legitimate user-agents. In cases where the legitimate robot visits but is of no practical use to the site owner, it may be Disallowed in robots.txt. Stronger measures are needed for the spoofers, but having the Disallow in robots.txt helps identify them (either they never fetch robots.txt at all, or they fetch it and ignore its contents). So no, it's not entirely a waste of time.
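To illustrate how the Disallow helps you spot spoofers in your logs: a client claiming a well-known robot's UA that crawls pages but never requests /robots.txt is a prime suspect. A rough sketch (the log format here is assumed to be Apache combined log format, and this is only one signal; reverse-DNS verification is the stronger check):

```python
# Flag client IPs that claim a well-known robot's user-agent but
# never request /robots.txt -- one signal that the "robot" is a spoof.
import re
from collections import defaultdict

# Minimal parser for Apache combined log format (assumed).
LOG_RE = re.compile(
    r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (\S+)[^"]*" '
    r'\d+ \S+ "[^"]*" "([^"]*)"'
)

def find_suspect_ips(log_lines, claimed_bot="Googlebot"):
    fetched_robots = set()   # IPs that requested /robots.txt
    crawled_pages = defaultdict(int)  # IPs that crawled other URLs
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        ip, path, ua = m.groups()
        if claimed_bot.lower() not in ua.lower():
            continue
        if path == "/robots.txt":
            fetched_robots.add(ip)
        else:
            crawled_pages[ip] += 1
    # Suspects: crawled under the bot's name, never asked for robots.txt.
    return sorted(ip for ip in crawled_pages if ip not in fetched_robots)
```

An IP on the suspect list isn't proof by itself (a real crawler may cache robots.txt across visits), but combined with the Disallow being ignored it narrows down which "Googlebots" deserve a closer look.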