Msg#: 3134879 posted 1:11 am on Oct 26, 2006 (gmt 0)
I can't speak for all bots, but Googlebot follows the line aimed at it, if there is one. So, in this case, it would interpret the file as allowing it access to everything. I would recommend something like this:
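A minimal sketch (the directory name is only a placeholder, substitute your own paths), giving Googlebot its own single-user-agent record rather than lumping several user-agents into one record:

User-agent: Googlebot
Disallow: /example-private/

User-agent: *
Disallow: /example-private/

With separate records like this, Googlebot follows the record aimed at it and ignores the "*" record entirely.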
You can always verify how Googlebot will interpret a robots.txt file with the robots.txt analysis tool in Google Webmaster Tools. Just add the site you're interested in to your account, paste the test file into the tool, and check specific URLs to see whether the test file would block or allow them.
Msg#: 3134879 posted 5:18 am on Oct 26, 2006 (gmt 0)
Be prepared also for ancient bots, quasi-bots, and broken bots that can't handle multiple-user-agent records (valid though they are according to the Standard). I suggest backing up your robots.txt with 'stronger stuff,' such as mod_rewrite user-agent checks, if possible.
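For example, something along these lines in .htaccess (the user-agent strings here are only placeholders, match whatever is actually misbehaving in your logs):

# Deny requests whose user-agent matches either placeholder pattern
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} examplebadbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} examplescraper [NC]
RewriteRule .* - [F]

That returns a 403 Forbidden to anything matching those patterns, whether or not it ever bothers to read robots.txt.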
There are plenty of badly-coded 'bots out there that are not really malicious, just incompetent...