It is true that some search engine bots handle errors better than others and can tolerate minor typos without damaging your site. But why take the risk? Be careful and make sure your robots.txt validates 100% properly.
I once had a major problem with a third-tier search robot because of a missing blank line at the end of the file. Since a "record" in robots.txt is defined as ending with a blank line, the behavior was understandable: the robot considered that record to be "unclosed." But it came as a shock nonetheless.
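One cheap way to sanity-check a robots.txt before deploying it is to feed it to Python's standard-library parser and confirm that each record is being picked up the way you intended. A minimal sketch (the rules and URLs below are hypothetical, not from my actual file):

```python
# Sanity-check a robots.txt with Python's standard-library parser.
# Note the blank line separating the two records -- the same detail
# that tripped up the third-tier robot in my case.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/

User-agent: ExampleBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Verify each record is honored as intended.
print(parser.can_fetch("*", "https://example.com/private/page.html"))   # blocked
print(parser.can_fetch("*", "https://example.com/public/page.html"))    # allowed
print(parser.can_fetch("ExampleBot", "https://example.com/anything"))   # blocked
```

This won't catch every bot's quirks, but it does confirm that a standards-following parser sees the records you think you wrote.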
I would think that most bots follow the old mantra of "be liberal in what you accept, and conservative in what you send," but I would never want to put that to the test. The problem clearly matters to Google, as they dedicate a whole section of Webmaster Tools to verifying and checking your robots.txt file.
Google tripped me up a few years ago, when I tried something new at the time: [webmasterworld.com...]