I found it kind of interesting when I ran the robots validator (http://www.searchengineworld.com/cgi-bin/robotcheck.cgi) that neither Yahoo nor MSN has a robots.txt file at all - the request just 404s. Google has a very extensive one.
There was previous debate that the lack of a robots.txt file might lead to problems, but that doesn't seem to be the case if these major players aren't using one.
The only problem it leads to is an error log full of 404 errors. That can be cured by putting up a blank file named robots.txt; an empty robots.txt is equivalent to allowing all robots access to all pages, and it stops the 404 errors.
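You can confirm that an empty robots.txt means "allow everything" with Python's standard-library parser (a quick sketch; "Googlebot" and the path are just placeholder values):

```python
from urllib.robotparser import RobotFileParser

# Parse a completely empty robots.txt. With no rules present,
# the parser allows every user agent to fetch every path -
# the same effect as uploading a blank robots.txt file.
rp = RobotFileParser()
rp.parse([])

print(rp.can_fetch("Googlebot", "/any/page.html"))  # True
```

The same allow-all behavior applies when the file is missing entirely; the only difference is the 404 noise in the error log.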
Well, assuming that the domain listed in your profile is the domain you are talking about, it appears you already have the problem fixed - you grabbed a copy of the robots.txt from searchengineworld that allows everything...
How do I know where it was copied from? Because it says so right in it ;)
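For reference, an explicit allow-everything robots.txt is only two lines; the empty Disallow field blocks nothing. (This is the general form of such a file, not necessarily the exact copy mentioned above.)

```
User-agent: *
Disallow:
```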
==== domain obfuscated for TOS ===================================
07/29/04 11:16:45
Browsing http://****-xxxxxxxx-xxxx.com/robots.txt
Fetching http://xxx-xxxxxxxx-xxxx.com/robots.txt ...

GET /robots.txt HTTP/1.1
Host: xxx-xxxxxxxx-xxxx.com
Connection: close
User-Agent: Sam Spade 1.14