phranque - 12:33 pm on Mar 24, 2013 (gmt 0)
Which Crawlers Does Bing Use? - Bing Webmaster Tools:
most well-behaved crawlers' user agent strings include a url referring to the crawler's information page.
in general, the fallback is the robots exclusion standard.
The value of this field is the name of the robot the record is describing access policy for.
If more than one User-agent field is present the record describes an identical access policy for more than one robot. At least one field needs to be present per record.
The robot should be liberal in interpreting this field. A case insensitive substring match of the name without version information is recommended.
If the value is '*', the record describes the default access policy for any robot that has not matched any of the other records. It is not allowed to have multiple such records in the "/robots.txt" file.
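the matching rule quoted above (case-insensitive substring match of the robot name, with '*' as the default record) can be sketched roughly like this — the record structure and function name here are my own assumptions for illustration, not part of the standard, and a real parser would also have to handle Disallow lines, comments, etc.:

```python
def select_record(robot_name, records):
    """Pick the robots.txt record that applies to robot_name.

    records: list of (user_agents, rules) tuples, where user_agents
    is the list of User-agent field values for that record.
    Hypothetical structure, for illustration only.
    """
    name = robot_name.lower()
    default = None
    for user_agents, rules in records:
        for ua in user_agents:
            if ua == "*":
                # default record: used only if nothing else matches
                default = (user_agents, rules)
            elif ua.lower() in name:
                # case-insensitive substring match, per the standard
                return (user_agents, rules)
    return default

records = [
    (["Googlebot"], ["Disallow: /private/"]),
    (["bingbot", "msnbot"], ["Disallow: /search/"]),
    (["*"], ["Disallow: /"]),
]

# the full UA string contains the substring "bingbot", so the
# bingbot/msnbot record wins over the '*' default
print(select_record("Mozilla/5.0 (compatible; bingbot/2.0)", records))
```

note that the match is against the name without version information — "Googlebot" matches "Googlebot/2.1" — and an unrecognized robot falls through to the '*' record.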
for a sample list of robots (not up to date, but whatev) check out the Robots Database: