Good question. My answer? Prudence.
Since at least 2006-2007 when MSN started simultaneously running lots and lots and LOTS of bots -- bingbot, msnbot, msnbot/2.0b, msnbot-media, livebot-searchsense, MSNPTC, msrbot, msnbot-Products, msnbot-NewsBlogs, MSNBOT_Mobile, MS Search 4.0 Robot, yadda-yadda -- it's been tough determining which bots data-share with each other, or which blocked bots might impact SERPs.
And MSN runs 'unofficial' bots, too: MSN's many cloaked bots. Again. [webmasterworld.com]
So now, while Bing and Yahoo hammer out integration/assimilation and which bots may data-share with each other, I'm reluctant to deny any of their bots whole-hog. That's why I limit based on combinations of IP/Host, filetype, and UA, just as I've been doing with Yahoo, MSN, and Google for years.
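To make that concrete, here's a minimal sketch of what "limit by combinations of IP/Host, filetype, and UA" can look like, assuming you've already done the reverse-DNS lookup at the server level. The host suffixes, extensions, and rules below are purely illustrative, not my actual ruleset:

```python
# Illustrative layered bot access control: deny only risky
# combinations instead of blocking an engine's bots wholesale.
# All suffixes/extensions here are example values, not a real config.

TRUSTED_HOST_SUFFIXES = (".search.msn.com", ".crawl.yahoo.net", ".googlebot.com")
MEDIA_EXTENSIONS = (".mp4", ".zip", ".flv")

def allow_crawler(rdns_host: str, path: str, user_agent: str) -> bool:
    """Return True if this bot request passes the combined checks."""
    # Host check: reverse-DNS name must sit in the engine's own network,
    # which weeds out UA spoofers.
    if not rdns_host.endswith(TRUSTED_HOST_SUFFIXES):
        return False
    # Filetype + UA combination check: keep media-fetching bots
    # away from heavy files, but let them crawl HTML freely.
    if "media" in user_agent.lower() and path.lower().endswith(MEDIA_EXTENSIONS):
        return False
    return True
```

For example, `allow_crawler("a.search.msn.com", "/page.html", "msnbot/2.0b")` passes, while the same bot requesting from an unverified host fails the first check. The point is that each engine keeps SERP-relevant access while the combinations you distrust get trimmed.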
Speaking of UA-specific access control...
"Yahoo! Slurp/3.0" ignores robots.txt (ditto "Yahoo! Slurp China"). 'Plain' "Yahoo! Slurp" -- no version number -- is complying. At this time...
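Since compliance differs between UA variants of the same crawler, any UA-based rule needs to distinguish the exact strings rather than just matching the "Yahoo! Slurp" family. A quick sketch of that classification (the substring checks are an assumption about how the variants appear inside the full UA string):

```python
# Classify Yahoo! Slurp UA variants so rules can target only the
# non-compliant ones. Order matters: "Yahoo! Slurp China" also
# contains "Yahoo! Slurp", so check the more specific strings first.

def slurp_variant(user_agent: str) -> str:
    """Return which Slurp variant a UA string belongs to."""
    if "Yahoo! Slurp China" in user_agent:
        return "china"       # observed ignoring robots.txt
    if "Yahoo! Slurp/" in user_agent:
        return "versioned"   # e.g. Yahoo! Slurp/3.0 -- ignoring robots.txt
    if "Yahoo! Slurp" in user_agent:
        return "plain"       # no version number -- complying, for now
    return "other"
```

That way a deny rule can hit "versioned" and "china" while leaving plain Slurp alone, and it's one line to flip if the plain one stops complying.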