Cut-and-pasters will note that this is a mirror-imaged pair of rules. The first says: "If it comes from a known bing/msn range and DOES NOT call itself the bingbot or msnbot..." The second says: "If it calls itself the bingbot or msnbot and DOES NOT come from a known bing/msn range..."
The body of each rule gives the exceptions. In fact the rule is itself an exception: it's rare to have a RewriteRule whose pattern starts with "!". Here it means "If they ask for anything other than..." The exception for "forbidden.html" (or any other custom 403 document) prevents the server from going into an infinite loop that ends in a 500-class error. The bad robot won't get in, but your server will have done some extra work.
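As a sketch of what such a mirrored pair might look like (the IP ranges here are purely illustrative, not a complete or current list of Bing/MSN addresses, and "forbidden.html" stands in for whatever your custom 403 document is called):

```apache
# Rule 1: comes from a known bing/msn range but does NOT call itself bingbot/msnbot.
RewriteCond %{REMOTE_ADDR} ^157\.55\. [OR]
RewriteCond %{REMOTE_ADDR} ^40\.77\.
RewriteCond %{HTTP_USER_AGENT} !(bingbot|msnbot) [NC]
RewriteRule !^forbidden\.html$ - [F]

# Rule 2: calls itself bingbot/msnbot but does NOT come from a known range.
RewriteCond %{HTTP_USER_AGENT} (bingbot|msnbot) [NC]
RewriteCond %{REMOTE_ADDR} !^157\.55\.
RewriteCond %{REMOTE_ADDR} !^40\.77\.
RewriteRule !^forbidden\.html$ - [F]
```

The leading "!" in each RewriteRule pattern is the exception discussed above: block anything *other than* the 403 document itself, so serving the error page can't re-trigger the block.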
An alternative is something like:
RewriteRule ^boilerplate/ - [L]
right at the top of your RewriteRules, before all the [F] and [G] rules. (This is my version. All the error documents live in the /boilerplate directory along with most SSIs and similar files. It's no skin off my nose if the occasional robot asks for "forbidden.html" by name.)
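In context, the pass-through sits above everything else, so requests for error documents bail out of mod_rewrite before any blocking rule can see them. (The directory name and the "badbot" pattern below are placeholders; substitute your own.)

```apache
RewriteEngine On

# Let requests for error documents through untouched, whoever asks.
RewriteRule ^boilerplate/ - [L]

# ...all the [F] and [G] blocking rules follow, e.g.:
RewriteCond %{HTTP_USER_AGENT} badbot [NC]
RewriteRule . - [F]
```

The [L] flag stops rewrite processing for that request, which is what makes a per-rule forbidden.html exception unnecessary.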
In my case I don't need a mod_rewrite exception for robots.txt because all rules are already constrained by filename or at least extension. And I don't have any other .txt files.
If you block with more than one mod, you need a separate exception for each one: for example, a <Files "robots.txt"> section if you use mod_auth-whatever-it-is-this-week for wholesale IP lockouts.
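Assuming the current incarnation is the Apache 2.4 authorization syntax (and using a documentation-only address range as a stand-in for your real lockout list), the carve-out might look like:

```apache
# Wholesale IP lockout via the authorization module (2.4 syntax).
<RequireAll>
  Require all granted
  Require not ip 203.0.113.0/24
</RequireAll>

# Exception: everyone may fetch robots.txt, even locked-out ranges.
<Files "robots.txt">
  Require all granted
</Files>
```

The <Files> section is merged after the surrounding directory-level authorization, so it overrides the lockout for that one file.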