> That's exactly what I try to avoid - getting a 403, by using env allow in Order deny, allow.
Each module that can issue a 403 needs its own exemption. So an "Allow from all" directive only works for those requests that were locked out via a "Deny from..." line. If mod_rewrite is issuing lockouts of its own, you need a line saying
RewriteRule ^403\.html - [L]
replacing "403.html" with the exact path-and-name of your custom error page.
But here you've got a different problem:
RewriteRule ^.*$ ^robots.txt$ [E=bad_bot:1,L]
This rule has no RewriteCond exempting requests for robots.txt, so the rewritten request for robots.txt matches ^.*$ in its turn and gets rewritten again, round and round until Apache's internal-recursion limit cuts it off. That shows up in the error logs as a 500, regardless of what response actually goes out to the user.
I hope the ^ in the target was a typo; you meant / for the root. A caret in the substitution isn't an anchor; it gets treated as a literal ^ character. (I tested.)
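A loop-free version would look something like this (strictly a sketch; it assumes the goal is to hand every bad-bot request the robots.txt file, with your bot-matching RewriteConds stacked on top):

# Bot-matching RewriteCond lines go here, then:
# skip requests that are already for robots.txt, otherwise the
# rewritten request matches ^.*$ again and loops.
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteRule ^ /robots.txt [E=bad_bot:1,L]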
> I use env because I have some id's of the blocked bots in both RewriteCond and Deny from.
I don't understand what you're trying to do here. And you can set environment variables in mod_setenvif; you don't need to bring out the mod_rewrite heavy artillery.
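For comparison, a pure mod_setenvif version might look like this. Strictly a sketch; the user-agent strings are placeholders, not your real list:

# Flag each unwanted bot once, case-insensitively.
SetEnvIfNoCase User-Agent "BadBot" bad_bot
SetEnvIfNoCase User-Agent "OtherBot" bad_bot

# Then deny on the flag. No mod_rewrite involved.
Order Allow,Deny
Allow from all
Deny from env=bad_bot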
One thing you need -- and I think you haven't got -- is an envelope that looks something like this:
<Files "robots.txt">
Order Deny,Allow
Allow from all
</Files>
A "Files" envelope overrides anything that came earlier.
<tangent>
AFAIK Yandex obeys robots.txt, and has done so for many years. So if you don't want it around, just ban it in robots.txt.
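For the record, the site-wide ban is two lines:

User-agent: Yandex
Disallow: /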
</tangent>