SetEnvIfNoCase ^User-Agent$ .*(aesop_com_spiderman|ADmantX|alexibot|backweb|bandit|batchftp|bigfoot) HTTP_SAFE_BADBOT
#.. lots more of these
Don't use NoCase unless you absolutely have to. And there's no need for the .* bits at all. List your robots in the casing they actually use-- for example, GoogleBot (sic) is sometimes used by bad robots. Then make a separate NoCase list for only those robots that use so many casings, it isn't enough even to say [Nn]asty[Ss]tinky[Bb]ot.
You also don't need anchors on the header name "User-Agent", unless you're much plagued by visitors using a supplementary header whose name includes the string "User-Agent" somewhere in the middle.
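Putting those two points together -- no NoCase, no anchors on the header name, no `.*` padding -- a sketch of what the rewritten rules might look like (bot names here are just picked from your own list, and NastyStinkyBot is of course made up):

```apache
# Case-sensitive list, each robot in the casing it actually sends.
SetEnvIf User-Agent (aesop_com_spiderman|ADmantX|backweb|bigfoot) HTTP_SAFE_BADBOT

# Separate, much shorter NoCase list, reserved for the genuinely erratic ones.
SetEnvIfNoCase User-Agent nastystinkybot HTTP_SAFE_BADBOT
```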
Finally, you may have overlooked the special mod_setenvif notation specifically for UA strings:
BrowserMatch
BrowserMatchNoCase
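These are shorthand for SetEnvIf / SetEnvIfNoCase against the User-Agent header, so the header name drops out entirely. Roughly:

```apache
# Equivalent to: SetEnvIf User-Agent ... HTTP_SAFE_BADBOT
BrowserMatch (aesop_com_spiderman|ADmantX|backweb) HTTP_SAFE_BADBOT

# Equivalent to: SetEnvIfNoCase User-Agent ... HTTP_SAFE_BADBOT
BrowserMatchNoCase nastystinkybot HTTP_SAFE_BADBOT
```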
# IP range based blocking
SetEnvIfNoCase Remote_Addr ^54\. HTTP_SAFE_BADBOT
...
Deny from env=HTTP_SAFE_BADBOT
I gotta say that is a very weird name for your environment variable, since it sounds as if it means the opposite of what it says. In any case-- haha-- there's absolutely no reason for the NoCase element here, since you're not matching alphabetic text.
To achieve what you want-- "Deny from everyone meeting this condition, except the ones I specify"-- in mod_setenvif, use the ! which means "unset this variable"-- i.e. don't just set its value to 0, false or "" but remove it entirely. Like this:
SetEnvIf Remote_Addr ^54\. bad_bot
(I assume the trailing \. is to protect against IPv6 addresses, since it's redundant in IPv4.)
BrowserMatch (goodbot|othergoodbot|Pinterest) !bad_bot
The "un-set" line with ! obviously has to come after the "set" line.
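So, set, un-set and the final deny in order -- a minimal sketch in the old 2.2-style Allow/Deny syntax your snippet already uses, with goodbot/othergoodbot standing in for whatever you actually want to let through:

```apache
# 1. Set: flag everything in the 54. range
SetEnvIf Remote_Addr ^54\. bad_bot

# 2. Un-set: the ! removes the variable entirely for these UAs
BrowserMatch (goodbot|othergoodbot|Pinterest) !bad_bot

# 3. Deny anything still carrying the flag
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```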
:: obligatory disagreement with wilderness ;) ::
Sure, mod_rewrite is easy once you've got the hang of it. But it's fairly server-intensive, and thanks to wonky inheritance it's only practical when all requests pass through a single htaccess file.
For myself I prefer a two-pronged approach.
First there's an htaccess file in my userspace for any directive that's shared by all sites. That means access control via mod_auththingummy augmented by mod_setenvif, and selected <Files> envelopes and headers for things like robots.txt that occur on all sites.
Then each individual site (within the userspace) gets its own htaccess using almost exclusively mod_rewrite.
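In sketch form -- paths and rules made up, just to show the division of labor:

```apache
# /home/me/public_html/.htaccess -- shared by every site underneath
# Access control via setenvif; headers for files that exist on all sites.
SetEnvIf Remote_Addr ^54\. bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot

<Files "robots.txt">
  Header set Cache-Control "max-age=86400"
</Files>

# /home/me/public_html/examplesite/.htaccess -- this site only,
# almost exclusively mod_rewrite.
RewriteEngine On
RewriteRule ^old-page\.html$ /new-page/ [R=301,L]
```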