Do you have FrontPage extensions installed, by any chance?
Also, some of your similar IP addresses can be compressed and written as:
SetEnvIf Remote_Addr ^211\.95\.22(0|[2-9]) getout
(escape the periods with a backslash)
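The same trick works for any run of neighboring addresses. For instance, if you were also blocking 64.124.14.0 through 64.124.14.9 (hypothetical addresses, purely for illustration), those ten lines could collapse to one:

SetEnvIf Remote_Addr ^64\.124\.14\.[0-9]$ getout

The anchors matter: ^ pins the match to the start of the address, and $ keeps .14.1 from also matching .14.10 and above.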
For things like:
SetEnvIfNoCase User-Agent ".*Indy Library.*" getout
That can just be:
SetEnvIfNoCase User-Agent "indy library" getout
Which will look for those two words together, as a group, case-insensitive, anywhere in the UA. No asterisks or periods required.
As you've written them, your deny from lines don't mark those IPs as getouts, and some servers require that all deny from IPs go on a single line when you do them that way. (Maybe your actual .htaccess file shows them in a different position relative to your block of getouts?) These would all be grouped with your other SetEnvIf's at the top of the file.
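In other words (using a made-up address here), instead of a standalone line like:

deny from 10.20.30.40

you'd fold it into the same mechanism as the rest:

SetEnvIf Remote_Addr ^10\.20\.30\.40$ getout

That way every blocked client, whether caught by IP or by user-agent, ends up flagged by the one getout variable, and a single deny rule handles them all.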
Then, later, you can just add:
<Files ~ "^.*$">
Order allow,deny
allow from all
deny from env=getout
</Files>
(Note the closing </Files> and the Order line; without Order allow,deny, the allow would override the deny and the getouts would sail right through.)
If <Directory> is used instead, that can also cause blow-ups; <Directory> sections aren't allowed in .htaccess files anyway.
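Pulled together, a minimal sketch of the whole thing (using just the example IPs and user-agent from above) would look like:

SetEnvIf Remote_Addr ^211\.95\.22(0|[2-9]) getout
SetEnvIfNoCase User-Agent "indy library" getout

<Files ~ "^.*$">
Order allow,deny
allow from all
deny from env=getout
</Files>

All the SetEnvIf lines up top, one Files container at the bottom; adding a new bot or IP range is then just one more SetEnvIf line.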
If you look through the WebmasterWorld forums, there was a really long and good thread about this, with the proper structure and order of the elements you want to include, in the robot-identification threads; a month or two ago, maybe? If I can find it I'll sticky you the URL.
I know from blowing up enough Perl code (always!) that if you start out simple, then add to what already works, line by line, you can more easily pinpoint exactly which line finally caused the crash. I'd get it as streamlined as possible initially, then begin adding lines back while you debug it.