The most likely problem is that you have implicitly banned the robot from fetching both your robots.txt file and your custom 403 error document (if you have one).
If the robot requests robots.txt, it gets a 403. The server then tries to serve the custom 403 error document, which results in another 403. That second 403 triggers another attempt to serve the custom 403 error document, producing yet another 403... and so on, until the server gives up and returns a 500 Internal Server Error.
You should allow *all* user-agents to fetch these two documents, as well as any objects (CSS files, images, etc.) that are "included" in your custom 403 error page. The number of such included objects should be kept to an absolute minimum -- preferably zero, BTW.
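For reference, a custom 403 error document is declared in .htaccess with the ErrorDocument directive -- the file path here is a hypothetical example, not a requirement:

ErrorDocument 403 /errors/403.html

Whatever path you declare there is the path you must exempt from your access-control rules, as shown below.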
robots.txt:
User-agent: bad-bot-nickname
Disallow: /

User-agent: *
Disallow: /public_html/botsv/
Disallow: /cgi-bin/
Disallow: /forms/

Include the blank line between records and the trailing blank line as shown.
.htaccess:
Options +FollowSymLinks -MultiViews
RewriteEngine on
#
# Deny access to all resources except robots.txt and custom 403 error document
RewriteCond %{HTTP_USER_AGENT} bad-bot-user-agent-string
RewriteCond %{REQUEST_URI} !^/(robots\.txt|path-to-your-custom-403-error-page\.html)$
RewriteRule ^ - [F]
I included the two 'setup' directives usually required for use of mod_rewrite. If these are already present in your .htaccess file, then do not include this 'extra copy' of them.
The [F] flag implies [L], so adding [L] here would be redundant.
Note the use of two different terms for the bad-bot's "name" -- the "identifier" required in the robots.txt file is almost always different from the full user-agent string seen in your raw access log file, so I wanted to at least hint at this fact in the code.
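To make that concrete with invented names (both are made up for illustration, not a real bot):

robots.txt record identifier:  BadBot
Access-log user-agent string:  Mozilla/5.0 (compatible; BadBot/1.0; +http://www.example.com/bot.html)

So the robots.txt record would start with "User-agent: BadBot", while the RewriteCond would match a distinctive substring of the full string -- "BadBot" in this example.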
You could put the exclusions for robots.txt and the custom 403 error document in the RewriteRule's pattern. I showed this exclusion as a separate RewriteCond primarily for reasons of clarity. You could delete that second RewriteCond and change the rule to
RewriteRule !^(robots\.txt|path-to-your-custom-403-error-page\.html)$ - [F]
which would be a bit faster.
Note that some Webmasters inadvertently declare a custom 403 error page by ticking a box in their control panel. This typically results in the file "/403.shtml" being declared as the custom 403 error page. However, ticking the box does not create this document, so a 404 will result if any attempt is made to fetch it. Therefore, we often see 403-404-403-404-403-500 error loops as well.
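If you find yourself in that situation, there are two simple fixes -- either actually create the declared file, or revert to Apache's built-in error page (the /403.shtml path here is just the control-panel default described above):

ErrorDocument 403 /403.shtml
(only after actually creating /403.shtml)

or

ErrorDocument 403 default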
Jim