Forum Moderators: phranque
We have over 1,500 informational files on the site, and recently we have had to watch spam bots begin to overwhelm it, trying to devour all 1,500 files and then coming back the following week to do it again.
Recently, after checking the logs and finding only one or two possible real people a week using the user-agent "-", I decided to block all IPs that send just the "-" as their user-agent.
(We get a lot of those, and denying them by user-agent cuts out a lot of individual IP denials.)
I came up with this, which may or may not be good script, but it seems to be working:
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^-$
RewriteRule ^.*$ no.html [L]
(no.html is a txt file renamed to no.html. It contains an email address only, in case a real person should read it. Nothing else; it is 26 bytes.)
Checking the logs the last few days, I am finding this:
68.216.6.5 - - [18/Oct/2004:14:08:12 -0700] "GET /favicon.ico HTTP/1.1" 200 26 "-" "-"
Just a few times, but coming from real people who have downloaded a bunch of .gif and .jpe files. There is also the log line showing favicon.ico being answered with no.html (the 26-byte response), because the favicon.ico request has "-" as the user-agent.
All other file downloads from the same IP show:
"Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)" (or some such user agent)
Most of the time the favicon request from MSIE has a normal user-agent (Netscape, Opera, and Mozilla all seem to download it with a normal user-agent), but a few particular browsers request the favicon with a user-agent of "-".
One individual tried to download favicon.ico three times, so I am guessing something was noticed.
(Whether they were bookmarking and the bookmark didn't work, I don't know.)
I thought about using the following script:
RewriteEngine on
RewriteCond %{GET} ^favicon.ico$
RewriteCond %{HTTP_USER_AGENT} ^-$
ReWriteRule ^.*$ favicon.ico [L]
(I figured a request that asked for favicon.ico with a user-agent of "-" would be redirected to favicon.ico, AND being the default behaviour when two RewriteCond lines are stacked.)
But I don't think there is any such thing as
RewriteCond %{GET}
I only came to .htaccess scripting last week, so all this is new.
So now I am stuck.
Welcome to WebmasterWorld!
This will return a 403-Forbidden response for any request with a user-agent *or* a referrer of "-", unless the request is a GET for favicon.ico:
RewriteEngine on
RewriteCond %{REQUEST_METHOD}<>%{REQUEST_URI} !^GET<>favicon\.ico$
RewriteCond %{HTTP_REFERER}<>%{HTTP_USER_AGENT} ^-<>¦<>-$
RewriteRule .* - [F]
RewriteCond %{HTTP_REFERER}<>%{HTTP_USER_AGENT} ^-<>¦<>-$
RewriteRule !^custom403\.html$ - [F]
I also block requests using a blank user-agent *and* referrer, except for HEAD requests. AOL's caching proxies use HEAD requests, so I don't want to block them:
RewriteCond %{REQUEST_METHOD} !^HEAD$
RewriteCond %{HTTP_REFERER}<>%{HTTP_USER_AGENT} ^<>$
RewriteRule !^custom403\.html$ - [F]
Important: Change all broken pipe "¦" characters above to solid pipe characters before trying to use any of this code. Posting on this board modifies the pipe character.
If anything else is unclear, see our forum charter [webmasterworld.com] for some useful reference links.
Jim
You answered three concerns.
I was using
<LimitExcept GET POST>
Order Allow,Deny
Deny from all
</LimitExcept>
to deny anything but GET and POST. We do not use PUT on the site, and we have had problems with CONNECT and with "SEARCH /\x90\x02\xb1\x02....." (a long string).
I was aware that AOL uses HEAD, and I think Google also uses HEAD at times, so I will change the LimitExcept to:
<LimitExcept GET HEAD POST>
Order Allow,Deny
Deny from all
</LimitExcept>
RewriteEngine on
RewriteCond %{REQUEST_METHOD}!^HEAD$
RewriteCond %{HTTP_METHOD}<>%{REQUEST_URI}!^GET<>favicon\.ico$
RewriteCond %{HTTP_REFERER}<>%{HTTP_USER_AGENT} ^-<>¦<>-$ [OR]
RewriteCond %{HTTP_REFERER}<>%{HTTP_USER_AGENT} ^<>$
RewriteRule!^custom403\.html$ - [F]
We get HTTP_REFERER as "-" and HTTP_USER_AGENT as both "" and "-",
so in the logs it looks both like:
"-" ""
and "-" "-"
By far most of the bad bots look like "-" "-",
but this might also take care of those that look like
"-" "".
Change all broken pipe "¦" characters, of course, to a straight up-and-down line. (I have a European laptop, and figuring out where that key is, is something else.)
If what I have above is screwing things up, I'd appreciate knowing.
Thanks again for the info.
I have been working through tutorials on .htaccess and the accompanying information all week.
Now in .htaccess I have the following, which seems to be working.
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://xx.xx.xx.xx/$
RewriteRule ^ /folder/1.html [L]
RewriteCond %{HTTP_USER_AGENT} ^.*xxfgvv.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*aydghthf.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wndhrfb [OR]
RewriteCond %{HTTP_USER_AGENT} ^.*cnmxbbcc.*$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^WnmnUnmnTg
RewriteRule ^ /folder/2.html [L]
RewriteCond %{REMOTE_HOST} ^.*bcnjdmm.*$ [OR]
RewriteCond %{REMOTE_HOST} ^.*bndjsuurhh.*$ [OR]
RewriteCond %{REMOTE_HOST} ^http://www.badbot.com$
RewriteRule ^ /folder/3.html [L]
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule ^ /folder/4.html [L]
The last three lines are there to exclude bots that have no user-agent.
I stopped using RewriteCond %{HTTP_USER_AGENT} ^-$ because the site's search engine uses that as its user-agent and I cannot change it, so the search engine becomes inoperable when this rewrite condition is applied.
But the original problem remains.
We are still getting a few favicon.ico requests a day that end up at 4.html.
This happens when someone uses a bookmark in some versions of I.E. (Opera, and Mozilla on Linux and Mac, download favicon.ico as routine with a normal user-agent, but on some I.E. browsers the favicon GET request has a blank user-agent, so those visitors get 4.html instead.)
I've tried the suggestion from the earlier post, but it doesn't work; I get a 500 for the whole site.
I've also tried this (which makes logical sense to me, but doesn't work):
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{REQUEST_URI}!/favicon.ico/
RewriteCond %{HTTP_USER_AGENT} ^http://xs.zvx.sx.xx/$
ReWriteRule ^ /folder/test1.html [L]
I used the specific user-agent shown above, not my domain, figuring that if it didn't work, it would fail only for that user-agent. Instead, all the files went down.
Another version that does not work:
RewriteCond %{REQUEST_URI}!^favicon.ico$
Another version that does not work:
RewriteCond %{REQUEST_URI}!^GET<>favicon\.ico$
Just this one line (in all versions) causes a 500 site-wide. (All my files go down.)
I take it out and everything works fine.
Still stuck over the bookmark, favicon.ico issue, but have learnt a lot and in the process now have a lot of power over the bad bots.
I stopped using the backslash, as in xx\.xxx\.xx\.xxx,
because Apache seemed to work better without it.
I guess just one "." at a time, when it's surrounded by ^ and $, is now acceptable in the updated versions.
I'll take these two for now:
> Other version, do not work
> RewriteCond %{REQUEST_URI}!^GET<>favicon\.ico$
> Just this one line (all versions) cause a 500 system wide. (all my files go down)
A space is required between "}" and "!"
This won't work anyway, because it should test two variables, as I posted it above:
RewriteCond %{REQUEST_METHOD}<>%{REQUEST_URI} !^GET<>/favicon\.ico$
However, I forgot the leading slash on "/favicon" which is probably why it did not work.
> I stopped using the backslash for xx\.xxx\.xx\.xxx because Apache seemed to work better without it.
Don't stop using the backslash. "\." means "match only a literal period." If you do not use the backslash, then "." means "match any character." This can lead to problems that are very hard to find.
For example, let's say you block an IP address using
RewriteCond %{REMOTE_ADDR} ^1.2.3.4$
RewriteRule .* - [F]
That is OK.
However, let's say you want to block several addresses, like 1.2.3.0-255.
Normally, you'd write the RewriteCond like this:
RewriteCond %{REMOTE_ADDR} ^1\.2\.3\.
That would block IP addresses in the range 1.2.3.0 through 1.2.3.255
But if you leave off the slashes, like this:
RewriteCond %{REMOTE_ADDR} ^1.2.3.
Then the following addresses will be blocked:
1.2.3.0-255.0-255
102.3.0-255.0-255
112.3.0-255.0-255
122.3.0-255.0-255
...
192.3.0-255.0-255
1.203.0-255.0-255
1.213.0-255.0-255
1.223.0-255.0-255
...
1.293.0-255.0-255
1.2.30.0-255
1.2.31.0-255
1.2.32.0-255
...
1.2.39.0-255
etc.
Instead of blocking only 256 addresses, you have blocked many thousands of addresses!
You should consider the backslashes non-optional. This is because "." will match any single character: alpha, numeric, or otherwise!
Jim
I just tried the code using the special user-agent and it worked fine; nothing went down.
So I have plugged it into its regular place, and nothing went down.
RewriteCond %{REQUEST_METHOD}<>%{REQUEST_URI} !^GET<>/favicon\.ico$
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule ^ /folder/4.html [L]
So just have to wait now, and check the logs in the next days.
Just noticed - unless you make three spaces between the "}" and the "!", it looks like they are right next to each other in the preview. (The "!" likes to get cosy with whatever it's closest to.)
I tried to figure out your combining of REQUEST_METHOD with REQUEST_URI, but never got far enough to understand it.
I just couldn't find that in any of the tutorials.
I guess the reason I haven't had a problem with leaving out the backslash is that I am matching one specific address, not sending Apache searching for a range of addresses.
I had read what you included, and will use it.
Thanks again.
I wouldn't have been able to figure this one out, with the information I had studied.
Just as an update for anyone reading this.
I had missed out the
RewriteCond %{REQUEST_METHOD} GET
line.
This produced the server's own 403 page (not a custom one) for all requests with no user-agent ("-" in the log).
Using this:
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{REQUEST_METHOD}<>%{REQUEST_URI} !^GET<>/favicon\.ico$
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteRule ^ /folder/4.html [L]
gets the file 4.html (which is the intended file.)
(There is a contact email address only, written with "at" instead of the @ sign:
contact robot at domain_name.com)
My last information was not quite correct:
the log looked like "-" "",
not "-" "-",
so I amended the .htaccess to this:
so I amended the .htaccess to this.
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{REQUEST_METHOD}<>%{REQUEST_URI} !^GET<>/favicon\.ico$
RewriteRule ^ /thewei/image/special/no.html [L]
I also moved the HTTP_USER_AGENT test to the top line to make it easier on the computer:
now, if the user-agent is not "", Apache will skip the rest and continue after the RewriteRule.
I added this:
RewriteCond %{HTTP_USER_AGENT} ^-$
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{HTTP_REFERER} ^$
RewriteRule ^ /folder/test1.html [L]
This gets all bots that have a user-agent of "-" and no referrer,
and it does not screw up my search engine, because the search engine has user-agent "-" but also sends a referrer, probably the domain name.
(I included the line
RewriteCond %{REQUEST_METHOD} GET
because it is not clear what AOL uses when trolling the site with the HEAD command.)
Time will have to show if I need to add this.
RewriteCond %{HTTP_USER_AGENT} ^-$
RewriteCond %{HTTP_REFERER} ^-$
RewriteCond %{REQUEST_METHOD} GET
RewriteRule ^ /folder/test2.html [L]
The "-" means "do not substitute (change) the URL." It is useful with several flags, such as [F], [G], or [L].
> RewriteRule .* - [F]
"If the local URL-path is anything, then do not change the URL, and return a 403-Forbidden server response."
> RewriteRule !^custom403\.html$ - [F]
"If the local URL-path is NOT 'custom403.html' then do not change the URL, and return a 403-Forbidden server response."
This is described in the Substitution section of the Apache RewriteRule [httpd.apache.org] documentation.
Jim
I have *never* seen a legitimate request using a user-agent or referrer of "-", so frankly, I wouldn't even worry about it. I just use:
RewriteCond %{HTTP_REFERER}<>%{HTTP_USER_AGENT} ^-<>¦<>-$
RewriteRule !^custom403\.html$ - [F]
I find this very interesting. However, I am a bit confused reading this. So please allow me to pose a question regarding the above.
IIRC, I have read some posts here that suggested blocking out the ones that had a blank or a "-" for BOTH the UA AND the Referrer.
If we were to block when either UA OR the referrer is blank (or a "-"), would it not block "type-in" traffic, etc.?
Thanks for your help with this, Jim.
However, we're not looking at blank referrers in the code you cited, but rather at the case where a malicious agent supplies a non-blank referrer that is equal to a literal hyphen. In Apache access logs, this will look like a blank referrer, because Apache substitutes a hyphen, putting "-" in your logs for a blank referrer rather than showing "".
So if you see "-" in your logs, it could be a blank referrer or it could be a referrer using a literal hyphen.
That is the basis of my comment about 'no legitimate referrer of "-"': I mean that no legitimate request will contain only the literal hyphen.
Even though the difference between a blank and a hyphen is not visible in the log files, it is visible to mod_rewrite, and so can be blocked.
Here's what I recommend:
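Roughly, combining the two tests already posted in this thread (a sketch only, not necessarily the exact original; use a solid pipe, and substitute your own custom error document for custom403.html):

```apache
# Block if both Referrer AND User-agent = blank, AND Request_Method is not HEAD
RewriteCond %{REQUEST_METHOD} !^HEAD$
RewriteCond %{HTTP_REFERER}<>%{HTTP_USER_AGENT} ^<>$
RewriteRule !^custom403\.html$ - [F]

# Block if Referrer OR User-Agent = hyphen
RewriteCond %{HTTP_REFERER}<>%{HTTP_USER_AGENT} ^-<>|<>-$
RewriteRule !^custom403\.html$ - [F]
```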
I suspect that entries that appear in Apache logs as "" (which I have only seen recently) are using a backspace or delete character; otherwise, I can't see how they bypass Apache's behaviour of substituting a hyphen for a blank. I've been meaning to try to test that, but haven't had the time.
Jim
I thought I had solved this one but apparently not.
xx.xx.#*$!.#*$! - - [24/Oct/2004:19:35:06 -0700] "SEARCH /\x90\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02\xb1\x02
\xb1\x02\xb1\x02
(I've broken the string and given only 10% of it for the purposes of this post.)
The string is only cut off on the site by a 414 response (logged as 414 340),
which then adds "-" "-",
the HTTP_REFERER and HTTP_USER_AGENT, I presume.
My first attempt was to try:
<LimitExcept GET POST>
order allow,deny
deny from all
</LimitExcept>
When that didn't work I tried:
<LimitExcept GET HEAD POST>
Deny from all
</LimitExcept>
When that didn't work I tried both:
<LimitExcept GET HEAD POST>
Deny from all
</LimitExcept>
and:
RewriteEngine on
RewriteCond %{REQUEST_METHOD} SEARCH [OR]
along with other RewriteCond lines.
(The other rewrite conditions work.)
I looked at www.w3.org/Protocols/HTTP/Methods.html
and it says
"SEARCH
Proposed only. The index (etc) identified by the URL is to be searched for something matching in some sense the enclosed message. How does the client know what message formats are acceptable to the server?"
Back to me.
The long SEARCH string (a virus meant to attack Windows servers) is still hitting the Apache site, cluttering the log every time.
Your server seems to have correctly responded with a 414 (URL too long) response as you have noted above:
> The string is only broken on the site by a 414 340
Here is another thread about this 414 bombardment. [webmasterworld.com]
HTH
> Even though the difference between a blank and a hyphen is not visible in the log files, it is visible to mod_rewrite, and so can be blocked. Here's what I recommend:
> # Block if both Referrer AND User-agent = blank, AND Request_Method is not HEAD
> # Block if Referrer OR User-Agent = hyphen
Thanks for the clarification. Greatly appreciated.
> I suspect that entries that appear in Apache logs as "" (which I have only seen recently) are using a backspace or delete character; otherwise, I can't see how they bypass Apache's behaviour of substituting a hyphen for a blank. I've been meaning to try to test that, but haven't had the time.
Yes, this would be quite interesting to explore. Sticky mail me if I could be of any help in running any tests, etc.
That would work, but it would screw up this:
RewriteCond %{REQUEST_METHOD} GET
RewriteCond %{REQUEST_METHOD}<>%{REQUEST_URI} !^GET<>/favicon\.ico$
RewriteRule ^ /folder/file6.html [L]
(Putting the - before [L] causes a system failure in the tests I have done. It works with [F].
ReWriteRule ^ 6\.html - [L]
ReWriteRule ^ 6\.html$ - [L]
Both cause system failures.
Whether the ^ in front makes a difference here I don't know.
Here's examples from two tutorials:
RewriteRule \.(gif¦jpg)$ [mydomain.com...] [R,L]
RewriteRule ^.*$ X.html [L]
)
And at the moment I am using this to stop a valid search-engine bot from getting into the site until it reads robots.txt:
RewriteCond %{HTTP_USER_AGENT} ^.*dhsjeyr.*$
RewriteCond %{REQUEST_METHOD}<>%{REQUEST_URI} !^GET<>/robots\.txt$
RewriteRule .*\.(htm¦html¦jpg¦gif¦jpe)$ - [F]
(I didn't redirect it to robots.txt, because it has to know it is reading a robots.txt file.)
If I have to choose between stopping the favicon from being downloaded and letting the pesky, but not threatening, string appear daily, I would probably choose to keep the favicon.
Maybe, as some famous person asked (I can't remember who), is there a third option?
Will not bring politics into this however.
ReWriteRule ^ 6\.html - [L]
ReWriteRule ^ 6\.html$ - [L]
Both of these should cause a server error, because they are invalid: a RewriteRule takes only one substitution argument, so it cannot have both "6\.html" and "-".
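For comparison, valid forms of each intent would look something like this (a sketch with hypothetical paths, following the patterns used elsewhere in this thread):

```apache
# Substitute: rewrite any requested URL to 6.html
RewriteRule .* /folder/6.html [L]

# No substitution: leave the URL alone and return 403-Forbidden
RewriteRule .* - [F]
```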
bose is correct: there is no reason to worry about the long "SEARCH /\x90\x02\xb1..." request intended to compromise Windows servers, first because your server is Apache, and therefore immune, and also because your server provided the correct "Request too long" response.
Other than blocking these requests at your firewall, there's nothing much you can do about them.
Jim
I have tried the following and it seems to work as regards allowing the favicon.ico to be downloaded.
RewriteCond %{HTTP_USER_AGENT} ^$
RewriteCond %{HTTP_REFERER} ^$
RewriteCond %{REQUEST_METHOD} !^HEAD
RewriteCond %{REQUEST_METHOD}<>%{REQUEST_URI} !^GET<>/favicon\.ico$
RewriteRule ^ /folder/file6.html [L]
I tested with a user-agent of "-".
It allows a normal GET,
but not a GET with user-agent "-".
When requesting favicon.ico with a GET, it asks where to download it.
So I am assuming the logic sequence is correct.
Whether this stops the SEARCH string I don't know, there being no valid REQUEST_METHOD of SEARCH.
Whoever programmed the virus may be forcing something that appears as a REQUEST_METHOD of SEARCH but is really a GET, or another REQUEST_METHOD, in disguise.
I have a firewall but have never tried blocking anything but .exe files.
As you know the SEARCH string comes from many sources.
Using the above rewrite conditions is one more attempt, anyway.
No, what you see in your log is what the server received.
I'm not sure exactly what your first statement above means; I can successfully block SEARCH requests using:
RewriteCond %{REQUEST_METHOD} ^SEARCH$
RewriteRule .* - [F]
Here are the last three lines of my .htaccess file:
RewriteCond %{HTTP_USER_AGENT} ^Zeus [NC,OR]
RewriteCond %{HTTP_REFERER}<>%{HTTP_USER_AGENT} ^-<>¦<>-$
RewriteRule ^.* - [F]
I don't have a favicon.ico. I made the ¦ solid.
If no one sees a problem, I'm going to leave it.
Thanks Jim.
> Here are the last three lines of my .htaccess file:
> RewriteCond %{HTTP_USER_AGENT} ^Zeus [NC,OR]
> RewriteCond %{HTTP_REFERER}<>%{HTTP_USER_AGENT} ^-<>¦<>-$
> RewriteRule ^.* - [F]
Just a quick observation: you don't need to anchor the pattern matching in the RewriteRule. As in -
RewriteRule .* - [F]
HTH
There's another problem I am seeing with the following code:
RewriteCond %{HTTP_REFERER} ^http://192\.168\.**\.**/$
RewriteRule ^ /folder/test8\.html [L]
(IP address should read as all numbers)
The code is doing the job of matching the referrer
and redirecting that IP to test8.html,
so it works.
But there are some sites, such as Google Images, that have long strings in their referrer, such as:
"http://example.com/foo.htm&h=168&w=220&sz=16&tbnid=axsNl_DJnaoJ:&tbnh=77&tbnw=100&start=299
&prev=/images%3Fq%3Dkim%2Bjong%26start%3D280%26hl%3Dfr%26lr%3D%26sa%3DN"
There are other sites that we don't mind hotlinking, which have referrers such as:
"http://www.example.com/index.cfm?fuseaction=user.viewProfile&friendID=141140
&Mytoken=20041027154241"
Both of these are also getting the test8.html redirect.
So somehow the referrer match is catching other numbers.
I am wondering what I am doing wrong.
The referrer of the bot we are redirecting has a / at the end of it; that is why it is included.
The referrer code above is the only referrer code we have in .htaccess.
It has been isolated with an [L] at the end:
RewriteCond %{HTTP_REFERER} ^http://192\.168\.**\.**/$
RewriteRule ^ /folder/test8\.html [L]
One other set of code had been using test8.html, but I have now changed that:
RewriteCond %{HTTP_USER_AGENT} ^zbnxhhd.*$ [OR]
RewriteCond %{REMOTE_HOST} ^.*dhsjhd.*$ [OR]
RewriteCond %{REMOTE_HOST} ^.*fiouf.*$ [OR]
RewriteCond %{REMOTE_HOST} ^.*chjhjuc.*$
RewriteRule ^ /folder/test9\.html [L]
None of this code has numbers in its strings, so I am assuming there is no bleeding into this code.
It seems to be a problem with the referrer code
RewriteCond %{HTTP_REFERER} ^http://192\.168\.**\.**/$
RewriteRule ^ /folder/test8\.html [L]
matching numbers in the other referrers:
"http://#*$!xx.htm&h=168&w=220&sz=16&tbnid=axsNl_DJnaoJ:&tbnh=77&tbnw=100&start=299
&prev=/images%3Fq%3Dkim%2Bjong%26start%3D280%26hl%3Dfr%26lr%3D%26sa%3DN"
"http://www.xxx.com/index.cfm?fuseaction=user.viewProfile&friendID=141140
&Mytoken=20041027154241"
[edited by: jdMorgan at 1:17 pm (utc) on Oct. 28, 2004]
[edit reason] Fixed side-scroll and example URLs [/edit]
Okay, there is another important bit of information I didn't include.
If Google tries a second time, which it usually does
(perhaps because the test8.html file is blank, 200 0,
and perhaps because a human at the other end is seeing nothing change and clicks again):
Google first shows a small pic which it has stored. I think this is how it goes: there is a message at the side asking if you want to see the full-sized pic, and that fetches the pic from the site, the same as hotlinking.
Google also downloads the full .htm file and all its subfiles and shows it underneath the pic. But here I am noticing that Google is getting a 200 0
for the .htm file, and so is not proceeding to download the .htm and subfiles as it usually does, likely because it sees a blank .htm (200 0) and so cannot read any subfiles.
It, or the human, did ask for the enlarged .jpeg, however, and did receive it. All of these GETs had the same referrer.
In the end, Google, using the same referrer, gets the .jpg: 200 29457
(29457 being the size of the jpeg).
This was not the situation with the hotlinker. It did not try a second time, so it received the test8.html
file instead of the .jpeg (and was content with that, the human at the other end probably seeing a blank space where the hotlinked pic should be).
If you want to block access coming from a particular IP address, use %{REMOTE_ADDR}. If you want to block by referrer, and the "bad" referrer is identified only by IP address (that is, they don't use a domain name), then what you have should be fine.
Do not use start *and* end anchors on your patterns unless you are sure that's what you want. See the Regular Expressions tutorial cited in our forum charter for help with anchoring.
Because you have start-anchored the patterns, there is little chance that mod_rewrite is getting confused by numbers in your referrers; you have specifically required the referrer to start with "http://" followed immediately by the IP address. This is quite specific, and won't be fooled by some other number in the referrer string. Look for the problem elsewhere.
You will also need to add an exclusion to prevent recursion problems:
RewriteCond %{HTTP_REFERER} ^http://192\.168\.**\.**/
RewriteCond %{REQUEST_URI} !^/folder/test8\.html$
RewriteRule .* /folder/test8\.html [L]
RewriteCond %{HTTP_REFERER} ^http://192\.168\.**\.**/
RewriteRule !^folder/test8\.html$ /folder/test8\.html [L]
Jim