Of course, most of these requests include the same script names but they look in different directories. What I want is something I can add to my .htaccess file that will serve a 403 to any request that contains certain words.
I'm part way there with the following:
RewriteCond %{REQUEST_URI} ^/badstring [OR]
RewriteCond %{REQUEST_URI} ^/anotherbadstring
RewriteRule .* - [F,L]
However, this only appears to block requests where the specified word appears right after the .com of my domain name. If the bot is looking a few directories deep, as they often do, these rules aren't triggered.
What do I need to do to make sure ANY request with 'badstring' in it - ANYWHERE in it - gets the 403? And is there a more efficient way to do it than a new RewriteCond for each bad string?
Thanks in advance,
Matthew
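One way to get both things Matthew asks for, sketched with the hypothetical strings from his example: dropping the `^/` anchor lets the pattern match anywhere in `%{REQUEST_URI}`, and alternation folds the whole list into a single `RewriteCond`. This assumes Apache mod_rewrite running in a `.htaccess` context with overrides allowed.

```apache
# Sketch only: "badstring" and "anotherbadstring" are placeholders.
# No ^ anchor, so the pattern matches anywhere in the URI path;
# (a|b) alternation replaces one RewriteCond per bad string;
# [NC] makes the match case-insensitive; [F] already implies [L].
RewriteEngine On
RewriteCond %{REQUEST_URI} (badstring|anotherbadstring) [NC]
RewriteRule .* - [F]
```

With the anchor gone, a request like `/some/deep/path/badstring.php` now matches just as well as `/badstring.php`.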
I usually use a 'badrobot' list. Be careful with the regular expressions if you do this: optional variables after a beginning string can match more than you intend. I had to fire myself when I accidentally blocked Explorer for about six hours the other day. Decided to hire myself back when I discovered the issue so I could get it fixed. =)
Justin
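A hypothetical sketch of the kind of over-match Justin warns about: an unbounded pattern for one 'badrobot' token can also catch a legitimate agent whose name merely contains that token. The user-agent strings below are made up for illustration; the point is that word boundaries (or anchoring both ends) keep the match tight.

```apache
# Hypothetical illustration of the over-match risk:
#   RewriteCond %{HTTP_USER_AGENT} badbot   <- also matches "notbadbot-browser"
# PCRE word boundaries stop the match from spilling into longer names:
RewriteCond %{HTTP_USER_AGENT} \bbadbot\b [NC]
RewriteRule .* - [F]
```

Testing a new pattern against a log of real user agents before deploying it is a cheap way to avoid a six-hour self-firing.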