Forum Moderators: phranque
I'm trying to ban sites by domain name, since there has recently been a lot of referrer spam.
I have, for example, the rule:
RewriteCond %{HTTP_REFERER} ^http://(www\.)?.*stuff.*\.com/.*$ [NC]
RewriteRule ^.*$ - [F,L]
which should ban any referring site whose domain contains the word "stuff":
www.stuff.com
www.whatkindofstuff.com
www.some-other-stuff.com
and so on.
However, it is not working, so I am sure I did not set up a proper pattern-match rule. Anyone care to advise?
[edited by: jatar_k at 5:06 am (utc) on May 20, 2003]
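A hedged sketch of a simpler pattern, assuming the goal is just "ban any referrer containing stuff": RewriteCond patterns are not anchored by default, so the `^http://(www\.)?.*` prefix and `\.com/.*$` suffix only narrow the match (in particular, `\.com/.*$` requires a path after the domain, so a bare `http://www.stuff.com` referrer would not match the original rule). A plain substring test is usually enough:

```apache
RewriteCond %{HTTP_REFERER} stuff [NC]
RewriteRule .* - [F]
```

The `[F]` flag implies `[L]`, so the extra `L` flag is not required.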
First, I have two of the filenames in question in my cgi-bin.
When I use Wannabrowser to GET FormMail.cgi (or the .php variant), or Form-Mail.cgi, I get my custom 403 page but am not banned. When I request FormMail.pl or formmail.pl, both get banned, because both are traps and both filenames exist on my server.
Second, and best of all, when I type in a search for a non-existent filename in the cgi-bin I also get a 403! I used cgi-bin/nonexistent.php as the filename. There is nothing in my .htaccess that I know of that should cause a 403 instead of a 404. I am going to go over the htaccess file word by word to try to find out why this is happening.
More to come...
More: I just checked the permissions I had set on my cgi-bin and found them to be 751. I chmodded it to 755, and I now get a 404 error for a file not found instead of 403 Forbidden.
Final test: formmail.php now gets banned as designed! The whole problem was a lack of read permission for the world group. I believe I set it to 751 (months ago) to prevent outsiders from reading the scripts and stealing email addresses, etc. I have since learned to secure the individual files with 711 permissions, which work just fine.
Thanks to all who tried to help in this unusual situation.
Wiz
There is no need to have multiple copies of the trap, nor to use the Redirect directive (which causes a 302). Just "silently" rewrite *any* filename you want to trap (whether it actually exists or not) to the script:
RewriteRule (form.*mail|mail.*(form|to|2)) /path_to_script.pl [NC,L]
Maybe using the above info to simplify your "trapping model" will get rid of whatever the problem is as a side-effect. Or... maybe not.
Jim
There is one anomaly I just discovered. After using Wannabrowser for the FormMail tests, I left it banned in .htaccess, then went and successfully fetched all the web pages I wanted. The only things that were banned were the files listed in the FormMail condition. This means something else is overriding the "ban" environment set by the trap script and the directives in .htaccess. Would someone take a quick look at the Files restriction group and tell me whether I need to rearrange any directives?
SetEnvIf Remote_Addr ^206\.194\.114\.2$ ban
SetEnvIf Request_URI ^(/includes/403\.html|/robots\.txt)$ allowit
<Files *>
order deny,allow
allow from env=allowit
deny from env=ban
deny from 12.219.232.74
deny from 24.53.200.12
deny from 24.188.211.3
deny from 61.4.64.0/20
deny from 62.253.166.153
deny from 65.33.10.192
deny from 65.57.163.78
deny from 66.36.240.135
deny from 66.36.246.127
deny from 66.72.195.144
deny from 66.76.144.219
deny from 66.119.34.39
deny from 66.250.125.195
deny from 68.42.21.162
deny from 142.177.144.148
deny from 152.163.252.70
deny from 152.163.252.100
deny from 170.224.224.38
deny from 200.176.32.214
deny from 203.194.146.175
deny from 204.234.17.35
deny from 206.135.194.194
deny from 207.134.171.4
deny from 210.192.120.74
deny from 210.192.96.0/17
deny from 212.138.47.18
deny from 213.221.116.114
deny from 216.93.191.2
deny from 217.21.117.121
deny from 217.78.
deny from 220.73.25.68
deny from 220.73.165.
deny from 220.99.112.2
allow from all
</Files>
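One hedged observation, assuming Apache's classic `Order`/`Allow`/`Deny` semantics: under `order deny,allow`, any request that matches an `allow` line is admitted regardless of the `deny` lines, and `allow from all` matches every client, so the entire deny list above is effectively ignored. Since the default disposition under `deny,allow` is already to allow unmatched clients, a sketch of the block without that final line:

```apache
<Files *>
order deny,allow
deny from env=ban
deny from 12.219.232.74
# ... rest of the deny list unchanged ...
allow from env=allowit
# no "allow from all" here: clients matching no directive are allowed by default
</Files>
```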
Wiz
I have been trying to get the env=ban portion of my .htaccess to actually ban something, to no avail...until just now.
By using the process of addition, I started with a minimal .htaccess in a test directory, with files and folders to match the real thing. Then I added my banned-IP list and the deny/allow rules. Everything worked as expected in the experimental directory, so I kept adding rules and conditions from my main .htaccess until I finally broke the experimental version. The lines below are the ones causing my <Files *> directives to be ignored:
<Files *>
<LimitExcept GET POST>
deny from all
</LimitExcept>
</Files>
I will need a suitable substitute to insert into my rewrite conditions list, if anyone can help me with that. Or, possibly, the syntax is bad in this rule?
I should also mention that the ruleset above follows the main <Files *> allow-some, deny-some rules, abbreviated below.
<Files *>
order allow,deny
allow from all
allow from env=allowit
deny from env=ban
deny from 12.219.232.74
deny from 24.53.200.12
<snip>
</Files>
Wiz
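For reference, a commonly used mod_rewrite substitute for such a `<LimitExcept>` block (a sketch, assuming only GET and POST should be honored, as in the original):

```apache
# forbid any request whose method is not GET or POST
RewriteCond %{REQUEST_METHOD} !^(GET|POST)$
RewriteRule .* - [F]
```

Because this runs in mod_rewrite rather than in the access-control module, it does not interfere with the merging of `<Files *>` allow/deny sections.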
These are the requests I got last time I checked (a couple of days ago):
/cgi-bin/mail.cgi
/cgi-bin/formmail.pl
/cgi-bin/Mail.pl
/cgi-bin/formmail.cgi
To catch "mail" as well, I believe you have to modify the compressed version a little bit, by adding a question mark after "2)":
RewriteRule (form.*mail|mail.*(form|to|2)?) /path_to_script.pl [NC,L]
Of course, if you have a legitimate file called "mail-something" (e.g. "mail.html"), that one will get caught too, which is not what you want; the expression will need to become a little bit longer to take that into account.
/claus
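A hedged sketch of one way to lengthen the expression as suggested, assuming the legitimate pages are .html and the probes always request script-style extensions: anchoring the pattern to those extensions leaves files like mail.html alone:

```apache
# trap only mail/formmail names that end in a script extension
RewriteRule (form.*mail|mail.*(form|to|2)?).*\.(asp|cgi|exe|php|pl|pm)$ /path_to_script.pl [NC,L]
```

Under this sketch, /cgi-bin/mail.cgi and /cgi-bin/formmail.pl are still rewritten to the trap, while mail.html fails the extension test and passes through.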
I have to report that the compressed version of the FormMail rule has caused chaos on my server! I have a page named fmsecurity.html, and it is all about FormMail security. As soon as I added the compressed line to my .htaccess, it banned three visitors, and myself, who read my FormMail security page! Luckily I had my FTP client open and active when I got banned! Due to this unexpected result I have gone back to Balam's two-line Form-Mail ruleset:
RewriteCond %{REQUEST_URI} (.?mail.?form|form|(GM)?form.?.?mail|.?mail)(2|to)?\.?(asp|cgi|exe|php|pl|pm)?$ [NC]
RewriteRule .* path_to_hell.pl [F]
Jim; thanks for the rewrite condition to replace the LimitExcept problem.
Wiz
The problem you had with the compressed rule doesn't make any sense.
Running your "fmSecurity.html" request through a regex tester [regexlib.com] with the compressed rule produces a "No Match" result as expected, so you likely had some other problem such as a missing space or an extra (or missing) [OR] somewhere.
Jim
I think what I am trying to say is that the meta data in the page header includes the word FormMail, so it matched the "form" part of the compressed rule. See the header below:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"><html>
<head>
<title>Security Alert For FormMail Script Users</title>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<meta name="description" content="This is to alert users of Matt Wright's FormMail Perl script about the severe security issues that can arise from using older, unpatched versions of this script. We offer tips for securing, or replacing FormMail with a more modern, secure version.">
<meta name="keywords" content="cgi security, formmail security, prevent email relaying, formmail, nms mail script, hide recipient address in form, stop guestbook email harvesting">
Wiz
> There is no other explanation
Well, there must be, because .htaccess processing is finished before any file is served, and a RewriteRule does not "examine" file content; it only looks at a derivative of the server variable {REQUEST_URI} (which is why a RewriteCond testing {REQUEST_URI} is usually only necessary for complex rewrites).
My guess is that there was some other problem, as stated above. Otherwise, I'd have banned myself at least a year ago. That code came off a working server, and was validated again today using the tool I cited above.
I'd be very interested if you ever identify the problem. Do you keep backups of previous .htaccess files, by any chance?
Jim