Forum Moderators: phranque


Site protection code for Apache

What general code do you have in your .htaccess?

smallcompany

5:15 am on Aug 8, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If we were to exchange the protective lines from our .htaccess files, the ones applicable to any (or most) Apache configurations, what would you post?

Below are some examples from my .htaccess, owed mostly to this forum, and especially (who else) to jdMorgan.

### DENY IP ADDRESS
deny from X.X.X.X
### DENY DOMAIN FROM LINKING TO YOUR SITE (i.e. block links to your images)
RewriteCond %{HTTP_REFERER} example1\.com [NC,OR]
RewriteCond %{HTTP_REFERER} example2\.com [NC]
### STOP BAD BOTS
RewriteCond %{HTTP_USER_AGENT} bot1 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} bot2 [NC]
### STOP phpMyAdmin ATTACK
RewriteRule (phpMyAdmin|phpmain\.php|remository\.php) - [NC,F,L]
### INVALID FILE AND FOLDER REQUESTS
RewriteCond %{REQUEST_URI} something1 [NC,OR]
RewriteCond %{REQUEST_URI} something2 [NC]
### BLOCK attempts to use our server as a proxy
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /?http:// [NC]
RewriteCond %{THE_REQUEST} !^[A-Z]+\ /?http://([^.:/#?\ ]+\.)*example\.com\.?(:[0-9]*)? [NC]
RewriteRule ^ - [F]
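(For reference: RewriteCond lines apply only to the single RewriteRule that immediately follows them, so each condition group above needs a rule of its own to take effect. A minimal sketch of how the referer and user-agent groups could be completed; the domain and bot names are the same placeholders as above, and the image-extension pattern is just one common choice:

```apache
RewriteEngine On

### DENY DOMAIN FROM LINKING TO YOUR SITE (i.e. block links to your images)
RewriteCond %{HTTP_REFERER} example1\.com [NC,OR]
RewriteCond %{HTTP_REFERER} example2\.com [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F]

### STOP BAD BOTS
RewriteCond %{HTTP_USER_AGENT} bot1 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} bot2 [NC]
RewriteRule ^ - [F]
```

Without the trailing RewriteRule, the conditions are silently ignored.)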

This may be beginner-level material for many, but that's what I use in my .htaccess files. The focus here is on stopping bad stuff.

How about your .htaccess files?

Thanks

Caterham

11:02 am on Aug 8, 2009 (gmt 0)

10+ Year Member



### BLOCK attempts to use our server as a proxy

Is your server configuration so bad that you need something like that? In any case, it's completely useless here: if the request were actually handled as a proxy request, your .htaccess file would never be read (the directory walk is bypassed).

There are also either some RewriteRule directives or some [OR] flags missing, depending on which way you want to go.

smallcompany

2:34 am on Aug 9, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That was in order to stop requests as discussed in this post:

[webmasterworld.com...]

Are you sure it's useless?

Thanks

Caterham

12:54 pm on Aug 9, 2009 (gmt 0)

10+ Year Member



If you store it in .htaccess files, yes. If your .htaccess file was even reached, you know you don't have a proxy request but a local request for a file.

jdMorgan

2:20 pm on Aug 9, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This code is still useful if you change the comment to reflect what it's actually doing:

### BLOCK requests with protocol and domain name in requested URL-path
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /?https?:// [NC]
RewriteCond %{THE_REQUEST} !^[A-Z]+\ /?https?://([^.:/#?\ ]+\.)*example\.com\.?(:[0-9]*)? [NC]
RewriteRule ^ - [F]

These requests come from either badly-coded or malicious clients; I'm not sure which. But either way, this kind of request never comes from what anyone would consider a legitimate visitor.
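Annotated, the rule reads as follows (example.com standing in for the local domain, as above):

```apache
# THE_REQUEST holds the raw request line, e.g. "GET /page.html HTTP/1.1".
# Condition 1: the URL-path begins with a scheme,
# i.e. "GET http://..." or "GET /http://..."
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /?https?:// [NC]
# Condition 2 (negated): ...except when the embedded hostname is our own
# domain (with optional subdomains, trailing dot, and port)
RewriteCond %{THE_REQUEST} !^[A-Z]+\ /?https?://([^.:/#?\ ]+\.)*example\.com\.?(:[0-9]*)? [NC]
# Both conditions true -> answer 403-Forbidden
RewriteRule ^ - [F]
```

A normal request like "GET /index.html HTTP/1.1" fails the first condition and is untouched; a request line carrying a full foreign URL matches both conditions and is refused.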

Here's one from yesterday -- obscured just a bit, since it might otherwise be dangerous for our members to click on, or might reveal someone's valid account:

211.95.78.*** - - [09/Aug/2009:00:35:22 -0600] "GET http://ant-foo.ds-foo-abuse.com/abc-foo.php?auth=45V456b09n&strPassword=PP%5BHWT%40YCLCJGQZ&nLoginId=44 HTTP/1.1" 403 666 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.8.1.12) Gecko/20080201 Firefox/2.0.0.12"

Jim

Caterham

5:32 pm on Aug 9, 2009 (gmt 0)

10+ Year Member



If such a request gets mapped, that means your <VirtualHost> (assuming name-based virtual hosting; otherwise all hosts are accepted anyway) accepts ant-foo.ds-foo-abuse.com as a hostname.

The URI

http://ant-foo.ds-foo-abuse.com/abc-foo.php?auth=45V456b09n&strPassword=PP%5BHWT%40YCLCJGQZ&nLoginId=44

from the request line is translated into

r->args     auth=45V456b09n&strPassword=PP%5BHWT%40YCLCJGQZ&nLoginId=44
r->uri      /abc-foo.php
r->hostname ant-foo.ds-foo-abuse.com

which is done by ap_parse_uri(). The supplied Host header is evaluated and assigned to r->hostname only if r->hostname has not been set yet. But in this case it was already set in ap_parse_uri(), so it takes precedence over the supplied Host header, and the name-based vhost matching is done with the host extracted from the request line.

If matching the virtualhost for other hostnames is unintentional, I'd prefer to fix the cause by adjusting ServerName and ServerAlias (or, with IP-based hosting, fixing it in the translate_name phase) rather than configuring the server to map the request to the file system and fixing it there in the fixup phase...
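Fixing it at the vhost level might look roughly like this (a sketch in Apache 2.2-style syntax; all names and the catch-all ServerName are placeholders). The first-listed vhost is the default, so an explicit catch-all that denies everything keeps unknown hostnames away from the real site:

```apache
NameVirtualHost *:80

# Default (first-listed) vhost: catches any Host that no ServerName or
# ServerAlias below matches, and refuses it outright.
<VirtualHost *:80>
    ServerName catchall.invalid
    <Location />
        Order deny,allow
        Deny from all
    </Location>
</VirtualHost>

# The real site answers only for its own names.
<VirtualHost *:80>
    ServerName example.com
    ServerAlias www.example.com
    DocumentRoot /var/www/example
</VirtualHost>
```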

jdMorgan

6:52 pm on Aug 10, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Right, the presumption here is that WebmasterWorld readers implement blocking rules only for problematic requests that they are actually seeing on their servers. So if a Webmaster is seeing these "GET /http://example.com/foo" requests in his log files, then that implies he's on an IP-based server and might want to use this code. Name-based virtual hosts will never see such requests, as you stated above.

These particular requests aren't harmful in and of themselves, except for wasting a tiny bit of server resources. But who knows what the goal is, and what subsequent requests might be received if you give them a 200-OK?

I don't know, I just kick all 'weird' requests to the curb with a 403, and forget about them. Not worth the bother or the worry... 403-Forbidden, Connection: Close, Done! :)

Jim