Forum Moderators: phranque


.htaccess block country/url allow deny vs RewriteCond


bbxrider

10:55 am on Jun 30, 2010 (gmt 0)

10+ Year Member



There seem to be at least two ways to block countries and specific IPs.
There is this, generated from a block-country tool:

<Limit GET HEAD POST>
order allow,deny
deny from 77.242.16.0/20
deny from 80.78.64.0/20
etc
allow from all
</Limit>

and this, which I built from some other posts and info:
RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ^EmailCollector [OR]
RewriteCond %{HTTP_REFERER} ^-?$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^-?$ [OR]
RewriteCond %{REMOTE_ADDR} ^216\.169\.111\. [OR]
# etc. for additional REMOTE_ADDR patterns, each with [OR]
RewriteCond %{HTTP_REFERER} ^http://www\.iaea\.org$ [NC]
RewriteRule ^.* - [F]

(Note: without the [OR] flags, consecutive RewriteCond lines are ANDed together, so the original version could never match -- a request can't have both a blank user-agent and one starting with "EmailCollector".)

I know I can add RewriteCond directives for different octets of the IP address; is either method better than the other?
My main interest is efficiency, since this is my test machine where I have anywhere from 3 to 8 virtual-host sites I'm working on, and I want httpd to get rid of the spammers/hackers with minimal processing, since httpd can bog down the box.
For example, between May and June, 212.117.163.3 hit my access log 203,000+ times! I don't care if it's a legit robot; I want it disposed of as efficiently as possible. I'm actually putting some blocks in my router, but overall it's easier to do it in .htaccess.

wilderness

5:09 pm on Jun 30, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Using mod_rewrite on IP ranges allows you to merge many lines that would otherwise be required in mod_access.

EX:
# Note for copy-and-pasters: there is NO LOGIC to these numbers, other than to provide a multiple-expression example.
RewriteCond %{REMOTE_ADDR} ^(11[6-9]|12[0-6])\.
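
As a sketch (the ranges are still arbitrary), that single condition covers 116.* through 126.*, which mod_access would express as one Deny per range:

# mod_access equivalent, one line per /8:
Deny from 116.0.0.0/8
Deny from 117.0.0.0/8
# ... and so on through 126.0.0.0/8

# versus the single merged mod_rewrite condition:
RewriteCond %{REMOTE_ADDR} ^(11[6-9]|12[0-6])\.
RewriteRule ^.* - [F]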

bbxrider

6:43 pm on Jun 30, 2010 (gmt 0)

10+ Year Member



Thanks, that makes sense. Can anybody recommend posts about how to construct the IP-matching logic, like blocking all instances of 212.150.*.* vs. just blocking all of 212.*.*.*?
Is it basically regular expressions applied to the IP address?
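
From what I've pieced together so far, I think the distinction would look like this (patterns are illustrative only):

# Block only the 212.150.*.* range:
RewriteCond %{REMOTE_ADDR} ^212\.150\.
RewriteRule ^.* - [F]

# Or block the entire (much broader) 212.*.*.* range:
# RewriteCond %{REMOTE_ADDR} ^212\.
# RewriteRule ^.* - [F]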

jdMorgan

3:33 am on Jul 6, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Look into using the (or a) firewall to block accesses. That is even more efficient than using any Apache resources...

If you do decide to use Apache mod_access, then you'll likely be much happier over the long term with a construct such as:

Order Deny,Allow
#
<Limit GET HEAD POST>
# Deny from bad guys
Deny from 77.242.16.0/20 80.78.64.0/20 123.45.6 192.168.0.0/255.255.0.0
# Allow one good IP address in the range above
Allow from 192.168.1.4
</Limit>
#
<LimitExcept GET HEAD POST>
Deny from all
</LimitExcept>

Note that in your example code, HTTP methods such as PUT and DELETE were completely unrestricted...

Use of the "Deny,Allow" priority allows you to add 'exceptions' to large ranges of IP addresses as shown, which can be very useful. It also allows you to add exceptions to allow *any* IP address to fetch robots.txt and your custom 403 file, avoiding the two kinds of "self-inflicted DOS attacks" that can occur if this is not done.
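
A sketch of those exceptions (assuming your ErrorDocument 403 points at a file named "403.html" -- substitute your own filename):

<Files "robots.txt">
# Let every client fetch robots.txt, even blocked ones
Order Allow,Deny
Allow from all
</Files>
#
<Files "403.html">
# Let blocked clients see the custom 403 page instead of looping on 403s
Order Allow,Deny
Allow from all
</Files>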

Jim

bbxrider

7:49 am on Jul 6, 2010 (gmt 0)

10+ Year Member



So you would literally code the

<LimitExcept GET HEAD POST>
Deny from all
</LimitExcept>

container after the <Limit> container with all the IP addresses? In that order?

jdMorgan

2:54 am on Jul 10, 2010 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Since the <Limit> and <LimitExcept> containers here cover mutually exclusive sets of methods, their order makes no difference. Their order could be significant only if the enclosed method lists overlapped.

Also note that you really don't need to specify "HEAD" -- its permissions are inherited from those of "GET" in either case.

Jim