Forum Moderators: phranque


What's the best way to deny i.p.'s

What are the drawbacks of mod_rewrite, .htaccess, hosts.deny, etc.?


idoc

4:28 am on Mar 31, 2004 (gmt 0)

10+ Year Member



I hate to do it, but I need to deny a good portion of i.p.'s out of apnic, ripe and maybe lacnic due to repeated proxy attempts and predatory redirects occurring from several i.p. blocks. What is the best method to deny access by i.p., provided you have a dedicated linux server running apache? I have thought for a couple of days now about how to limit predatory redirects of html pages without restricting access to users behind a legitimate firewall or the proxy server of an ISP. I am not sure there is a way to do this without hurting legitimate traffic, so I have decided to limit my exposure to the i.p. addresses where most of these have originated... or at least the ones where I can't contact someone at a NOC somewhere who will be interested.
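For reference, the .htaccess route I keep coming back to would look something like this (just a sketch; the ranges here are placeholders, and it assumes mod_access on Apache 1.3.23 or later for the CIDR notation):

# Deny two example ranges; everyone else gets through
Order Allow,Deny
Allow from all
Deny from 218.0.0.0/8 211.0.0.0/8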

Any ideas from someone who has been down this road would be appreciated.

jdMorgan

5:32 pm on Mar 31, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



OK, my thoughts:

The best way to do it is the way with which you are most comfortable. This is important, because if you make an error and block real customers, it can be a big disaster. So, whatever method seems clearest and easiest for you is the "right way" to do it. Efficiency is a concern, but a relatively small one; your server works for you, and not the other way around... :)

Another tip: Always make your custom 403 error page polite!
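The polite page itself is only one directive away; something like this in .htaccess or httpd.conf (assuming the page lives at /polite403.html; name it whatever you like):

# Serve a custom, polite page on 403 Forbidden responses
ErrorDocument 403 /polite403.html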

Jim

idoc

7:04 pm on Mar 31, 2004 (gmt 0)

10+ Year Member



Thanks,

In this case the server is the web server of a financial services company that only does business in the US. I don't really want to block the UK or Canada, or any other country where someone could conceivably have dual residency, or where I can get similar industry links, for example... But I think I can reasonably feel safe blocking most parts of China, Korea and Eastern Europe, and part of lacnic. Is there a shortcut, a list posted somewhere along these lines that I could download?

I am thinking I will blanket-block some addresses in hosts.deny and then add a spider trap to catch the others as they come up... realizing I will probably have to hand-edit the .htaccess frequently to free up dynamic domestic addresses. <added> I would think the ipchains route would be better on a separate server, but the server is currently colocated. </added>
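For what it's worth, the ipchains version would be something like the following (a sketch only; the ranges are placeholders, and it assumes a 2.2-era kernel with ipchains available):

# Silently drop everything from two example ranges
ipchains -A input -s 218.0.0.0/8 -j DENY
ipchains -A input -s 211.0.0.0/8 -j DENY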

[edited by: idoc at 7:12 pm (utc) on Mar. 31, 2004]

digitalv

7:12 pm on Mar 31, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There are two ways I do this... the first is at the router, switch, or firewall, which denies ALL attempts to access the network. They wouldn't even be able to PING you; as far as they know, you don't exist.
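On a Cisco-style router, for instance, that kind of blanket deny is an access list along these lines (a sketch; the range is a placeholder):

! Drop everything from 218.0.0.0/8, let the rest through
access-list 101 deny ip 218.0.0.0 0.255.255.255 any
access-list 101 permit ip any any
! Applied inbound on the outside interface with: ip access-group 101 in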

I also use a SQL database method for sites hosted in places where I don't have access to configure the network hardware myself. I keep a database of IP ranges (stored in decimal format for speed and for range comparisons).

Using an include on every page, I get the visitor's IP address and convert it to DEC with a simple function. Then I execute the following code:

' theIPaddress is the visitor's IP, already converted to decimal
If CDbl(con.Execute("proc_VerifyIP " & theIPaddress)(0)) > 0 Then Response.Redirect "/accessdenied.htm"

(I'm using ASP code but you could of course modify this for the language of your choice).
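The conversion function itself is trivial. In VBScript it is something like this (a sketch; the function name is illustrative):

' Convert a dotted-quad IP string (e.g. "218.1.2.3") to its decimal form
Function IPToDec(ip)
    Dim p
    p = Split(ip, ".")
    IPToDec = CDbl(p(0)) * 16777216 + CDbl(p(1)) * 65536 _
            + CDbl(p(2)) * 256 + CDbl(p(3))
End Function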

Basically a stored procedure takes the decimal IP address and queries a table of "bad" IP ranges. The SQL code is something like SELECT count(*) FROM IPRanges WHERE @IPAddress >= StartIP AND @IPAddress <= EndIP

If the IP address of the visitor falls within that range, the stored procedure will return a count of 1 or greater. If nothing exists, count will be zero. The script on the website says if it returns anything OTHER THAN zero, redirect them to an access denied page.
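Spelled out, the procedure looks roughly like this (the StartIP/EndIP column names are illustrative):

-- Count the "bad" ranges that contain the visitor's decimal IP
CREATE PROCEDURE proc_VerifyIP @IPAddress DECIMAL(12,0) AS
SELECT COUNT(*) FROM IPRanges
WHERE @IPAddress BETWEEN StartIP AND EndIP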

The site I have running this receives nearly 100k hits a day and there is no loss of performance. I also have a ton of other things going on with the database; this is just part of it. So if you're worried about DB performance doing it this way, DON'T.

idoc

9:20 pm on Mar 31, 2004 (gmt 0)

10+ Year Member



Thanks, I went ahead and put this up in hosts.deny:

# ip blocks (a trailing dot matches any address beginning with those octets)
ALL: 62.248.
ALL: 80.96.75.
ALL: 81.6.
ALL: 81.7.
ALL: 142.179.
ALL: 164.77.74.
ALL: 196.30.116.
ALL: 200.
ALL: 210.
ALL: 211.
ALL: 217.131.
ALL: 218.
ALL: 219.92.

Did I miss any really egregious ip blocks? I also put up birdman's spider trap. It will be interesting. ;)

taivu

12:56 am on Apr 1, 2004 (gmt 0)

10+ Year Member



idoc: Michael Wise maintains a list of Chinese and Korean net blocks; you may find it helpful.

[okean.com...]

idoc

3:04 pm on Apr 1, 2004 (gmt 0)

10+ Year Member



Taivu,
Thanks for the link. As I look it over... if the list needs to be this long, I might look into a SQL lookup solution like digitalv described.

At what point, I wonder, does a bloated hosts.deny or .htaccess file grow large enough to slow down a request?

<added> After looking over several months of server logs for certain apnic class A addresses... I could not find even one legitimate surfer for some of them, so I will be using a broader brush than most folks can, I think. </added>

VectorJ

4:59 am on Apr 3, 2004 (gmt 0)

10+ Year Member



I googled proxy sites and got a list of available proxies, then just blocked all of the IPs on the list, as well as the web-based proxy services (like nonymouse.com). It's a huge list, something like 2000+ entries, but it has served me well. My trolls and ne'er-do-wells disappeared immediately after I dropped the iron curtain on 'em.

idoc

11:19 pm on Apr 3, 2004 (gmt 0)

10+ Year Member



That's probably also a good idea... it would be some work, but probably worth it. I searched through 3 months of server logs and noted several i.p.'s that had been notorious for downloading the site, and added them to the list as well. Some I had already caught in the ripe and apnic blocks, but a few were domestic.

I found a couple of rings of sites that operate as shills, appearing to be about music or eastern religion etc., that redirect to financial services companies. I am pretty well convinced now that this hurt one such site of mine that fell very hard in the last update. I have found content from my site in uncached google snippets on several of these sites. I believe they were simply forwarding their domain to me for googlebot, and that googlebot credited them with my content as if it originated with them. I have an email in to webmaster@google... maybe I need to call them next week and try to get to the bottom of it all.

Thanks for all the good suggestions. Already it almost seems as if the server got a memory upgrade or something; just filtering what I have so far has helped tremendously.