Forum Moderators: phranque


Blocking IP Addresses - When does it become a bad idea?

Block IP Address, Limitations, Performance issue


mikesz

4:20 am on Apr 20, 2009 (gmt 0)

10+ Year Member



Like a lot of sites nowadays, I have an .htaccess file with a fair number of IP addresses blocked from accessing my dating sites. The addresses are a mix of single IPs and ranges belonging to known spammers and scammers.
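
For reference, a typical Apache 2.2-era block list of this kind looks something like the following sketch (the addresses are documentation-range placeholders, not real offenders):

```apache
# Deny a handful of single IPs and CIDR ranges; everyone else is allowed.
# 192.0.2.5, 198.51.100.17 and 203.0.113.0/24 are placeholder addresses.
Order Allow,Deny
Allow from all
Deny from 192.0.2.5
Deny from 198.51.100.17
Deny from 203.0.113.0/24
```

Apache evaluates the Deny directives on every request, which is exactly why a long list carries a per-request cost.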

I am aware that this list is checked on every request, and I'm wondering at what point the number of IPs goes from "effective" to "OMG, that's killing your site's performance".

How many is too many? What are the alternatives?

I got the original "perfect" list from one of the posts on this site, but I have added a lot of entries from bad guys who have been caught dumping junk on my sites.

Ideas?

tangor

5:51 am on Apr 20, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Is this a form-based site? Interactive? Does it accept input from the web? If so, you'll be busy busy busy unless input sanitizing is implemented. As for all the rest, a whitelist robots.txt and some UA bans (about six at present) killed 95% of all the undesired traffic to my site. What's left over is much easier to deal with.

Time and bandwidth are better managed by deciding what is ALLOWED and blocking everything else. Otherwise, many hours and much hair-pulling lie in your future.
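
A minimal sketch of the user-agent side of that approach, assuming Apache 2.2 syntax; the UA substrings here are common examples, not tangor's actual six:

```apache
# Flag requests whose User-Agent matches known junk clients,
# then deny anything carrying the flag.
SetEnvIfNoCase User-Agent "libwww-perl" bad_bot
SetEnvIfNoCase User-Agent "Wget"        bad_bot
SetEnvIfNoCase User-Agent "curl"        bad_bot

Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

The whitelist robots.txt half of the approach is just a file that disallows all user-agents by default and adds explicit allow sections only for the handful of crawlers you actually want.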

jdMorgan

2:47 pm on Apr 20, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"Too many" is when you notice a degradation in responsiveness when testing with all .htaccess blocking rules enabled versus with them all disabled. It's easy enough to test: compare your site's responsiveness using your current .htaccess file versus a greatly trimmed-down copy that lacks the long list of blocked IP addresses.

You basically have to test, because the answer depends on how many requests per second you're getting and on your server's performance, neither of which is known to us. And server performance is affected by many factors, including the capabilities of the server hardware and its network connection and, on shared hosting, the efficiency of all the other co-hosted sites.
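
One way to run that comparison, assuming ApacheBench (ab, which ships with Apache) is available; the URL is a placeholder for a representative page on your own site:

```shell
# Sketch: benchmark the same page under both .htaccess variants
# and compare the results. http://www.example.com/ is a placeholder.

# 1) With the full block list in place:
ab -n 500 -c 10 http://www.example.com/

# 2) Swap in the trimmed file, e.g.:
#    mv .htaccess .htaccess.full && mv .htaccess.trimmed .htaccess
#    then repeat:
ab -n 500 -c 10 http://www.example.com/

# Compare the "Time per request" and "Requests per second" lines
# from the two runs.
```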

Jim

wilderness

3:01 pm on Apr 20, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



"As for all the rest a white list robots.txt and some UA bans (about six at the present) killed 95% of all the undesired to my site. What's left over is much easier to deal with."

This is a fairly vague and general statement.

The thief who breaks into your house is not visiting for the first time on the day of the break-in. Rather, he has previously scouted your weaknesses (perhaps on more than one occasion) and returned to capitalize on your failure to improve security.

It's the same with most bots and harvesters.
They appear initially as minor visits, not grabbing many pages.
Had corrective action been taken at that point, they could have been "cut off at the pass".
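
A quick way to spot those early, low-volume visits before they escalate is to rank client IPs by request count in the access log. A sketch, assuming a combined-format Apache log; the sample lines below stand in for a real access.log:

```shell
# Sample log lines (placeholder IPs from the documentation ranges);
# in practice point the pipeline at your live access.log instead.
cat > access.log <<'EOF'
192.0.2.5 - - [20/Apr/2009:04:20:00 +0000] "GET /profile.php HTTP/1.1" 200 512
192.0.2.5 - - [20/Apr/2009:04:20:02 +0000] "GET /search.php HTTP/1.1" 200 733
203.0.113.9 - - [20/Apr/2009:04:21:10 +0000] "GET / HTTP/1.1" 200 1024
EOF

# Count requests per client IP; repeat offenders float to the top.
awk '{print $1}' access.log | sort | uniq -c | sort -rn | head -20
```

An IP that keeps reappearing near the top with a handful of probing requests per visit is a good candidate for a ban before it comes back for a full harvest.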