Forum Moderators: phranque

Website scan blocking


lucg71

2:19 pm on Aug 11, 2007 (gmt 0)

10+ Year Member



I'm sure most of you experience this as well, and I'm hoping someone can point me in the right direction.

Every day, my websites get hit by scanners searching for vulnerable software (e.g. /_vti_bin/_vti_aut/fp30reg.dll). My sites are pretty secure, but I would like to block these types of scans. A single IP will often probe for hundreds of html/php/dll/etc. files and fill up my logs.

I'm looking for a way to auto-block IPs when they hit a certain configurable threshold of 404s, 401s or 403s. I already have similar software running that blocks invalid ssh login attempts, and I'm hoping something similar exists for website scanning.

I'm running apache on a linux server. Any help is appreciated.
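[Editor's note: the threshold idea above can be sketched with a short log scan. This is a minimal sketch in Python, not the tool the poster is asking about — the "combined" Apache log format, the threshold value, and the function name are all assumptions; a log-watching daemon such as fail2ban is the proper way to do this continuously.]

```python
import re
from collections import Counter

# Assumed Apache "combined"/"common" log line; adjust the regex to your LogFormat.
LINE_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" (\d{3}) ')

def offending_ips(log_lines, threshold=50, codes=("401", "403", "404")):
    """Count 401/403/404 responses per client IP and return the set of
    IPs that reached the threshold."""
    counts = Counter()
    for line in log_lines:
        m = LINE_RE.match(line)
        if m and m.group(2) in codes:
            counts[m.group(1)] += 1
    return {ip for ip, n in counts.items() if n >= threshold}

# The returned IPs could then be fed to iptables or an .htaccess deny list.
```

Run from cron against the access log, this gives the configurable-threshold behaviour described above, though a real daemon would also expire bans after some time.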

londrum

8:36 pm on Aug 11, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



sounds like you just need to install a bot-trap. there's a couple in the webmasterworld libraries that work pretty well. (i've seen two different ones - a perl one and a php one. i use the php one myself.)

if you find that these programs are scanning for the same directories every time, just place the bot-trap in that directory (create it, if you have to). the trap will then automatically add their IP to the .htaccess file whenever they go for it, which will straight away block them from accessing every other page.