Forum Moderators: phranque


Limiting # of Requests

I can do this with CGI, but....


Chico_Loco

1:10 pm on Jul 19, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'd like to find a way to do this with an Apache module, because it would then cover requests for all files (images, etc.) instead of just the CGI script....

I'm trying to stop scrapers: if somebody requests more than x files from the server in a given time period from a given IP (or subnet), it will redirect them to another page where they must perform a human-only action in order to continue requesting files from the server uninterrupted.

Obviously, this would need to work with a DB (perhaps MySQL). Does such a module exist, or must I hire someone to get it done for me?

Naturally, rules would need to be in place: I need to be able to specify the # of hits, the time period, IP blocks which are immune (for Google, Yahoo), etc.
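The counting logic described above (x hits per IP per time period, with a whitelist for known crawlers) can be sketched roughly as follows. This is only an illustrative sliding-window counter, not an Apache module, and the class/parameter names (`RateLimiter`, `max_hits`, `window`, `whitelist`) are my own invention; a real implementation would hook this into the request phase and back it with shared storage such as MySQL.

```python
import time
from collections import defaultdict, deque

class RateLimiter:
    """Sliding-window request counter per client IP (illustrative sketch).

    A client that exceeds `max_hits` requests within `window` seconds is
    flagged so the server can redirect it to a human-verification page.
    IPs matching a whitelisted prefix (e.g. known search-engine crawlers)
    are never limited.
    """

    def __init__(self, max_hits=100, window=60.0, whitelist=()):
        self.max_hits = max_hits
        self.window = window
        self.whitelist = tuple(whitelist)   # e.g. crawler IP prefixes
        self.hits = defaultdict(deque)      # ip -> timestamps of recent hits

    def allow(self, ip, now=None):
        """Record one request from `ip`; return False if it should be blocked."""
        if any(ip.startswith(prefix) for prefix in self.whitelist):
            return True
        now = time.time() if now is None else now
        q = self.hits[ip]
        q.append(now)
        # Drop timestamps that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        return len(q) <= self.max_hits
```

An in-memory dict like this only works per process; to cover a whole server (or several), the per-IP counters would live in the shared database instead.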

jdMorgan

2:20 pm on Jul 19, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The badly-behaved bot script [webmasterworld.com] by xlcus and AlexK posted in the PHP forum library is the closest thing I can think of.

Jim