I'm trying to stop scrapers: if somebody requests more than x files from the server in a given time period from a given IP (or subnet), they get redirected to another page where they must complete an action that only a human could perform (a CAPTCHA, say) before they can continue requesting files from the server uninterrupted.
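To make the idea concrete, here's a minimal sketch of the counting logic in Python, assuming an in-memory store (a real module would persist this in MySQL as mentioned below); the threshold and window values are placeholders, not recommendations:

import time
from collections import defaultdict, deque

MAX_HITS = 100        # hypothetical threshold: requests allowed per window
WINDOW_SECONDS = 60   # hypothetical sliding-window length

hits = defaultdict(deque)  # maps ip -> timestamps of recent requests

def should_challenge(ip: str) -> bool:
    """Return True if this IP exceeded the limit and should be
    redirected to the human-verification page."""
    now = time.time()
    q = hits[ip]
    # Drop timestamps that have fallen outside the sliding window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    q.append(now)
    return len(q) > MAX_HITS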
Obviously, this would need to involve a DB (perhaps MySQL). Does such a module exist, or would I have to hire someone to build it for me?
Naturally, rules would need to be in place. I need to be able to specify the # of hits, the time period, IP blocks which are immune (for Google, Yahoo), etc. See the whitelist sketch below.
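The immune-block check could look something like this sketch; the two CIDR ranges are examples only, and real crawler ranges would have to come from Google's and Yahoo's published lists (or be verified via reverse DNS):

import ipaddress

IMMUNE_BLOCKS = [
    ipaddress.ip_network("66.249.64.0/19"),  # example: a Googlebot range
    ipaddress.ip_network("74.6.0.0/16"),     # example: a Yahoo Slurp range
]

def is_immune(ip: str) -> bool:
    # Never rate-limit addresses inside a whitelisted block.
    addr = ipaddress.ip_address(ip)
    return any(addr in block for block in IMMUNE_BLOCKS)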
Jim