Forum Moderators: phranque

Please help us find a way to block site rippers (2)

bandwidth traffic throttling site rippers crawlers bots


westmc

1:42 am on Mar 25, 2005 (gmt 0)

10+ Year Member



We are looking for a way to stop site rippers, preferably through traffic throttling (e.g. users of the website can only download x number of pages in y amount of time, or users can only download a total of z bytes per day).

Searching on this board, I found a reference to a PHP script that does this, but it seems to work only on PHP pages; we want this to work with static HTML pages. Another solution I read about here was mod_throttle, but it doesn't appear to be compatible with Apache 2. Using the referrer and/or user agent will not be sufficient for our needs. Any help would be appreciated.
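For what it's worth, the throttling scheme described above (x pages per y seconds, plus a z-bytes-per-day cap) can be sketched as a small per-client sliding-window tracker. This is only an illustrative sketch, not any existing script; the class name, limits, and the idea of keying on IP address are all assumptions:

```python
import time
from collections import defaultdict, deque

class ThrottleTracker:
    """Hypothetical sketch of per-client throttling: at most `max_pages`
    requests per `window_seconds`, and at most `max_bytes_per_day` served
    per client. In-memory only; a real deployment would need persistence
    and a daily reset of the byte counters."""

    def __init__(self, max_pages, window_seconds, max_bytes_per_day):
        self.max_pages = max_pages
        self.window = window_seconds
        self.max_bytes = max_bytes_per_day
        self.hits = defaultdict(deque)       # ip -> timestamps of recent requests
        self.bytes_today = defaultdict(int)  # ip -> bytes served so far today

    def allow(self, ip, response_bytes, now=None):
        """Return True and record the request if it is within both limits."""
        now = time.time() if now is None else now
        recent = self.hits[ip]
        # Drop timestamps that have fallen out of the sliding window.
        while recent and now - recent[0] > self.window:
            recent.popleft()
        if len(recent) >= self.max_pages:
            return False  # too many pages in the window
        if self.bytes_today[ip] + response_bytes > self.max_bytes:
            return False  # daily byte budget exhausted
        recent.append(now)
        self.bytes_today[ip] += response_bytes
        return True
```

The catch, as noted, is that something has to *run* this logic per request, which is why a plain-PHP version doesn't help with static HTML; it would need to sit in the server itself (an Apache module) or in front of it.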

Mods: I also posted this in Webmaster General because I was unsure of where the best place was. I apologize for posting twice.

sitz

2:02 am on Mar 25, 2005 (gmt 0)

10+ Year Member



For Apache 2.0, there's mod_bandwidth ([ivn.cl ]). If you want to limit by filename, that's the way to go. If you want to limit by bytecount, that's *also* a possible route, but you may also want to look into a traffic-shaping solution for whatever OS you happen to be running (on Linux, that would be iptables rate-limiting or the kernel's tc traffic-control tool). The advantage here is that you've offloaded the traffic shaping into kernel space, reducing the load on the web server application itself.
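As a rough sketch of the kernel-space approach on Linux, you can cap outbound HTTP bandwidth with tc (part of iproute2). The interface name and rates below are placeholders, and this shapes aggregate HTTP traffic rather than per-client page counts:

```shell
# Hypothetical sketch: cap all traffic leaving port 80 on eth0 at ~512 kbit/s
# using an HTB class. Adjust interface and rates to taste.
tc qdisc add dev eth0 root handle 1: htb default 20
tc class add dev eth0 parent 1: classid 1:10 htb rate 512kbit ceil 512kbit
# Steer packets with source port 80 (our web responses) into the shaped class.
tc filter add dev eth0 parent 1: protocol ip u32 match ip sport 80 0xffff flowid 1:10
```

Since this runs entirely in the kernel's packet scheduler, Apache never sees the throttling at all, which is exactly the load-offloading advantage mentioned above.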