Msg#: 13041 posted 1:23 pm on Mar 28, 2006 (gmt 0)
Why cap the number of page views? I'd imagine a more effective solution would be to block traffic by viewing profile. E.g., you're not gbot or slurp, and you're requesting a new page every 3 seconds, so you get banned.
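A minimal sketch of that idea, assuming an in-memory request log and a hard-coded whitelist of good-bot user-agent substrings (the names, thresholds, and whitelist here are illustrative, not from any particular product):

```python
import time

MIN_INTERVAL = 3.0                    # requests closer together than this look automated
GOOD_BOTS = ("Googlebot", "Slurp")    # the "gbot or slurp" exemption from the post

last_seen = {}   # ip -> timestamp of that IP's previous request
banned = set()   # IPs that tripped the rate check

def check_request(ip, user_agent, now=None):
    """Return True if the request is allowed, False if the IP is banned."""
    if any(bot in user_agent for bot in GOOD_BOTS):
        return True                   # trust whitelisted crawlers
    if ip in banned:
        return False
    now = time.time() if now is None else now
    prev = last_seen.get(ip)
    last_seen[ip] = now
    if prev is not None and now - prev < MIN_INTERVAL:
        banned.add(ip)                # too fast: ban on the spot
        return False
    return True
```

In practice you'd want to require several fast requests in a row before banning, since a single quick double-click would trip this version.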
Msg#: 13041 posted 2:11 am on Mar 29, 2006 (gmt 0)
You can't rely on blocking by User-Agent because it can easily be faked.
I don't think it hurts to include User-Agent blocking as well, to weed out the low-hanging fruit: thieves too dumb to mask Wget or whatever tool they're using to download an entire site. That keeps them from stealing those 100 pages a day under your IP-count method.
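A simple sketch of that first filter, assuming a hand-picked blocklist of tool names (the list here is illustrative; as noted above, agents are trivially faked, so this only catches the careless):

```python
# Substrings of User-Agent strings sent by common site-downloading tools.
BLOCKED_AGENTS = ("wget", "curl", "httrack", "webzip")

def is_blocked_agent(user_agent):
    """Return True if the User-Agent matches a known download tool."""
    ua = user_agent.lower()
    return any(tool in ua for tool in BLOCKED_AGENTS)
```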
It bans an IP if the number of accesses exceeds a certain amount within a specified time window. To speed up lookups, IPs are stored in a 2-to-4-digit hash, so it may accidentally ban a few innocent visitors when IPs collide, especially on a busy site.
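A sketch of that scheme, assuming a CRC32 hash folded into a "4 digit" bucket space (the limits and window are made-up placeholders). The small table keeps lookups cheap, but distinct IPs sharing a bucket is exactly the accidental-ban risk mentioned above:

```python
import zlib

BUCKETS = 10_000   # roughly a "4 digit" hash space
WINDOW = 60.0      # seconds per counting window
LIMIT = 30         # max accesses allowed per window

counts = {}        # bucket -> (window_start, count)

def bucket_for(ip):
    """Fold the IP string into a small bucket index (collisions possible)."""
    return zlib.crc32(ip.encode()) % BUCKETS

def access(ip, now):
    """Record one access; return False once the bucket exceeds LIMIT in WINDOW."""
    b = bucket_for(ip)
    start, n = counts.get(b, (now, 0))
    if now - start > WINDOW:          # window expired: start counting afresh
        start, n = now, 0
    n += 1
    counts[b] = (start, n)
    return n <= LIMIT
```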
Msg#: 13041 posted 7:43 am on Mar 30, 2006 (gmt 0)
I always liked the idea of a hidden 1x1 pixel link used to trigger a blocking script. Realistically it's only going to be followed by a bot. You'll still need to differentiate between good bots like Google etc. and the rest, but it's certainly helped me fend off a lot of the bad guys.
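A sketch of the handler behind such a pixel, assuming a user-agent whitelist for the good bots (names are illustrative; since agents can be faked, a real setup would also verify crawlers by reverse DNS before trusting the string):

```python
# Whitelisted crawler User-Agent fragments (illustrative).
GOOD_BOTS = ("Googlebot", "Slurp", "bingbot")

banned_ips = set()

def honeypot_hit(ip, user_agent):
    """Called when the hidden 1x1 pixel URL is fetched.

    Real visitors never see the link, so anything requesting it is
    assumed to be a bot; whitelisted crawlers are let through.
    """
    if any(bot in user_agent for bot in GOOD_BOTS):
        return "allowed"
    banned_ips.add(ip)
    return "banned"
```

The pixel link itself would also be disallowed in robots.txt, so well-behaved crawlers never request it in the first place.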