So I have a few questions:
1. I can code this myself and include() it in every file, but is it possible to have the script run for all requests automatically? (One way to do this is sketched below, after my setup details.) I'm not interested in any sort of cgi-bin on my site, so I'm not bothered if I lose that. Running it globally would also let me keep an eye on *what* is being requested, and auto-add IPs when a robot ignores the ignore list, or when someone is "fishing" for files with certain names (exploits etc.)...
2. Has anyone done this before and got code to share? Call me lazy if you like, but cut'n'paste is so much quicker ;-)
3. Assuming I do get it up and running (and it doesn't go the way of 1/4 of my other brilliant ideas), what sort of things should I be filtering on? USER_AGENT and REMOTE_ADDR immediately come to mind; I could also put in checks for things like consistent REFERER abuse (hey, it's my work, and I don't want someone else stealing it for their site and taking my bandwidth with it...). A rough filter along these lines is sketched right after this list.
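For question 3, here's a minimal sketch of the kind of per-request filter I have in mind. Everything in it is illustrative: the list contents, the example.com domain, and the flat arrays (in practice the lists would come out of the MySQL ban table):

<?php
// blockcheck.php - hypothetical filter run before every page.
// Placeholder lists; in practice load these from a MySQL ban table.
$banned_ips    = array('192.0.2.1', '192.0.2.99');
$banned_agents = array('EmailSiphon', 'WebZIP');

$ip    = isset($_SERVER['REMOTE_ADDR'])     ? $_SERVER['REMOTE_ADDR']     : '';
$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$ref   = isset($_SERVER['HTTP_REFERER'])    ? $_SERVER['HTTP_REFERER']    : '';

// 1. Hard ban by REMOTE_ADDR.
if (in_array($ip, $banned_ips)) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}

// 2. Ban by USER_AGENT substring (case-insensitive).
foreach ($banned_agents as $bad) {
    if (stristr($agent, $bad)) {
        header('HTTP/1.0 403 Forbidden');
        exit;
    }
}

// 3. REFERER abuse: an off-site referer is a candidate for logging
//    and, after enough hits, an automatic ban.
if ($ref != '' && !stristr($ref, 'example.com')) {
    // log $ip and $_SERVER['REQUEST_URI'] here; auto-ban after N offences
}
?>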
I'm running Apache 2, PHP 4.3 (as a module rather than cgi-bin), and MySQL 4.0, all on WinXP (I'll be moving over to Linux in a couple of months, but this setup is portable enough ;-)
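And for question 1, assuming PHP stays loaded as an Apache module, here's one way to run that filter on every request without include()'ing it by hand; the file paths and gate.php are just placeholder names:

# httpd.conf or .htaccess (works with PHP as an Apache module):
php_value auto_prepend_file "C:/www/includes/blockcheck.php"

# auto_prepend_file only covers requests PHP handles; to police images
# and other static files too, push everything through one front
# controller with mod_rewrite instead (gate.php is hypothetical):
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/gate\.php
RewriteRule ^(.*)$ /gate.php?path=$1 [L,QSA]

The rewrite route also gives me the "what is being requested" logging from question 1 for free, since every URI passes through a single script.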
I wish I could remember the names and/or the sites... however, Google is yer friend! That's how I found them long before I found this site. Search for "spambot trap" for one: it's aimed at the bots that harvest email addresses, but it may be adaptable for more general purposes. Also search for "defeat bad web robots"; there are several solutions there. A bare-bones version of the trap idea is sketched below.
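If it helps, the trap idea from those articles boils down to something like this; trap.php and banned_ips.txt are placeholder names, and the real write-ups are more elaborate:

# robots.txt - well-behaved robots are told to stay out,
# so only the ones ignoring it ever reach the trap:
User-agent: *
Disallow: /trap.php

<!-- an invisible link on each page that no human will follow -->
<a href="/trap.php"></a>

<?php
// trap.php - anything requesting this either ignored robots.txt or
// followed the hidden link, so record its IP for banning.
$fp = fopen('banned_ips.txt', 'a');  // or an INSERT into a MySQL ban table
if ($fp) {
    fwrite($fp, $_SERVER['REMOTE_ADDR'] . "\n");
    fclose($fp);
}
header('HTTP/1.0 403 Forbidden');
?>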