I have been using this script for a couple of weeks now, and it really seems to help identify mischievous people as well as spiders that are looking for missing files (robots.txt). I put a meta refresh in the header to redirect a human user to the home page. Every day it seems I get a visit from some new spider that I would probably miss otherwise. Here's the link: Apache Guardian [xav.com]. Does anybody use a similar, perhaps better system?
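For anyone unfamiliar with the meta refresh trick, it's just a tag in the page's head; the five-second delay and URL below are placeholders, not the actual values from my page:

```html
<!-- Send a human visitor back to the home page after 5 seconds.
     The delay and destination URL here are only example values. -->
<meta http-equiv="refresh" content="5; url=http://www.example.com/">
```

Spiders generally ignore the refresh and just log the 404 page, while a person with a browser gets bounced to the home page.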
That script is very handy and will probably suit you fine. I do something similar with all my server requests: I have every request logged, good or bad, into a central searchable database. I'm a log junkie.
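The core of that setup can be sketched in a few lines. This is just a hypothetical example, not my actual code: it parses Apache combined-format log lines with an assumed regex and loads them into a SQLite table you can query for bad requests.

```python
import re
import sqlite3

# Assumed Apache combined-log pattern: host, timestamp, request line, status, size.
LINE_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)'
)

def load_log(lines, db=":memory:"):
    """Log every request, good or bad, into a searchable SQLite table."""
    conn = sqlite3.connect(db)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS requests "
        "(host TEXT, time TEXT, request TEXT, status INTEGER)"
    )
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            conn.execute(
                "INSERT INTO requests VALUES (?, ?, ?, ?)",
                (m["host"], m["time"], m["request"], int(m["status"])),
            )
    conn.commit()
    return conn

# Two made-up log lines: one spider probing robots.txt, one normal hit.
sample = [
    '1.2.3.4 - - [10/Oct/2000:13:55:36 -0700] "GET /robots.txt HTTP/1.0" 404 209',
    '5.6.7.8 - - [10/Oct/2000:13:55:40 -0700] "GET /index.html HTTP/1.0" 200 2326',
]
conn = load_log(sample)
missing = conn.execute(
    "SELECT host, request FROM requests WHERE status = 404"
).fetchall()
print(missing)
```

Once it's in a database, finding every host that ever asked for a missing file is one SELECT away instead of a grep through rotated log files.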