wilderness - 11:11 pm on Jun 23, 2013 (gmt 0)
See, that's the problem with blacklisting: you're always chasing bot user agents, and by the time you spot one it's already been on the machine and done its dirty deeds. Too late.
I cannot tell you the last time I added a UA to my bots list. 'Course, I'm certainly not a newcomer to blacklisting, nor are most of the regulars in this forum.
(FWIW, the preliminary list I provided in the other thread explains that it's simply a beginning. My primary list is far more extensive.)
It's an endless game of cat and mouse, a complete no-win scenario, and a continual waste of time.
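For newcomers following along, here's the general shape of what's being argued about. Assuming an Apache server with mod_rewrite enabled, a bare-bones UA blacklist looks something like this; the bot names are made-up placeholders, not entries from anyone's actual list:

    RewriteEngine On
    # Refuse any request whose User-Agent matches a known-bad token.
    # "BadBot" and "EvilScraper" are placeholders for illustration only.
    # A real list grows one line at a time, after the damage is done --
    # which is exactly the "chasing" problem described above.
    RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper) [NC]
    RewriteRule .* - [F]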
We can have the white & black (list) debates all day long, and what it really comes down to is a method that's usable and easy for newcomers to comprehend.
It also comes back to the old choice about what's beneficial or detrimental to our own sites.
I'm "assuming", given the "cat & mouse" remark, that you haven't monitored your logs since 2005? (Rhetorical.)
Which I know to be untrue, because you have multiple scripts in place that monitor your logs for you, along with session-ids, which you use to pinpoint the original source.
Do you have a user-agent whitelist that is absolutely guaranteed to identify all humans?
'Course he does, lucy. Unfortunately, sharing and/or publishing such syntax for others to use also opens the door for bots to change their procedures as they become aware of the methods.
That's why there has never been a useful list published for white-listing, only vague explanations.
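Just so the vagueness isn't mistaken for mystery: the general shape (and only the shape) of a whitelist is the inverse of the blacklist above. You forbid anything that does NOT match an allowed token. Again assuming Apache with mod_rewrite, and with deliberately incomplete placeholder tokens rather than anyone's working rules:

    RewriteEngine On
    # Forbid any request whose User-Agent matches none of the allowed tokens.
    # Consecutive RewriteConds are ANDed, so the rule fires only when the UA
    # contains neither an allowed browser token nor an allowed bot token.
    # These tokens are illustrative placeholders only; a working whitelist
    # needs far more conditions and exceptions than this.
    RewriteCond %{HTTP_USER_AGENT} !(Mozilla|Opera) [NC]
    RewriteCond %{HTTP_USER_AGENT} !(Googlebot|bingbot) [NC]
    RewriteRule .* - [F]

The hard part isn't the syntax, it's the token list, and that's precisely the part nobody publishes.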