wilderness - 6:23 pm on Oct 4, 2011 (gmt 0)
Am I going to be fighting an endless battle with bots, or is there a way to satisfy those that have the ability (at a higher cost to them) to circumvent counter-measures?
You're going in circles (chasing your own tail).
You've been provided multiple solutions by longtime participants in this forum, and yet, rather than tackling the work and solutions, you're still looking for a one-shot copy-and-paste solution where no such thing exists.
The most effective method is to "deny all", then "allow the UK ranges".
Then go back and allow non-UK ranges based upon raw log activity.
Two tiers of blocking.
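For what it's worth, a minimal sketch of that whitelist setup in Apache 2.2 .htaccess syntax; the ranges shown (192.0.2.0/24, 198.51.100.0/24) are documentation placeholders, not real UK allocations, so substitute ranges you've verified yourself:

    # Tier 1: deny everyone by default, then re-admit known-good ranges
    Order Deny,Allow
    Deny from all
    # Placeholder ranges: replace with verified UK CIDR blocks
    Allow from 192.0.2.0/24
    Allow from 198.51.100.0/24

Tier 2 is then just appending another "Allow from" line for each non-UK range your raw logs prove legitimate.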
The only way to deal with that is to IP ban.
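If you're blacklisting rather than whitelisting, a single-range IP ban is the mirror image (again just a sketch; 203.0.113.0/24 is a documentation placeholder, not a real offender's range):

    # Admit everyone, then ban specific bad ranges
    Order Allow,Deny
    Allow from all
    Deny from 203.0.113.0/24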
Put the static pages and common files in / with relatively loose root-level .htaccess controls, then put the search interface (and any supporting files) in a subdir with tight dir-level .htaccess controls? Or alternatively, in a subdomain?
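A rough sketch of that split, assuming a hypothetical /search/ subdirectory (the directory name and all ranges are illustrative):

    # /.htaccess (loose): site-wide, open except for known-bad ranges
    Order Allow,Deny
    Allow from all
    Deny from 203.0.113.0/24

    # /search/.htaccess (tight): closed except for vetted ranges
    Order Deny,Allow
    Deny from all
    Allow from 192.0.2.0/24

Since the deeper .htaccess's access rules take over for requests into that subdir, the expensive search interface gets the strict policy while the static pages stay reachable.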
I'm in the UK and I had a similar problem. I now have a .htaccess file several thousand lines long.