The expectation is that users are going to be UK-only - but I might want to allow US, CA, IE, AU, NZ usage too (mainly so webmasters/press can try it and write about it).
Do you mean opening up single IPs?
Also, do you find that people change IP often in the States?
North American (at least US and Canadian) IP addresses, even though dynamic, are fairly consistent (unless the user resets their modem daily). With the majority on broadband connections these days, routers are left on 24/7 and keep the same dynamic IP for months at a time.
There are also IP addresses that have many simultaneous users behind them - is AOL still like that?
People are likely to want to "crawl" the site by running lots of search queries rather than by following links.
Allowing a few queries and then checking the user is human with a captcha is a fairly common approach on webmaster-targeted sites (such as whois, speedtest, DNS test, and geo-IP tools).
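Counting the first few queries needs some server-side state (a session counter), but the gate itself can be sketched in .htaccess. A minimal mod_rewrite sketch, assuming hypothetical names: a captcha.php that sets a human=1 cookie on success, and the search script at search.php:

    # Assumes mod_rewrite is enabled and that captcha.php
    # sets a "human=1" cookie once the captcha is solved.
    RewriteEngine On
    # Visitors without the cookie get bounced to the captcha first
    RewriteCond %{HTTP_COOKIE} !human=1
    RewriteRule ^search\.php$ /captcha.php [R,L]

The cookie check on its own is trivially spoofable, so in practice you would tie it to a server-side session rather than trust the cookie value alone.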
If you want to be listed in search engines such as Google, do not block US IPs - Googlebot crawls from US-based addresses.
Am I going to be fighting an endless battle with bots, or is there a way to satisfy those that have the ability (at a higher cost to them) to circumvent counter-measures?
The most effective method is to "deny all", then "allow" the UK ranges, then go back and allow non-UK ranges based upon raw log activity.
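In Apache 2.2-style .htaccess that looks something like the sketch below; the CIDR blocks are placeholders rather than real UK allocations (pull current ones from RIPE):

    # Block everyone by default, then open up trusted ranges
    Order Deny,Allow
    Deny from all
    # Placeholder UK ISP ranges - substitute real RIPE data
    Allow from 81.0.0.0/8
    Allow from 86.128.0.0/10
    # Individual non-UK addresses added later from raw log activity
    Allow from 203.0.113.7

With Order Deny,Allow, the Allow lines are evaluated last and win, so anything matching an Allow range gets through and everything else is denied.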
Two tiers of blocking.
The only way to deal with that is to IP-ban them.
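On an otherwise-open site, a single abusive IP can be dropped like this (the address is a documentation-range example, not a real offender):

    # Open by default, ban specific offenders
    Order Allow,Deny
    Allow from all
    Deny from 203.0.113.45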
Put the static pages and common files in / with relatively loose root-level .htaccess controls, then put the search interface (and any supporting files) in a subdirectory with tight directory-level .htaccess controls? Or alternatively, on a subdomain?
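A sketch of that two-tier layout, reusing the placeholder ranges from above; the /search/ directory name is just an example:

    # /.htaccess - loose: static pages open to everyone
    Order Allow,Deny
    Allow from all

    # /search/.htaccess - tight: deny all, allow UK ranges only
    Order Deny,Allow
    Deny from all
    Allow from 81.0.0.0/8
    Allow from 86.128.0.0/10

The per-directory file overrides the root one for requests under /search/, so bots can still fetch the static pages that search engines need while the expensive search interface stays locked down.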
I'm in the UK and I had a similar problem. I now have a .htaccess file several thousand lines long.