inbound - 10:04 pm on Oct 3, 2011 (gmt 0)
Allowing a few queries and then checking that the user is human with a captcha is a fairly common approach on webmaster-targeted sites (such as for whois, speedtest, dns test, geo ip, etc).
Indeed, this is likely to be the method for the areas that are aimed at webmasters. Unfortunately, the main part of the site (being aimed at average users, with no big carrot enticing them) would probably not tolerate any hurdles, even small ones.
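A minimal sketch of the "few free queries, then captcha" idea could look like the following. This is an assumption about the implementation, not anything from the thread: the threshold, window length, and function names are all placeholders, and a real site would persist counts (e.g. in memcached or a database) rather than in process memory.

```python
import time
from collections import defaultdict

FREE_QUERIES = 5         # assumed number of queries allowed before a captcha
WINDOW_SECONDS = 3600    # assumed rolling window for counting queries

# In-memory store: IP -> timestamps of recent queries (illustration only)
_query_log = defaultdict(list)

def needs_captcha(ip, now=None):
    """Return True once an IP has used up its free queries in the window."""
    now = time.time() if now is None else now
    # Drop timestamps that have aged out of the window
    recent = [t for t in _query_log[ip] if now - t < WINDOW_SECONDS]
    _query_log[ip] = recent
    if len(recent) >= FREE_QUERIES:
        return True
    recent.append(now)
    return False
```

The first `FREE_QUERIES` calls for an IP return `False` (serve the result), and subsequent calls within the window return `True` (show a captcha before serving).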
If you want to be listed in search engines such as Google, do not block US IPs.
That's part of the issue: I don't want more than a couple of static pages to be listed in any search engine (and those will be treated differently); the site behind the search interface is for humans only. I know that might sound strange, as our other sites welcome search engine traffic, but the new site just does not suit being returned as a search result...
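Keeping everything except a couple of static pages out of the index can be done with a robots.txt along these lines (the paths here are placeholders, not from the thread; Google honours the `Allow` directive, though not every crawler does):

```
User-agent: *
Disallow: /
Allow: /$
Allow: /about.html
```

For belt-and-braces, the pages behind the search interface could also send a `noindex` robots meta tag, which covers the case where a blocked URL is discovered via external links.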