Msg#: 3357921 posted 12:57 pm on Jun 4, 2007 (gmt 0)
I've just been crawled by another Nutch variant. Rather than having to add the user agent to robots.txt for each variant, is it possible to use a wildcard to disallow all spiders which have 'Nutch' anywhere in the user agent?
Msg#: 3357921 posted 7:33 pm on Jun 4, 2007 (gmt 0)
According to the Nutch website:
Different installations of the Nutch software may specify different agent names, but all should respond to the agent name "Nutch". Thus to ban all Nutch-based crawlers from your site, place the following in your robots.txt file:
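The actual snippet didn't get pasted into the post above; per the Nutch documentation, all Nutch-based crawlers respond to the agent token "Nutch", so the standard rule is:

```
User-agent: Nutch
Disallow: /
```

Note this works because every Nutch installation is supposed to honor the shared "Nutch" token in addition to its own agent name, so you don't need a wildcard match on the user-agent string (which robots.txt doesn't support anyway).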
Msg#: 3357921 posted 8:37 pm on Jun 5, 2007 (gmt 0)
The only way to truly block all the noise, including all the Nutch variants, is to write your robots.txt and .htaccess files in WHITELIST format, so everything you haven't explicitly approved goes away. Tell 'em nicely in robots.txt and keep 'em out by force in .htaccess.
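A minimal sketch of the whitelist approach. First, robots.txt names only the bots you want (the agent tokens shown are examples; swap in whichever crawlers you actually welcome) and disallows everyone else:

```
User-agent: Googlebot
Disallow:

User-agent: Slurp
Disallow:

User-agent: *
Disallow: /
```

Then enforce it for bots that ignore robots.txt. Assuming Apache with mod_setenvif, something like this in .htaccess denies anything identifying as Nutch regardless of the variant name:

```apache
# Flag any user agent containing "nutch" (case-insensitive)
SetEnvIfNoCase User-Agent "nutch" bad_bot

# Deny flagged requests outright
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

A true whitelist .htaccess would instead flag the agents you allow and deny everything unflagged, but that takes careful testing so you don't lock out browsers; the deny-by-pattern version above is the safer first step.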