Lately I have been getting hit pretty hard by LexiBot. It totally disregards robots.txt and downloads massive amounts of pages, and as a result it is chewing up bandwidth. From what I have read, this thing is a cross between an offline browser and a desktop search tool. Are you people allowing or banning it from your sites? I guess the only way to bar it is to use .htaccess, since it seems to be oblivious to the robots exclusion standard.
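For anyone wanting to try the .htaccess route, here is a minimal sketch of a user-agent block for Apache with mod_rewrite enabled. The match string "lexibot" is an assumption; check your access logs for the exact User-Agent string the crawler actually sends before relying on this.

```apache
# Minimal .htaccess sketch: return 403 Forbidden to any request
# whose User-Agent contains "lexibot" (case-insensitive).
# NOTE: "lexibot" is an assumed match string; verify it against
# your own access logs first.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} lexibot [NC]
RewriteRule .* - [F,L]
```

Keep in mind this only works as long as the bot keeps sending an identifiable User-Agent; if it starts spoofing a browser string, you would have to fall back on blocking its IP ranges with Deny directives instead.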