The robots.txt file is not an access-control mechanism. Its purpose is to request that crawlers stay out of certain parts of your site, and it only works if the visiting robot obeys the Robots Exclusion Protocol.
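As a minimal illustration (the path is hypothetical), a robots.txt that asks all compliant crawlers to skip one directory looks like this:

```
# Applies to every user agent; only polite robots will honor it
User-agent: *
Disallow: /private/
```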
In practice, only the main search engine spiders reliably obey this protocol. Others simply ignore it and continue to spider your site anyway, whether that's an email harvester hunting for addresses or a rogue spider from an unknown source.
The only way to block these is at the server level: either via a .htaccess file, or in httpd.conf if you have root access on an Apache server.
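A minimal sketch of such a server-level block in .htaccess, using mod_rewrite to refuse requests by User-Agent string (the bot names here are illustrative; substitute the ones you actually see in your logs):

```apache
# Requires mod_rewrite to be enabled on the server
RewriteEngine On
# Match suspect User-Agent strings, case-insensitively
RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC]
# Return 403 Forbidden for any matching request
RewriteRule .* - [F,L]
```

Unlike robots.txt, this is enforced by the server itself, so it works regardless of whether the robot chooses to behave.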
For a comprehensive starting point, see the "perfect .htaccess ban list" thread.