Sorry if this is a stupid question (as you can see, I'm a total newbie at this!), but can anyone please help? Isn't it a better approach to simply not put files you don't want robots to crawl on the host, rather than using robots.txt? Thanks a lot.
It's not a must for SEO, but it's better to have one, even if it's a blank file, to stop serving 404s to the spiders.
The robots.txt file is not an access-control mechanism; its main purpose is to ask robots to stay out of certain parts of your site, and it only works if the visiting robot obeys the Robots Exclusion Protocol.
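For example, a minimal robots.txt placed in the site root might look like this (the /private/ path is just a placeholder for whatever you want compliant spiders to skip):

    User-agent: *
    Disallow: /private/

An empty robots.txt is also valid; it simply tells compliant spiders they may crawl everything, and it stops the 404s mentioned above.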
It's normally only the main search engine spiders that obey this protocol; others just ignore it and carry on crawling your site. It could be an email harvester looking for addresses or a rogue spider from some unknown source.
The only way to block these is either via an .htaccess file or, if you have root access and are running Apache, in the httpd.conf file.
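As a rough sketch (this uses the older Order/Deny style Apache directives, and "BadBot" is a made-up name; substitute whatever User-Agent string shows up in your access logs), an .htaccess entry to block a bot by its User-Agent could look like:

    # Flag requests whose User-Agent contains "BadBot" (hypothetical name)
    SetEnvIfNoCase User-Agent "BadBot" bad_bot
    # Deny any request that carries the flag
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot

Bear in mind that a determined bot can fake its User-Agent, so this is more of a speed bump than a real lock.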