Msg#: 3787861 posted 5:24 pm on Nov 17, 2008 (gmt 0)
robots.txt is a text file that is placed at the root level of a website. Its purpose is to inform automated programs, also known as search engine spiders, which areas of your website they should or should not access. Search engines normally assume they can crawl anything publicly available online, so unless you specifically block an area they will probably try to access it. Please remember that robots.txt is a voluntary protocol: if an automated program is misbehaving (through a mistake or intentional programming), it can still access areas of your site that are blocked by robots.txt.
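For illustration, here is a minimal robots.txt sketch. The first record lets every spider crawl everything (an empty Disallow means nothing is off limits); the second, alternative record blocks every spider from the whole site. Use one or the other, not both:

  # allow all spiders to crawl everything
  User-agent: *
  Disallow:

  # alternative: block all spiders from the entire site
  User-agent: *
  Disallow: /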
Some webmasters of simple websites that only contain files they want included in search engines do not even upload a robots.txt. Most webmasters use robots.txt to block search engines from accessing folders containing stats, user data, scripts, and other things they do not want listed in search engines.
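A typical robots.txt of that kind might look like the sketch below. The folder names here are just hypothetical examples, not anything your site must have:

  User-agent: *
  Disallow: /stats/
  Disallow: /cgi-bin/
  Disallow: /members/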
If you want to properly secure and protect those files you should use an .htaccess file (on Apache) or an ISAPI filter (on IIS). An .htaccess file lets webmasters actually block access, whereas robots.txt simply asks spiders to be nice and stay out of certain areas.
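As a rough sketch of the .htaccess approach on Apache (this assumes the Apache 2.2-era Order/Deny syntax, and the .htpasswd path is a placeholder you would change), drop a file like this into the folder you want to protect. The first pair of directives refuses all visitors outright; the alternative block below it password-protects the folder instead:

  # refuse everyone, spiders and humans alike
  Order allow,deny
  Deny from all

  # alternative: require a username and password instead
  AuthType Basic
  AuthName "Private area"
  AuthUserFile /path/to/.htpasswd
  Require valid-user

Either way the server itself enforces the block, which is the point: a misbehaving spider can ignore robots.txt, but it cannot ignore a 403 response.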