Forum Moderators: goodroi
Warm greetings to you all.
I am new to this forum and know very little about search engines, so I would like to ask: what is robots.txt?
What is its function, and how does it work? Please tell me about the importance of robots.txt in terms of search engines. How helpful is it?
Looking forward to your valuable comments.
Robots.txt is a text file placed at the root level of a website. Its purpose is to tell automated programs (aka search engine spiders) which areas of your website they should or should not access. Search engines normally assume they can spider anything online, so unless you specifically block something, they will probably try to access it. Please remember that robots.txt is a voluntary protocol: if an automated program is misbehaving (because of a mistake or intentional programming), it can still access areas of your site that are blocked by robots.txt.
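For example, a minimal robots.txt blocking a couple of folders might look like this (the folder names here are just placeholders):

```
# Applies to all spiders
User-agent: *
Disallow: /stats/
Disallow: /cgi-bin/
```

"User-agent: *" means the rules apply to every spider; each "Disallow" line asks spiders to stay out of that path.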
Some webmasters of simple websites, which contain only files they want included in search engines, do not even upload a robots.txt. Most webmasters use robots.txt to block search engines from accessing folders containing stats, user data, scripts and other things they do not want listed in search engines.
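To illustrate how a well-behaved crawler consults these rules, here is a short sketch using Python's standard urllib.robotparser module. The rules and URLs are made up for the example; a real crawler would fetch the file from yoursite.com/robots.txt instead of parsing it inline.

```python
from urllib import robotparser

# Parse some example robots.txt rules directly (normally fetched from the site).
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /stats/",
])

# A compliant spider checks before fetching each URL.
print(rp.can_fetch("*", "https://example.com/stats/report.html"))  # False: blocked
print(rp.can_fetch("*", "https://example.com/index.html"))         # True: allowed
```

Note that this check is entirely on the crawler's side, which is exactly why robots.txt is only a polite request, not real protection.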
If you want to properly secure and protect those files, you should use an .htaccess file (on Apache) or an ISAPI filter (on IIS). .htaccess lets webmasters actually block access, whereas robots.txt simply asks spiders to be nice and not index certain areas.
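As a rough sketch of the Apache approach: an .htaccess file placed inside the directory you want to protect could deny all HTTP access with something like this (this assumes Apache 2.4's authorization syntax; older 2.2 servers use "Order deny,allow" / "Deny from all" instead):

```
# Deny all HTTP access to this directory and everything under it
Require all denied
```

Unlike robots.txt, the server itself enforces this, so even a misbehaving spider gets a 403 Forbidden response.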
You may also want to read this old thread on the same topic.