robots.txt file question
msg:1529572 11:42 am on Oct 9, 2002 (gmt 0)
I am reworking an old site and found a robots.txt file in the root directory. Its contents are:
User-agent: *
Disallow: /admin/
I don't know what this means. I definitely want this site and all its files crawled by Google and other search engines.
Is this file preventing the crawl? Would deleting it altogether be better?
chiyo msg:1529573 11:45 am on Oct 9, 2002 (gmt 0)
It is telling all spiders not to spider any files in the /admin/ directory only. From the sound of the name, it's probably a directory with files you don't want spidered, e.g. drafts, programs, caches, etc.
korkus2000 msg:1529574 11:46 am on Oct 9, 2002 (gmt 0)
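As a quick sketch of what that file does, you can feed the same rules to Python's urllib.robotparser and check individual URLs (the example.com domain and paths here are hypothetical, just for illustration):

```python
from urllib import robotparser

# Hypothetical rules identical to the robots.txt found on the site.
rules = """User-agent: *
Disallow: /admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Only /admin/ is blocked, for every crawler; everything else stays crawlable.
print(rp.can_fetch("Googlebot", "http://example.com/admin/users.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))        # True
```

So the file only fences off /admin/; deleting it would just expose that directory to crawlers as well.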
I think you want to keep crawlers out of your admin area. I would keep it. Here is a tutorial to help you.