I am reworking an old site and found a robots.txt file in the root directory. Contents are: User-agent: * Disallow: /admin/
I don't know what this means. I definitely want this site and all its files crawled by Google and other search engines.
Is this file preventing the crawl? Would deleting it altogether be better?
Thanks, Steve
chiyo
11:45 am on Oct 9, 2002 (gmt 0)
It is telling all spiders not to spider any files in the /admin/ directory only; everything else on the site can still be crawled. From the sound of the name, it's probably a directory with files you don't want spidered, e.g. drafts, programs, caches, etc.
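For reference, here is how the two common cases look in standard robots.txt syntax (the second form is equivalent to having no robots.txt at all):

```
# What you have now: block all spiders from /admin/ only
User-agent: *
Disallow: /admin/

# Allow everything: an empty Disallow permits the whole site
User-agent: *
Disallow:
```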
korkus2000
11:46 am on Oct 9, 2002 (gmt 0)
I think you want to keep crawlers out of your admin area, so I would keep the file as it is. Here is a tutorial to help you.