Forum Moderators: goodroi

robots.txt file question?


printlist

11:42 am on Oct 9, 2002 (gmt 0)

10+ Year Member



I am reworking an old site and found a robots.txt file in the root directory. Contents are:
User-agent: *
Disallow: /admin/

I don't know what this means. I definitely want this site and all its files crawled by Google and other search engines.

Is this file preventing the crawl? Would deleting it altogether be better?

Thanks,
Steve

chiyo

11:45 am on Oct 9, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



It is telling all spiders not to spider any files in the /admin/ directory only; everything else on the site is still crawlable. From the sound of the name, it's probably a directory with files in it you don't want spidered! e.g. drafts, programs, caches, etc.
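If you want to double-check what those two lines actually do, here is a minimal sketch using Python's standard-library robots.txt parser (the example.com URLs are just placeholders, not the poster's real site):

```python
from urllib import robotparser

# The exact rules from the robots.txt file in question.
rules = [
    "User-agent: *",
    "Disallow: /admin/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The /admin/ directory is off-limits to every crawler...
print(rp.can_fetch("Googlebot", "https://example.com/admin/settings.html"))  # False

# ...but everything outside /admin/ is still allowed.
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

So the file only fences off /admin/; it does not block crawling of the rest of the site.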

korkus2000

11:46 am on Oct 9, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think you want to keep crawlers out of your admin area. I would keep it. Here is a tutorial to help you.

[searchengineworld.com...]