
Forum Moderators: goodroi

robots.txt file question?

11:42 am on Oct 9, 2002 (gmt 0)

10+ Year Member



I am reworking an old site and found a robots.txt file in the root directory. Contents are:
User-agent: *
Disallow: /admin/

I don't know what this means. I definitely want this site and all its files crawled by Google and other search engines.

Is this file preventing the crawl? Would deleting it altogether be better?

Thanks,
Steve

11:45 am on Oct 9, 2002 (gmt 0)

WebmasterWorld Senior Member chiyo is a WebmasterWorld Top Contributor of All Time 10+ Year Member



It is telling all spiders not to spider files in the /admin/ directory only. From the sound of the name, it's probably a directory with files you don't want spidered! e.g. drafts, programs, caches etc.
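If you want to double-check that for yourself, Python's standard library ships a robots.txt parser. A minimal sketch (the example.com URLs are placeholders, not your actual site):

```python
# Check what "User-agent: * / Disallow: /admin/" actually blocks,
# using Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

# Pages under /admin/ are off-limits to every crawler...
print(rp.can_fetch("Googlebot", "https://example.com/admin/settings.html"))  # False
# ...but the rest of the site remains fully crawlable.
print(rp.can_fetch("Googlebot", "https://example.com/index.html"))  # True
```

So the file only fences off /admin/; it does not stop search engines from crawling the rest of the site.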
11:46 am on Oct 9, 2002 (gmt 0)

WebmasterWorld Senior Member korkus2000 is a WebmasterWorld Top Contributor of All Time 10+ Year Member



I think you want to keep crawlers out of your admin area. I would keep it. Here is a tutorial to help you.

[searchengineworld.com...]
