
Multiple Directories

robots.txt, subdirectories, protected directories

Ansuz

1:47 pm on Feb 28, 2005 (gmt 0)

Greetings all. My question concerns multiple directories under the server root, outside of the main web docs folder...

Currently running Apache under Linux, but a control panel manages everything. Non-protected and protected directories are handled separately under the root (separate /httpdocs and /httpsdocs, each with the usual /cgi-bin, etc. underneath). Needless to say it's occasionally a pain.
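For reference, the layout looks roughly like this (the directory names are just what the control panel set up, so yours may differ):

/ (server root, outside the web docs)
    /httpdocs      (standard HTTP content)
        /cgi-bin
    /httpsdocs     (secure HTTPS content)
        /cgi-bin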

When creating a robots.txt, should I treat secure folders and files as a subdomain? In other words, do I need to have a separate robots.txt in the secure directory?

If that's the case, is the rule:

User-Agent: *
Disallow: /https

irrelevant in the standard web docs folder's robots.txt, or does it also prevent robots from *following* links on a page into the secure directory? (Or should I use it in both directories?)
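To make it concrete, here's what I picture if crawlers really do fetch robots.txt separately for the http and https sites (example.com is just a placeholder for my domain):

# https://example.com/robots.txt (lives in /httpsdocs): block the whole secure site
User-agent: *
Disallow: /

# http://example.com/robots.txt (lives in /httpdocs): no /https rule needed here,
# since no /https path exists under /httpdocs
User-agent: *
Disallow: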

Related: is it bad form to use both a robots meta tag and a robots.txt? For instance, on pages with links to secure files/folders I don't want indexed, can I use a meta tag (index, nofollow) to keep spiders from following those links?
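In other words, something like this in the <head> of the linking page (my understanding, which may be off, is that "index" still lets the page itself be indexed while "nofollow" tells compliant spiders not to follow any of its links, including the ones into the secure area):

<meta name="robots" content="index, nofollow">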

Thanks much in advance.

 
