Forum Moderators: mack


robots.txt file and sub domains


bohemian

12:56 am on Sep 13, 2004 (gmt 0)

10+ Year Member



Hi All,

I'm puzzled about how robots.txt works for subdomains. If I want to disallow robots from sub_domain.my_site_com, can I do:

1).

User-agent: *
Disallow: /sub_domain/

Am I doing this right? And what about sub_sub_domain.sub_domain.my_site_com?

2).

User-agent: *
Disallow: /sub_domain/sub_sub_domain/

Or if I disallow spidering "sub_domain", does that disallow spidering "sub_sub_domain" too? In other words, do I only need the No. 1 rules, or do I need both No. 1 and No. 2 in the robots.txt file?

Thank you,

encyclo

1:02 am on Sep 13, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sub-domains are seen by Google (and all other search engine bots) as completely separate domains, so you need to put a robots.txt file in the document root of each sub-domain you have. A rule like "Disallow: /sub_domain/" in the main domain's robots.txt only blocks a directory path on that host; it has no effect on the sub-domain itself.

If you want to completely exclude a sub-domain, just use:

User-agent: * 
Disallow: /

In the document root of that particular sub-domain.
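To illustrate how a compliant crawler interprets that rule, here is a minimal sketch using Python's standard urllib.robotparser and the poster's placeholder hostname. It assumes the two-line robots.txt above is served from the sub-domain's document root, and shows that "Disallow: /" blocks every path on that host:

```python
from urllib.robotparser import RobotFileParser

# The rules served at http://sub_domain.my_site_com/robots.txt
# (hostname is the placeholder from this thread).
rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(rules)

# Every URL on this sub-domain is now off-limits to compliant bots.
print(parser.can_fetch("*", "http://sub_domain.my_site_com/"))          # False
print(parser.can_fetch("*", "http://sub_domain.my_site_com/any/page"))  # False
```

Because the crawler fetches robots.txt separately for each host, this file affects only sub_domain.my_site_com; the main domain and any other sub-domains keep their own rules.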

You might want to check out the Robots.txt forum [webmasterworld.com] for more information.

bohemian

4:50 am on Sep 13, 2004 (gmt 0)

10+ Year Member



Thanks for the reply encyclo,

I didn't know there was a robots.txt forum here.

Thanks again!