
Forum Moderators: goodroi


Disallowing Sub-Domain

3:16 pm on Aug 27, 2009 (gmt 0)

New User

5+ Year Member

joined:Jan 6, 2009
votes: 0

Hey everyone.

My website is example.com, and I have been developing it under beta.example.com

I had hoped search engines wouldn't find it, but they have, so now I need to hide it with robots.txt. But I'm really scared that putting a robots.txt in the root of beta.example.com will screw up my main site.

So, are the search engines supposed to look at [beta.example.com...] where I'd put:

User-Agent: *
Disallow: /

Thanks for your help, I gotta get this blocked soon!

[edited by: goodroi at 12:50 pm (utc) on Aug. 28, 2009]
[edit reason] Examplified [/edit]

3:27 pm on Aug 27, 2009 (gmt 0)

Senior Member

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 31, 2002
votes: 0

As long as client requests for beta.example.com/robots.txt and example.com/robots.txt return the two different (and correct) robots.txt files, you'll be fine (you can easily test this using your browser).

Search engine robots expect <any-subdomain>.example.com and example.com to be separate Web sites, each with its own unique robots.txt and unique pages, so your approach is the correct one. It's when these requests are *not* handled separately by the server that you can get into trouble -- typically because of duplicate-content issues.
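If both hosts are served from the same document root, one way to return a separate file per host is a rewrite rule. This is only a sketch, assuming an Apache server with mod_rewrite enabled; the filename robots-beta.txt is a made-up example:

```apache
# Hypothetical .htaccess sketch: when the request arrives on the beta
# subdomain, serve robots-beta.txt (which contains the Disallow: / rules)
# instead of the main site's robots.txt.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^beta\.example\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-beta.txt [L]
```

If the subdomain has its own document root, none of this is needed -- just drop the blocking robots.txt into that root. Either way, fetch both URLs in a browser afterward to confirm each host returns the file you expect.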