
Forum Moderators: goodroi


Disallowing Sub-Domain

     
3:16 pm on Aug 27, 2009 (gmt 0)

5+ Year Member



Hey everyone.

My website is example.com, and I have been developing it under beta.example.com

I had hoped search engines wouldn't find that, but they have, so now I need to hide it with robots.txt. But I'm really scared that putting a robots.txt in the root of beta.example.com will screw up my main site.

So, are the search engines supposed to look at [beta.example.com...] if I put:

User-Agent: *
Disallow: /
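
(For what it's worth, a quick way to sanity-check that those two lines block everything is a minimal sketch with Python's standard urllib.robotparser -- this assumes the beta host is reachable and serving the file over plain HTTP:)

from urllib.robotparser import RobotFileParser

# Fetch and parse the beta subdomain's live robots.txt
rp = RobotFileParser()
rp.set_url("http://beta.example.com/robots.txt")
rp.read()

# With "Disallow: /" under "User-Agent: *", every path should be blocked
# for every crawler that honors robots.txt.
print(rp.can_fetch("*", "http://beta.example.com/"))                  # expect: False
print(rp.can_fetch("Googlebot", "http://beta.example.com/any-page"))  # expect: False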

Thanks for your help, I gotta get this blocked soon!

[edited by: goodroi at 12:50 pm (utc) on Aug. 28, 2009]
[edit reason] Examplified [/edit]

3:27 pm on Aug 27, 2009 (gmt 0)

WebmasterWorld Senior Member jdmorgan (Top Contributor of All Time, 10+ Year Member)



As long as client requests for beta.example.com/robots.txt and example.com/robots.txt return the two different (and correct) robots.txt files, you'll be fine. (You can easily test this using your browser.)

Search engine robots expect <any-subdomain>.example.com and example.com to be separate Web sites, each with its own unique robots.txt and unique pages, so your approach is the correct one. It's when these requests are *not* handled separately by the server that you can get into trouble -- typically because of duplicate-content issues.
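
If you'd rather script that browser check, here's a minimal sketch along the same lines (Python standard library only; it assumes both hostnames resolve and serve robots.txt over plain HTTP):

from urllib.request import urlopen

# Fetch each host's robots.txt; when the server handles the two hosts
# separately, the files should come back different.
beta = urlopen("http://beta.example.com/robots.txt").read().decode("utf-8", "replace")
main = urlopen("http://example.com/robots.txt").read().decode("utf-8", "replace")

print("--- beta.example.com/robots.txt ---")
print(beta)
print("--- example.com/robots.txt ---")
print(main)
print("Served separately:", beta != main)  # expect: True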

Jim

 
