Forum Moderators: open


Two domains, same folder

Want to disallow googlebot on one domain


JohnC

2:38 pm on Feb 4, 2003 (gmt 0)

10+ Year Member



Hello,

I have two domains pointing to the same website/folder on the server (I know this is bad). I use one domain for PPC and the other for everything else. Google is currently indexing both of them, but I know the clock is ticking toward a penalty, so I need to prevent googlebot from indexing domain1.com while continuing to index domain2.com.

I know a robots.txt file would normally do the trick, but both domains share the same root, so a disallow there would block indexing of both domains at once.

Any suggestions?

hetzeld

2:59 pm on Feb 4, 2003 (gmt 0)

10+ Year Member



Hi JohnC,

I did this once for one of my clients and found that the easiest way to go was URL rewriting.
If you define a conditional rule that serves a different file when robots.txt is requested, depending on the HTTP_HOST, it should work fine.
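As a sketch of that idea (assuming Apache with mod_rewrite; the file name robots-domain1.txt is just a placeholder), the shared root's .htaccess could do something like:

```apache
# When the request arrives on domain1.com, serve an alternate robots file
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?domain1\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-domain1.txt [L]
```

robots-domain1.txt would then carry the Disallow rules for Googlebot, while the normal robots.txt served on domain2.com stays wide open.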

Dan

<edit> Sorry, I forgot to welcome you! Didn't realize it was your first post here</edit>

heini

3:07 pm on Feb 4, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Welcome JohnC

For a quick fix you could put up a robots.txt allowing/disallowing specific subdirs.
Try this tutorial:
[searchengineworld.com...]
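For reference, a minimal robots.txt along those lines might look like this (the /ppc/ path is only an example, and this approach assumes the pages you want hidden sit in their own subdirectory):

```
User-agent: Googlebot
Disallow: /ppc/
```

Since both domains here share one root, this only helps if the PPC-only content can be separated into such a subdirectory; otherwise the rewriting approach above the thread describes is the cleaner route.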