Forum Moderators: open
I have two domains pointing to the same website/folder on the server (I know this is bad). I use one domain for PPC and the other for everything else. Google is currently indexing both of them, but I know the clock is ticking toward a duplicate-content penalty, so I need to know how to prevent Googlebot from indexing domain1.com while continuing to index domain2.com.
I know a robots.txt file would normally do the trick, but both domains share the same root, so a disallow would block indexing of both domains since they share the same robots.txt.
Any suggestions?
I did this once for one of my clients and found that the easiest way to go was URL rewriting.
If you define a conditional rewrite rule that serves a different file when robots.txt is requested, depending on HTTP_HOST, it should work fine.
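Something along these lines in .htaccess should do it. This is a rough sketch, assuming mod_rewrite is enabled; the file name robots-blocked.txt is just a placeholder for a second robots file you would create in the web root:

```apache
RewriteEngine On

# When robots.txt is requested on domain1.com (the PPC domain),
# silently serve robots-blocked.txt instead.
RewriteCond %{HTTP_HOST} ^(www\.)?domain1\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-blocked.txt [L]
```

And robots-blocked.txt would simply shut out all crawlers:

```
User-agent: *
Disallow: /
```

Requests for robots.txt on domain2.com fall through to the normal file, so that domain keeps getting indexed.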
Dan
<edit> Sorry, I forgot to welcome you! Didn't realize it was your first post here</edit>
For a quick fix you could put up a robots.txt allowing/disallowing specific subdirectories.
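For example, if the PPC landing pages lived under their own subdirectory (the /ppc/ path here is hypothetical), a single shared robots.txt could block just that part of the site:

```
User-agent: *
Disallow: /ppc/
```

This only helps if the content you want hidden is actually separated into its own directory, which isn't the case when both domains serve the identical tree.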
Try this tutorial:
[searchengineworld.com...]