Forum Moderators: open
For the moment (paranoid about getting banned or otherwise penalized for duplicate content) we've added a <meta NAME="ROBOTS" CONTENT="NOINDEX,NOFOLLOW"> tag to the <HEAD> section of all the subdomains/branded sites.
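For anyone unfamiliar with the tag, here's roughly what the <HEAD> of each branded page looks like (the title and comments are placeholders, not our actual markup):

```html
<html>
<head>
  <title>Branded subdomain page</title>
  <!-- tells compliant spiders: don't index this page, don't follow its links -->
  <meta name="robots" content="noindex,nofollow">
</head>
<body>
  <!-- duplicated page content goes here -->
</body>
</html>
```

Tag-name casing isn't significant in HTML, so NOINDEX,NOFOLLOW and noindex,nofollow are equivalent.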
Is there any (acceptable) way of using the links to our branded sites to add to the main site's PR? I was thinking we could redirect googlebot and/or other bots to our main site if they try to visit the subdomains, but I don't expect that's appropriate. At the very least, is there a way to allow spiders to crawl the subdomains without duplicate content penalties?
Main options:
1) Remove the noindex/nofollow tag from the branded sites and
include a link in the footer to the main site.
Pros: Simple, easy, legitimate.
Cons: Likely duplicate content penalty.
2) Redirect spiders to the main site when they
try to visit the subdomains.
Pros: Fairly comprehensive solution, covers most SE's.
Cons: Could be considered a spam tactic; risk of a PR0 penalty.
3) Leave the branded sites as they are.
Pros: 100% kosher, no risk.
Cons: We could really use the extra PR.
4) Use frames, navbar in top frame, our site in bottom frame.
Pros: Sounds fully legitimate, simple to implement.
Cons: Looks slightly different, don't know how spiders will handle it.
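A rough sketch of what option 4 might look like (filenames and the example.com URL are placeholders; I haven't tested how spiders actually treat this):

```html
<html>
<head><title>Branded site</title></head>
<!-- top frame: branded navbar; bottom frame: the main site -->
<frameset rows="80,*">
  <frame src="navbar.html" name="nav">
  <frame src="http://www.example.com/" name="main">
  <noframes>
    <body>
      <!-- fallback for browsers/spiders that don't render frames -->
      <a href="http://www.example.com/">Visit our main site</a>
    </body>
  </noframes>
</frameset>
</html>
```

From what I understand, spiders that don't handle frames fall back to the <noframes> content, so whatever goes there is what would get indexed and followed.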
I *CAN*NOT* risk being banned/penalized/etc, and I don't even WANT to try to "cheat" google in any way so I'm leaning towards option #4 at this point.
Recommendations? Other options/suggestions?
This example is for an article, not an entire site. Having multiple clone sites on subdomains could very well trip the dup filter trap. If I were you I would set about offering different content for each subdomain, but keep the sites suitable for their use.
Maybe if you give a bit more information about why you need 30 branded sites we might be able to help you out more.
Mack.
[ourwebsite.com...]
(or whatever else winds up sounding like a good idea...)
It's just that this presents all sorts of SE problems/questions/etc. :(
We just don't have the resources right now to build enough custom content for each sub-site to avoid duplicate-content traps (and I'd prefer not to force google to crawl all these duplicate sites anyway).