Msg#: 16121 posted 7:24 pm on Aug 14, 2003 (gmt 0)
I have a site, for example www.widgets.com, and I am getting ready to run an ad in a magazine. I thought it would be easy to track web leads from the ad if I put a different URL in it. So I figured that if I put something.widgets.com in the ad, then whoever hits that subdomain must have seen the ad. Make sense? But I don't want this URL to get indexed, for fear of possible duplicate content in Google. How can I avoid this? Do I need to set up another server with a meta redirect and exclude it via robots.txt, or can I exclude a subdomain from my existing robots.txt? If so, how?
Msg#: 16121 posted 7:39 pm on Aug 14, 2003 (gmt 0)
The company I work for buys up a lot of domain names that are industry related. At a later time my department goes back and does something with most of the domains. Not duplicate content; we just decide to break the company's products into category specific sites. Anyway, while the domains are waiting to be turned into a site they all just point to the primary site for the company. Currently something like 18 domain names are still just sitting and pointing to the primary site. To date, this has never created duplicate content.
Now we only submit domains that are working sites. If you actually submitted a domain that redirects to another, I am not sure what would happen. The point to my long drawn out explanation is that as long as you do not submit the extra domain name or link from another site using the extra domain name, you will be ok.
Putting a link on another site using the extra domain would alert the search engines to its existence, but I'm not sure what would happen then.
Msg#: 16121 posted 7:45 pm on Aug 14, 2003 (gmt 0)
Do I need to set up another server with a meta redirect and exclude it via robots.txt, or can I exclude a subdomain from my existing robots.txt? If so, how?
You will need to set up another site, not server, where subdomain.example.com points to the directory with your ad. Then just put a new robots.txt in that directory that denies all bots.
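For reference, a deny-all robots.txt is just two lines; served from the subdomain's root, it tells every compliant bot to stay out of everything under it:

```
User-agent: *
Disallow: /
```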
Like aaronjf said though, if you don't link to the subdomain the bots won't find it.
BTW, if you use [something.example.com...] make sure you also have [example.com...] pointing to the same place. It's amazing how many people don't realise a site can work without the www, and visitors insist on typing addresses that way (followed by emails and phone calls saying "your site doesn't work")...
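If you're on Apache, one way to answer for both hostnames from the same site is a ServerAlias in the virtual host (just a sketch; the hostnames and document root here are examples, adjust them to your setup):

```
<VirtualHost *:80>
    # Primary hostname
    ServerName www.example.com
    # Also answer for the bare domain
    ServerAlias example.com
    DocumentRoot /var/www/example
</VirtualHost>
```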
Msg#: 16121 posted 4:38 am on Aug 15, 2003 (gmt 0)
I think I might make a special page for it specifically to go along with the messaging for the ad.
If I understood you right that can work well too. But keep it simple like www.widgets.com/green
Then on that page, link out to the rest of the site so the customer is not trapped there, but don't link into it from the rest of the site, so you can accurately track how well the ad is doing. Then put a meta NOINDEX, FOLLOW on the page so that even if bots somehow find it, they will leave it out of the index. That would probably be a lot less hassle - and a little quicker - than purchasing and setting up a secondary domain that points to your primary just to track traffic. Not to mention it would save you a couple of bucks.
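For reference, that meta tag goes in the page's head section and looks like this:

```
<head>
    <!-- keep this page out of the index, but let bots follow its links -->
    <meta name="robots" content="noindex,follow">
</head>
```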
Msg#: 16121 posted 10:28 pm on Aug 15, 2003 (gmt 0)
I don't want this url to get indexed or anything for the fear of possible duplicate content in Google
Use a 301 redirect from subdomain.widgets.com to www.widgets.com. Then Google won't ever index the subdomain, as it knows the proper location of the site.
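On Apache, one way to set up that 301 is with mod_rewrite in an .htaccess file (a sketch; it assumes both hostnames hit the same docroot and that mod_rewrite is enabled):

```
RewriteEngine On
# Catch requests arriving on the tracking subdomain...
RewriteCond %{HTTP_HOST} ^subdomain\.widgets\.com$ [NC]
# ...and permanently redirect them to the same path on the main hostname
RewriteRule ^(.*)$ http://www.widgets.com/$1 [R=301,L]
```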
I have a domain with .com, .net, .org and .co.uk TLDs all pointing at the same website. I use a 301 redirect on all of them except the .com back to the .com version, and Google now has only that version in its index.
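The same idea with a negative match will fold every extra hostname - the other TLDs as well as the non-www version - into the one canonical site (again a sketch with example names, for Apache mod_rewrite):

```
RewriteEngine On
# Anything that is not the canonical hostname...
RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
# ...gets a permanent redirect to it, preserving the requested path
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```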