
Google SEO News and Discussion Forum

    
Noindexing a subdomain I can't control?
Sand
msg:4595172 - 2:37 am on Jul 22, 2013 (gmt 0)

I'm working with a new vendor. Our technical agreement is set up like this:

I created a new CNAME record for my domain. Now, a (previously unused) subdomain of my site points to a co-branded page on the vendor's site. The effect is that it looks like you're visiting my site when you hit the subdomain, but in reality the content you see on this subdomain is hosted by the vendor.
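For concreteness, the DNS side of that arrangement is just a single record along these lines. This is only a rough illustration: sub.example.com and pages.vendor-site.example are placeholder names, and a CNAME maps the subdomain's hostname to the vendor's hostname, which then serves the co-branded page.

; the previously unused subdomain now resolves to the vendor's co-branded host
sub.example.com.    IN    CNAME    pages.vendor-site.example.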

Now, the problem is that I do not want this content in Google's index, since there isn't anything original about it. It aligns well with what my visitors might be looking for, and it puts some extra money in my pocket -- but the content isn't unique. I'm cool with that. Let me send my own visitors there; I don't need Google's. They'll have a good experience.

I can't control robots.txt on the subdomain, because thanks to the CNAME those requests go to the vendor's server. I also can't control the meta robots directive, since the content itself is hosted on their servers.

Short of asking them to build a one-off noindex solution for me and me alone, is there any other way that I can prevent the pages on my subdomain from being indexed?

 

aakk9999
msg:4595264 - 9:07 am on Jul 22, 2013 (gmt 0)

Can't you check the HTTP_HOST of the request and, if it is for the subdomain, rewrite the robots.txt request to a different file where you block crawling?
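A minimal sketch of that idea, assuming Apache with mod_rewrite, and assuming the server you can configure actually answers requests for the subdomain (sub.example.com and robots-blocked.txt are placeholders, not anything from this thread):

RewriteEngine On
# Only rewrite when the request is for the subdomain's hostname
RewriteCond %{HTTP_HOST} ^sub\.example\.com$ [NC]
# Serve an alternative robots file that disallows all crawling
RewriteRule ^robots\.txt$ /robots-blocked.txt [L]

where robots-blocked.txt would simply contain:

User-agent: *
Disallow: /

Keep in mind that Disallow blocks crawling rather than indexing, so URLs Google already knows about can still show up in the index as bare listings.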

phranque
msg:4595280 - 9:59 am on Jul 22, 2013 (gmt 0)

can you upload a .htaccess file to the root directory of that web server?

lucy24
msg:4595384 - 4:07 pm on Jul 22, 2013 (gmt 0)

Double-checking here, because it sounds like a configuration I met recently on an unrelated site:

Your primary domain is example.com

Your vendor is at spam.example.com (well, maybe not literally, but work with me here)

The primary domain is hosted on your own server, or at least in your own space within shared hosting

The subdomain is physically located in a completely different place, so requests for the subdomain will never see the primary domain's htaccess and/or config.

Is that the setup?

Planet13
msg:4595390 - 4:21 pm on Jul 22, 2013 (gmt 0)

I would just ask the vendor to do a one-off noindex for the subdomain.

I am surprised that other websites haven't asked for that as well, with Panda and everything.

Sand
msg:4595433 - 6:26 pm on Jul 22, 2013 (gmt 0)

That's exactly right -- that is precisely the setup lucy24 described.

JD_Toims
msg:4595536 - 10:59 pm on Jul 22, 2013 (gmt 0)

It's pretty simple for them to noindex the whole thing.
All they have to do is put the following in the .htaccess of the subdomain:

Header set X-Robots-Tag "noindex"
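One caveat, offered as a rough sketch rather than a drop-in fix: that directive needs mod_headers, and Googlebot only sees the header if the URLs stay crawlable (i.e. they are not blocked in robots.txt). A slightly more defensive version of the same directive might look like:

<IfModule mod_headers.c>
# Ask crawlers not to index anything served from this subdomain
Header set X-Robots-Tag "noindex"
</IfModule>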

lucy24
msg:4595547 - 11:38 pm on Jul 22, 2013 (gmt 0)

That's assuming that the two of you-- OP and vendor, I mean-- remain on good terms. But that's a whole nother discussion, having to do with whether it's ever a good idea to let some other person have control of a subdomain under your domain. Seems like there could be calamities in both directions.

rish3
msg:4595555 - 12:25 am on Jul 23, 2013 (gmt 0)

To answer it literally, you could:

a) Run varnish (or a similar reverse proxy) on a new IP address on your host.

b) Replace the subdomain's CNAME with an A record pointing at this new IP address.

c) Put an entry in /etc/hosts on your host that points the subdomain at the vendor's server, so the proxy can still reach them instead of looping back to itself.

d) Within varnish, configure the vendor as the backend and inject the X-Robots-Tag header. Varnish makes injecting HTTP headers easy (see the sketch just below).
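For anyone who would rather not add Varnish, the same reverse-proxy-plus-header-injection idea can be sketched with Apache mod_proxy and mod_headers instead. This is an assumption-laden outline, not the thread's own setup: sub.example.com and vendor-origin.example.net are placeholders for the real hostnames.

# Requires mod_proxy, mod_proxy_http and mod_headers
<VirtualHost *:80>
ServerName sub.example.com

# Forward the original Host header so the vendor serves the co-branded pages
ProxyPreserveHost On
# Point at the vendor's actual origin, not the CNAME'd subdomain, to avoid a loop
ProxyPass / http://vendor-origin.example.net/
ProxyPassReverse / http://vendor-origin.example.net/

# Inject the noindex header on everything proxied through here
Header set X-Robots-Tag "noindex"
</VirtualHost>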

However, if this is the company that I think it is, their whole model for this setup depends on organic search. In fact, a standard part of their contract requires that you point links from your domain to several pages of the subdomain...for this very reason.

Are you sure they have some way to drive traffic other than organic search?

Sand
msg:4595578 - 2:34 am on Jul 23, 2013 (gmt 0)

Yeah, I'm not feeling this setup. I might just have to do some API development and maintain control of everything.

Without going into too many details, my partner operates in a classifieds vertical for which I have a ton of informational content.

I don't expect them to send me any traffic; I send my visitors to these pages through search forms on my site.

Sand
msg:4595756 - 1:44 pm on Jul 23, 2013 (gmt 0)

Thanks everyone for your help. Unfortunately, none of your suggestions will work for my situation. I stayed up all night working with their API and will be taking that route instead so I maintain control.

The stuff we do for Google...
