Forum Moderators: Robert Charlton & goodroi


Hiding affiliate links from Google

         

realmaverick

8:59 pm on Jan 16, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm starting my first affiliate website and I want to get it right the first time. I plan on getting 100% of my traffic from Google organic results.

I've been researching different methods to disguise and hide my affiliate links from Google.

The best method appears to be using URL-shortening services such as bit.ly and then disallowing bit.ly via the robots.txt

Of course this would have the added bonus of hiding affiliate links from visitors.

Is there something I'm missing with this method? It seems perfect but as I say, I want to get it right the first time!

Any advice would be awesome :)

goodroi

9:24 pm on Jan 16, 2012 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



How are you going to disallow bit.ly using your robots.txt?

I prefer to have my outbound affiliate links go through a subfolder on my site (something like example.com/info/exampleaffiliate.html). I then use robots.txt to block Googlebot's access to the /info/ folder and I redirect the traffic sent to exampleaffiliate.html to the external affiliate landing page. This blocks Googlebot from following the links but I have confidence Google can still figure it out using other sources of information (toolbar users, isp log files, etc).

My primary focus for this is not to hide from Google but to keep track of how many outbound clicks I am sending to the affiliate pages. Also IMHO example.com/info/exampleaffiliate.html looks a lot better to users who hover over the links than exampleaffiliate.com/affid?=weare#trackingyou.
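A rough sketch of the setup goodroi describes: block the /info/ folder with `Disallow: /info/` in robots.txt, then have a small redirect script resolve each local path to its affiliate destination. All paths, affiliate IDs and URLs below are invented for illustration, not taken from any real site:

```typescript
// Hypothetical map from "cloaked" local paths to affiliate destinations.
// A real redirect script would look up the requested path here, log the
// click for the outbound-traffic stats goodroi mentions, then send a 302.
const affiliateMap: Record<string, string> = {
  "/info/exampleaffiliate.html": "https://www.example-merchant.com/?affid=12345",
};

// Resolve a request path to its destination, or null if it isn't an
// affiliate page we know about (a real script might 404 in that case).
function resolveAffiliate(path: string): string | null {
  return affiliateMap[path] ?? null;
}
```

The lookup map doubles as the click log's source of truth: every outbound destination is defined in one place on your own server.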

Sgt_Kickaxe

10:39 pm on Jan 16, 2012 (gmt 0)



Doesn't work anymore, goodroi; well, not at keeping Google from knowing about them, anyway. I did exactly what you describe on an older site and got a shock 2 years later when I removed the affiliate pages and removed the robots.txt block. Google Webmaster Tools reported every single one of the aff pages as missing, despite them never having been indexed, and it placed the discovery date after the robots.txt block was already in place (I set it up before I added the pages).

Though you can avoid having such links indexed you cannot "hide" them anymore, not even with javascript, Google's been able to find the page destinations anyway. robots.txt says do not index, not do not crawl through.

The only remaining option that I am aware of is not optimal either: serving up a redirect or blank page to googlebot only... and I suspect they test for that too. (A human review would spot this instantly and score the page accordingly; Google says don't do it.)

Your BEST bet right now is to limit the number of links per page and seriously limit the number of pages that have nothing but affiliate links.

realmaverick

11:48 am on Jan 17, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How are you going to disallow bit.ly using your robots.txt?


Wow, good point. I have no idea what I was thinking. Mixing 2 different ideas, without thinking it through.

robots.txt says do not index, not do not crawl through.


It's the opposite: it says do not crawl through. You're forbidding them access. That's not to say they will honour it, though they should.
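To make the distinction concrete, here is a minimal illustrative robots.txt (not taken from any of the sites discussed). Disallow stops compliant crawlers from fetching; keeping a URL out of the index is a separate signal entirely:

```
# robots.txt: "do not crawl" - compliant bots won't fetch anything under /info/
User-agent: *
Disallow: /info/

# Note: a disallowed URL can still end up indexed from external links alone,
# which is consistent with what Sgt_Kickaxe saw. "Do not index" is a
# separate signal, served on a page the bot IS allowed to fetch:
#   <meta name="robots" content="noindex">
```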

TBH even if google does crawl the pages, I'd still opt to take goodroi's advice, even if it's only to neaten up the destination URL a little.

Thanks a lot guys.

enigma1

1:56 pm on Jan 17, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



you cannot "hide" them anymore, not even with javascript, Google's been able to find the page destinations anyway

You can hide any link with javascript and the googlebot won't see it.
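For what it's worth, the kind of thing enigma1 means looks like this: the destination never appears as an href in the HTML, so a bot that doesn't execute JavaScript has nothing to follow. This is a sketch only; the encoded string, element id and URL are made up, and whether this actually hides anything from Google is exactly what the rest of the thread disputes:

```typescript
// The affiliate URL is stored base64-encoded, so no crawlable href
// appears anywhere in the markup. The string below decodes to a
// made-up example URL; everything here is illustrative.
const encoded =
  "aHR0cHM6Ly93d3cuZXhhbXBsZS1tZXJjaGFudC5jb20vP2FmZmlkPTEyMzQ1";

// Rebuild the destination only at click time.
function decodeDestination(b64: string): string {
  // atob() in a browser; Buffer gives the same result under Node.
  return Buffer.from(b64, "base64").toString("utf8");
}

// In the page, something like:
//   document.getElementById("deal-link")!.addEventListener("click", () => {
//     window.location.href = decodeDestination(encoded);
//   });
```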

atlrus

1:58 pm on Jan 17, 2012 (gmt 0)

10+ Year Member



Google's been able to find the page destinations anyway. robots.txt says do not index, not do not crawl through.


It's actually the opposite, from my experience, i.e. Google is indexing the URLs of those pages (since there are links to them from your website), but does not crawl them. In other words, yes, Google knows that there is a page at that URL, because, again, you link to it, but it has no idea what's on the page.

londrum

2:14 pm on Jan 17, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



even if you could hide the links from their bots, you cant hide them from Chrome. if a user follows the link and they're using chrome, or they've got the toolbar installed, then google will straight away know where it leads. and they will straight away know what the content is as well -- because they have to display it.

i think it's pointless trying to hide affiliate links from google anymore.

enigma1

3:12 pm on Jan 17, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



if a user follows the link and they're using chrome, or they've got the toolbar installed, then google will straight away know where it leads

Although I agree google may know where a user lands after clicking something on a page, I totally disagree that google could use it reliably. The page content is on the client end now, so if that was the case, it could be manipulated to anything you can imagine.

londrum

3:31 pm on Jan 17, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



you could say that about any link though.
it's chrome that receives the data from the server and displays it to the user, so google is bound to be aware of what it is.
i would suggest that there is no possible way to hide an affiliate link from google when you remember all the weapons they've got -- bots, browsers, toolbars and analytics. some of them might be on your site (which you can remove), some might be on the destination site (which you can't), and the rest will be embedded in the user's software. you can't fool all of that.

enigma1

4:02 pm on Jan 17, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The googlebot is a neutral entity that retrieves content directly from the server, whereas a browser that passes information to another party is subject to whatever the user wants to pass; that is my point.

So if the bot sees an "a" tag, or in general something that resembles a link, it can follow it, and it's a trusted source because it comes directly from the page accessed; google then decides what to do next. The rest (toolbars, browsers, etc.) can be manipulated. Chrome displaying a page whose content is identical to what the bot has on record changes nothing; the question is what happens when the user clicks on a particular spot inside the page, and what the browser sends right then can be set to whatever the user wants.

Therefore, setting aside the bot factor, the rest can be manipulated by anyone for any site and can't be trusted. It can also be associated with any number of sites, so I could make it look like you have lots of affiliate links just by using Chrome and setting up web pages on a server as the destination.