If the link to your site is www.example.com?ref=349782595, will Google give you less link juice / benefit than for www.example.com?
I'd love to have some kind of dynamic tracking parameter, but I'm worried that Google might think this is not a natural link and so any kind of linking benefit will be ruined.
In similar situations, people sometimes 1) capture the tracking information, then 2) 301 redirect to the URL without the query string.
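A rough sketch of that pattern (Node/Express here purely for illustration; any server-side language works the same way, and "ref" is just the parameter from your example):

const express = require('express');
const app = express();

app.get('/', (req, res, next) => {
  if (req.query.ref) {
    // 1) capture the tracking information (log it, write it to your analytics store, etc.)
    console.log('ref code:', req.query.ref);
    // 2) 301 redirect to the URL without the query string
    return res.redirect(301, req.path);
  }
  next(); // no tracking code, so render the page normally
});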
I find there are different solutions for implementing tracking codes depending on what you are tracking, how accurately you have to track, which scripting language you are using (PHP, ASP, .NET, etc.), and various other factors.
For example, on the home page of the site I inherited, they might display 3 different links to a single content or product page. The business wanted to know which link placement had the highest click-thru rate (CTR) from consumers viewing the home page. Not understanding the impact of query string parameters on SEO, a previous developer simply added a tracking code to each of the URLs so that our web traffic analyzer software could report on CTR by placement. So the links on the home page might have appeared as follows:
<a href="http://www.example.com/myproductpage?placement=1">productname</a>
<a href="http://www.example.com/myproductpage?placement=2">productname</a>
<a href="http://www.example.com/myproductpage?placement=3">productname</a>
This provided the business with a way to track which link people were clicking on with 100% accuracy (or so they thought), but it also created 3 URLs for the same product page. This creates duplicate content issues, as Tedster mentioned, since there were now 3 URLs with exactly the same content. It gets worse if you think about it.
If 100 other websites link to my product page with the first URL, 50 sites link with the 2nd URL, and 25 sites link with the 3rd URL, my product page does NOT get credit for 175 inbound links. The link equity/link juice/PR for the 175 links is split over 3 URLs rather than focused on a single URL.
Also, if any of the URLs containing the tracking codes actually ranked and got organic traffic from the engines, the organic traffic would distort the CTR reporting unless the traffic analyzer software is told to only count requests for each URL as a click if the referring URL was another page on our site.
I eliminated these types of tracking codes by using cookies. I created a JavaScript function named SetCookieList() to parse a pipe-delimited string of 'cookiename:cookievalue' pairs. This lets me pass multiple internal tracking codes and their values to a target page each time a click event occurs, using links similar to the following (I've sketched the function itself after the converted links below):
<a href="my clean url" onclick="SetCookieList('cookie1:value1¦cookie2:value2¦...¦cookien:valuen');">anchor text</a>
I then converted the above URLs to:
<a href="http://www.example.com/myproductpage" onclick="SetCookieList('placement:1');">productname</a>
<a href="http://www.example.com/myproductpage" onclick="SetCookieList('placement:2');">productname</a>
<a href="http://www.example.com/myproductpage" onclick="SetCookieList('placement:3');">productname</a>
Every time a page on my site loads, the first thing my scripting language does before rendering the HTML is call a standard function to check for the various tracking cookies. If one or more tracking code cookies have a value, the page logs the tracking codes and their values in such a way that the web traffic analyzer software can reassociate them with the requested URL. Then the function may or may not clear the tracking cookie (depending on whether it needed to live for a single click or longer for certain types of tracking codes).
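In Node/Express terms (again just for illustration, since the real site used a different server-side language; logTrackingCode() and the cookie list are hypothetical stand-ins), that per-page check might look something like this:

const express = require('express');
const cookieParser = require('cookie-parser');
const app = express();
app.use(cookieParser());

const TRACKING_COOKIES = ['placement']; // whichever tracking codes the site uses

function logTrackingCode(url, name, value) {
  console.log('tracking:', url, name, value); // stand-in for the real analytics logging
}

app.use((req, res, next) => {
  for (const name of TRACKING_COOKIES) {
    const value = req.cookies[name];
    if (value !== undefined) {
      // log it so the traffic analyzer can reassociate it with the requested URL
      logTrackingCode(req.path, name, value);
      // single-click codes get cleared; longer-lived codes could be left alone
      res.clearCookie(name, { path: '/' });
    }
  }
  next();
});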
I'm not sure exactly what percentage of consumers have JavaScript or cookies disabled, but if it's 5% then I am now able to track clicks by link placement 95% of the time. Sampling 95% of clicks is more than enough to get a very accurate idea of which placement has the best CTR and what proportion of clicks each of the links receives.
Now when other webmasters copy my URLs out of their browser to create a hyperlink back to one of my pages, they always get a URL that is free of query string parameters, so this solution eliminated both the split link equity problem and the reporting inaccuracies. In the above example my product page would now get credit for all 175 backlinks.
AND for this type of internal tracking, I don't have to include a 301 redirect to eliminate the tracking code from the URL...
Sometimes it's impossible to avoid using query string parameters to pass tracking codes and their values. Query string parameters are appropriate when you want to pass a tracking code from an external site to a page on your site when a consumer clicks on a link on the external site. They are also appropriate if you have to be 100% accurate in counting clicks.
The site I inherited had plenty of these external tracking codes as well. I modified the scripting code described above (the one that handles the internal tracking codes) so that it also handles tracking codes passed in from external sites as query string parameters. It first looks to see if the requested URL has tracking codes in the query string. If so, it creates a cookie for each of the trackingcode:value pairs, strips them out of the URL, and then 301 redirects to the clean URL. If no tracking codes are found in the URL (which will be the case after they've been logged, stripped from the URL, and redirected to the clean URL), the code checks for tracking cookies. If found, they are logged so the web traffic analyzer software can reassociate them with the URL for reporting, and the cookies are possibly cleared as described above.
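Sketched the same way (continuing the Express app from the sketch above, purely for illustration; TRACKING_PARAMS is a hypothetical list of the query string parameters treated as tracking codes), the external-code handling might look like:

const TRACKING_PARAMS = ['Campaign', 'ref']; // hypothetical list of tracking parameters

app.use((req, res, next) => {
  const found = TRACKING_PARAMS.filter((p) => req.query[p] !== undefined);
  if (found.length > 0) {
    // create a cookie for each trackingcode:value pair
    for (const p of found) {
      res.cookie(p, String(req.query[p]), { path: '/' });
    }
    // strip the tracking parameters but keep any legitimate ones
    const remaining = new URLSearchParams();
    for (const [key, val] of Object.entries(req.query)) {
      if (!found.includes(key)) remaining.append(key, String(val));
    }
    const qs = remaining.toString();
    // 301 redirect to the clean URL
    return res.redirect(301, req.path + (qs ? '?' + qs : ''));
  }
  next(); // clean URL: fall through to the cookie check described earlier
});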
Sorry it's so long, but based on your questions I thought you might want a little detail... ;)
Could I ask for your advice?
I have a site with many, many affiliate URLs,
for example:
www.mydomain.com/?Campaign=abcde
www.mydomain.com/?Campaign=feghi
www.mydomain.com/?Campaign=xyz
I want to maximise the benefit of all the links. Is there any benefit in 301 redirecting the URLs with the tracking code to the clean URL?
Or should I use the hash tag approach, e.g. the following? (I've put a rough sketch of the client-side part after the examples.)
www.mydomain.com/#Campaign=abcde
www.mydomain.com/#Campaign=feghi
www.mydomain.com/#Campaign=xyz
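As I understand it, the fragment is never sent to the server and search engines ignore it, so the tracking value would have to be read client-side, something like this untested sketch:

// read the campaign code out of the fragment, e.g. #Campaign=abcde
var match = window.location.hash.match(/^#Campaign=(.+)$/);
if (match) {
  // stash it in a cookie (or pass it to the analytics tool) since the server never sees it
  document.cookie = 'Campaign=' + encodeURIComponent(match[1]) + '; path=/';
}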
Has anyone actually done either of the above? Did it work?
This article says that Google takes tracking parameters in URLs into account: [googlewebmastercentral.blogspot.com...]
and when I search Google with info:www.mydomain.com/?Campaign=xyz, Google lists the clean domain name www.mydomain.com.
So, I'd love to hear from you if you've actually addressed this issue for affiliate links with tracking parameters in the URL AND seen a positive benefit.
[webmasterworld.com...]
The question is whether Google sees each URL with a unique tracking parameter on an external site as a unique URL.
Example: links from external websites:
www.mydomain.com/?Campaign=abcde
www.mydomain.com/?Campaign=feghi
www.mydomain.com/?Campaign=xyz
Are all of these unique as far as Google is concerned?
Has anyone tested the effect of using hash tags or 301 redirects to optimise external link juice?
canonical tag is only for use on pages within a single site
That's definitely true, but it doesn't change my comment. If the TARGET site uses the canonical tag, then no matter what extra parameters are on the LINKING site, the juice that those duplicate URLs generate should be combined in Google's back end.
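For anyone following along, the tag itself is just one line in the <head> of the target page (using www.mydomain.com from the examples above):

<link rel="canonical" href="http://www.mydomain.com/" />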
Yes, historically those URLs are considered to be different by search engines - they certainly are different technically, and they can serve different content. Google has been working to combine them, but the algo may not always get it right. Giving them explicit instructions with the canonical tag takes the guesswork out of it.
All my URLs from 3rd party sites link to the home page - each contains a tracking parameter, and each link shows the same content.
So you're saying that I should add the canonical tag to my home page, as this will make sure the search engines recognise this as the main page and so pass on all the link juice?