Forum Moderators: Robert Charlton & goodroi
I have a situation where a site of mine is, I believe, being penalized for duplicate content because I have offered the site up for affiliates to promote.
I am using an affiliate service, and my main site, let's call it:
/widget.htm
is being promoted by this service's affiliates under URLs such as:
/widget.htm?hop=abc
/widget.htm?hop=def
/widget.htm?hop=ghi
etc.
where "abc", "def", and "ghi" are hopcodes that the affiliate service assigns to individual affiliates so they can promote my product.
Now, I notice in my weblogs that Googlebot has indexed all of these "hopcode" links:
/widget.htm?hop=abc
/widget.htm?hop=def
/widget.htm?hop=ghi
as well as my main page:
/widget.htm
These hopcode links are obviously being posted on other websites, and Google is treating each of these URL strings as a separate page of my site.
As a result, it appears that my main /widget.htm page has been hit with a duplicate content penalty.
How can I continue to let affiliates work for me with this hopcode architecture without my main page falling into the duplicate content bucket?
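One approach I've been considering (just a sketch in Python, assuming I can add server-side handling; the function name and the cookie idea are my own, not anything from the affiliate service) is to capture the hop value and then 301-redirect every hopcode URL back to the bare page, so only /widget.htm would ever be indexable:

```python
# Sketch: split a hopcode URL into (canonical URL, hop value). The hop value
# could then be stored in a cookie before issuing a 301 redirect to the
# canonical URL, so affiliate tracking survives but crawlers only ever
# index /widget.htm. Purely illustrative, not the affiliate service's API.
from urllib.parse import urlsplit, parse_qs, urlencode, urlunsplit

def canonicalize(url):
    parts = urlsplit(url)
    params = parse_qs(parts.query)
    hop = params.pop("hop", [None])[0]  # capture the affiliate hopcode, if any
    # Rebuild the URL without the hop parameter (other params are kept)
    query = urlencode({k: v[0] for k, v in params.items()})
    canonical = urlunsplit((parts.scheme, parts.netloc, parts.path, query, ""))
    return canonical, hop

print(canonicalize("/widget.htm?hop=abc"))  # -> ('/widget.htm', 'abc')
print(canonicalize("/widget.htm"))          # -> ('/widget.htm', None)
```

The thought is that the redirect would consolidate everything onto the one canonical URL while the cookie preserves which affiliate sent the visitor. I'd welcome input on whether a 301 like this would break the affiliate service's tracking.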
[edited by: Robert_Charlton at 1:11 am (utc) on May 19, 2008]
[edit reason] changed to widget [/edit]