Forum Moderators: Robert Charlton & goodroi


Duplicate Penalty resulting from Affiliate Programs?


doughayman

12:55 am on May 19, 2008 (gmt 0)

10+ Year Member



Hi all,

I have a situation where a site of mine is, I believe, being penalized for duplicate content as a result of offering up my site for affiliates to promote.

I am using an affiliate service, and my main site, let's call it:

/widget.htm

is being promoted by this service's affiliates under URLs such as:

/widget.htm?hop=abc
/widget.htm?hop=def
/widget.htm?hop=ghi
.
.
etc.

where "abc", "def", and "ghi" are hopcodes provided by the affiliate service to individual affiliates, so that these affiliates can promote my product.

Now, I notice in my weblogs that Googlebot has indexed all of these "hopcode" links:

/widget.htm?hop=abc
/widget.htm?hop=def
/widget.htm?hop=ghi

as well as my main page:

/widget.htm

These hopcode links are obviously being offered up on other websites, and Google is associating those URL strings directly with my website.

It appears that my native /widget.htm has been hit with a duplicate-content penalty as a result.

How can I continue to let affiliates work for me with this hopcode architecture, without my main page falling into the duplicate content bucket?

[edited by: Robert_Charlton at 1:11 am (utc) on May 19, 2008]
[edit reason] changed to widget [/edit]

tedster

1:41 am on May 19, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You can use one of several approaches. A very simple one is a robots.txt rule that disallows all URLs with query strings. That idea and a few more are discussed in this thread: [webmasterworld.com...]
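For the setup described in this thread, that rule could look something like the sketch below. Note that the `*` wildcard is a Google extension to the original robots.txt standard, so other crawlers may not honor it; the /widget.htm path is just the example from this thread:

```
User-agent: Googlebot
# Block any URL containing a query string, e.g. /widget.htm?hop=abc,
# while leaving the bare /widget.htm page crawlable
Disallow: /*?
```

Keep in mind this only stops crawling of the parameterized URLs; links to them may still accumulate elsewhere on the web.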

doughayman

2:01 am on May 19, 2008 (gmt 0)

10+ Year Member



Tedster,

Thanks for that reference... it is quite useful.

The robots.txt statement:

Disallow: /*?

may help this situation for sure. It would make more sense to me, though, written as:

Disallow: /*?*

tedster

2:32 am on May 19, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



No difference, really. Disallow rules mean "disallow any url that begins with...", so a trailing * adds nothing.
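That prefix-matching behavior (plus Google's * wildcard) can be illustrated with a small sketch. The rule_matches helper below is hypothetical, not part of any robots.txt library; it simply mimics how Google documents its pattern matching, and it shows that /*? and /*?* accept and reject exactly the same URLs:

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    """Hypothetical helper mimicking Google-style robots.txt matching:
    '*' matches any run of characters, '$' anchors the end of the URL,
    and otherwise the rule matches as a prefix of the path."""
    pattern = re.escape(rule).replace(r'\*', '.*')
    if pattern.endswith(r'\$'):
        pattern = pattern[:-2] + '$'
    # re.match anchors at the start, giving the prefix-match behavior
    return re.match(pattern, path) is not None

for path in ['/widget.htm', '/widget.htm?hop=abc']:
    # Both rules match the hopcode URL and neither matches the bare page
    print(path, rule_matches('/*?', path), rule_matches('/*?*', path))
```

Because `.*` can match the empty string, the trailing `*` in the second rule changes nothing, which is tedster's point.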