Forum Moderators: Robert Charlton & goodroi



google seems to approve of duplicate data

         

mattlamb

5:25 am on Dec 28, 2005 (gmt 0)

10+ Year Member



One of my competitors uses an affiliate program that provides the affiliates with a complete script that all but duplicates their website on a new/different domain name.

Can anyone offer any insight as to why/how they get past the Google spam filter?

Brett_Tabke

1:47 pm on Dec 28, 2005 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



What spam filter? Why would that be a problem at all?

mycutegoddess

9:15 am on Dec 29, 2005 (gmt 0)

10+ Year Member



It could be that what mattlamb is describing is a duplicate filter...

reseller

9:25 am on Dec 29, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



mattlamb

Did you mean that the other "duplicate" affiliate sites' pages wouldn't rank in the Google SERPs, because they are duplicates of the affiliate merchant's site?

jtara

1:58 am on Jan 5, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Did you mean that the other "duplicate" affiliate sites' pages wouldn't rank in the Google SERPs, because they are duplicates of the affiliate merchant's site?

I'm sure that is what he means.

Google doesn't like duplicate content. In particular, they don't like complete copies of merchant sites, like that big bookseller which gives this practice its blessing.

Google used to index these copycat sites, but I think early last year they made a change. Lots of screaming from affiliates who previously had nothing but a copy of the mothership and got traffic from Google searches.

Now they have to create a "content site". It seems OK to have the clone site attached to the back end of your website. It's a particularly good idea to use nofollow or to hide the links in JavaScript. (I'd suggest the former; the latter looks too sneaky, and Google might eventually penalize you for it.)
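To make the nofollow suggestion concrete: here's a minimal sketch of checking which outbound links into a cloned section still lack rel="nofollow", using only Python's standard library. The `NofollowChecker` helper, the `/shop/` prefix, and the sample HTML are my own illustration, not anything from the thread.

```python
# Hypothetical helper (my own illustration): find <a> tags pointing
# into a cloned/affiliate section that are missing rel="nofollow".
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collects hrefs under a given prefix that lack rel="nofollow"."""

    def __init__(self, clone_prefix: str):
        super().__init__()
        self.clone_prefix = clone_prefix
        self.missing = []  # links that still pass link credit

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        d = dict(attrs)
        href = d.get("href", "")
        rel = d.get("rel") or ""
        if href.startswith(self.clone_prefix) and "nofollow" not in rel:
            self.missing.append(href)

html = ('<a href="/shop/widget1">w1</a> '
        '<a rel="nofollow" href="/shop/widget2">w2</a>')
checker = NofollowChecker("/shop/")
checker.feed(html)
print(checker.missing)  # -> ['/shop/widget1']
```

Run against your templates, anything it reports is a link into the clone that search engines would still follow.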

To clarify, Google wants to index your unique content. They just don't want to index 50,000 copies of that big bookseller's website.

As to how one particular implementation seemingly gets around the ban: dunno. Maybe they rearrange the order of the data. One obvious technique would be to select reviews only from the "n"th page of reviews, which Google might not have indexed from the mothership.
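One way to picture why reshuffled or swapped-out reviews might slip past a duplicate filter: a simple shingle-overlap comparison. This is purely my own illustration of the general idea, not Google's actual algorithm.

```python
# Toy duplicate check (illustrative only, NOT Google's algorithm):
# compare two pages by the overlap of their word trigrams ("shingles").
def shingles(text: str, k: int = 3) -> set:
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity of the two pages' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "great widget fast shipping would buy again from this store"
clone    = "great widget fast shipping would buy again from this store"
shuffled = "would buy again fast shipping great widget from this store"

print(similarity(original, clone))     # identical pages score 1.0
print(similarity(original, shuffled))  # reordering lowers the score
```

A verbatim copy scores 1.0, while merely reordering the same sentences already lowers the measured overlap, which is why rearranged data can look "different enough" to a naive filter.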

But do you want to take the chance of having your site banned in the future by doing this?

It's not nice to fool Mother Google!

mycutegoddess

2:16 am on Jan 5, 2006 (gmt 0)

10+ Year Member



I don't know why Google still indexes affiliate links that have no content anymore; they just redirect with a 302 and send the user to our real URL. If I replace the 302 redirect with a 301, will Google accept it and index our own URLs?
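For what it's worth, the difference between the two redirect codes can be sketched with the standard library. The `affiliate_redirect` helper and the example URL below are my own illustration; whether Google transfers indexing on a 301 is the question being asked, not something this code proves.

```python
# Minimal sketch of the two redirect responses being discussed.
# 302 (Found): temporary -- the redirecting URL may stay in the index.
# 301 (Moved Permanently): signals the target URL is the one to index.
from http import HTTPStatus

def affiliate_redirect(target_url: str, permanent: bool) -> tuple:
    """Return (status code, headers) for an affiliate-link redirect."""
    status = HTTPStatus.MOVED_PERMANENTLY if permanent else HTTPStatus.FOUND
    return status.value, {"Location": target_url}

print(affiliate_redirect("https://www.example.com/widget", permanent=False))
# -> (302, {'Location': 'https://www.example.com/widget'})
print(affiliate_redirect("https://www.example.com/widget", permanent=True))
# -> (301, {'Location': 'https://www.example.com/widget'})
```

The only change between the two is the status code; everything else about the response stays the same.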

Any suggestions or solutions would be appreciated...

Sobriquet

2:40 am on Jan 5, 2006 (gmt 0)

10+ Year Member



I guess Google is aware of this and imposes huge penalties.

I have a very successful site (about 2000 pages), about five years old, and last year I joined a major affiliate program, adding thousands of pages and products with reviews etc. It soared in the search results for about two months and I was happy. Then my whole site was dumped, including my original 2000 pages. My traffic died and my profits were killed too.

I removed the affiliate content and it took another three months for my original site to be back on page one again. Now it is returning to normal search results and traffic. The downside is that I lost my PR6 for something much lower.

Google penalties for duplicate affiliate content are heavy in the long run.