I searched the archives but didn't find info on this.
Let's say a content site called widgets has a partner/affiliate program, which many large portals use.
They are given the subdomain portal1.widget.com.
The actual content of the site is 99% the same as portal2.widget.com (just a different co-branded top, built with a table).
Because they are linked directly from the portals' main pages and other high-PR sites, they are given a high ranking.
In addition, the sites have only been listed by their URLs for the past few months.
I don't think this is spam, but I can't figure out how to classify it.
IMHO, for the end user, having three of these sites in the top 10 for a major keyword isn't of much value.
Shouldn't the Google algorithm be able to handle them as duplicates?
A site with about 100 affiliates, each with a different subdomain. Each subdomain has a different wrapper (with navigation back to the partner), but the content of every page is identical.
So they created 100 different URLs (partner1.xyz.com, partner2.xyz.com, etc.).
Each subdomain has about 15,000 identical pages indexed in Google.
When you do certain keyword searches on Google, they take up the vast majority of the first 100 results.
Shouldn't Google collapse all these different subdomains into one result?
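For what it's worth, collapsing these subdomains is conceptually simple: strip the per-partner wrapper, fingerprint the remaining shared content, and group URLs whose fingerprints match. The sketch below is a naive illustration of that idea, not Google's actual method (a real engine would use shingling or similar fuzzy matching); the `<div id="content">` markup is a made-up assumption for the example.

```python
import hashlib
import re

def fingerprint(html: str) -> str:
    """Fingerprint a page by its shared body, ignoring the co-branded wrapper.

    Hypothetical markup assumption: the identical content sits inside a
    <div id="content">...</div> block; everything outside it is the
    per-partner navigation and branding.
    """
    match = re.search(r'<div id="content">(.*?)</div>', html, re.S)
    body = match.group(1) if match else html
    # Normalize whitespace so cosmetic differences don't change the hash.
    body = re.sub(r"\s+", " ", body).strip()
    return hashlib.sha256(body.encode("utf-8")).hexdigest()

def collapse_duplicates(pages: dict[str, str]) -> dict[str, list[str]]:
    """Group URLs whose content fingerprints match; each group could be
    shown as a single search result instead of 100 near-identical ones."""
    groups: dict[str, list[str]] = {}
    for url, html in pages.items():
        groups.setdefault(fingerprint(html), []).append(url)
    return groups

# Toy crawl of three partner subdomains (made-up URLs and markup):
pages = {
    "partner1.xyz.com/widgets": '<nav>Partner 1</nav><div id="content">Widget specs</div>',
    "partner2.xyz.com/widgets": '<nav>Partner 2</nav><div id="content">Widget  specs</div>',
    "partner3.xyz.com/other":   '<nav>Partner 3</nav><div id="content">Other page</div>',
}
groups = collapse_duplicates(pages)
# partner1 and partner2 end up in the same group despite different wrappers.
```

Exact hashing only catches pages that are identical after normalization; the co-branded sites described above, where only the wrapper differs, are exactly that easy case.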
I noticed a similar problem a few months ago. A Dutch-language, directory-like site with about 150 subdomains showed up in the first 150 search results for a huge number of Dutch-language search terms.
The problem looked more like:
bluewidget.dutchportalXYZ.nl
redwidget.dutchportalXYZ.nl
purplewidget.dutchportalXYZ.nl
bluethingy.dutchportalXYZ.nl
redthingy.dutchportalXYZ.nl
purplethingy.dutchportalXYZ.nl
etc.
I sent an email to Google to complain about this (it was very, very annoying when looking for Dutch-language sites). They didn't reply to my email, but the problem was solved quite quickly. I don't know whether they took action themselves or their software did it automatically.
You can send them an email, but I guess just waiting two weeks or so won't hurt.
Bas