Msg#: 3131361 posted 5:53 am on Oct 23, 2006 (gmt 0)
I have been studying a website with 25 subdomains and 125 link pages on each subdomain. Each subdomain also has 7 other pages of general information, and this same information appears on all 25 subdomains without any change.
All the subdomains have high PR and good rankings as well. Does any rule about a duplicate content penalty still apply? The content may well be copyrighted to them, but that doesn't mean they should be able to publish the same content on 25 subdomains and rank highly.
The site isn't worth anything because there is no actual information, and all the pages just have a search form.
Msg#: 3131361 posted 8:29 am on Oct 24, 2006 (gmt 0)
Sounds like you think you may have spam to report -- but that's not the function of our forums. Everything you mention sounds like a method that is perhaps not what Google wants to see working, but they are the only ones who can be the judge of that. So you should communicate directly with Google on this.
For spam reporting, use this form on the Google website: Google: Search Quality and Your Feedback [google.com]. Be precise but brief in your note to Google. Include the actual search keywords and explain the problem you see.
Msg#: 3131361 posted 11:34 am on Oct 24, 2006 (gmt 0)
Thanks to both of you for the replies.
My concern is not to report anybody; I'm just curious to know how this bull**** content is doing such an excellent job on the engine. All the subdomains have excellent rankings, beating high competition. As far as duplicate content is concerned, Google doesn't usually let anybody off, but still... strange.
Degree of trust might be the reason the results aren't supplemental. But this website has had the same content since it was born, and the degree of trust would only have come later. So the question is: how did Google leave this website alone in its early stages, when all the subdomains were still young?
Anyway, thanks for your precious time and comments.
Msg#: 3131361 posted 5:24 pm on Oct 25, 2006 (gmt 0)
I've had one site severely hit by duplicate content penalties due to the content management system it uses. Google had indexed all of the duplicates, and I lost all my rankings.
My second site, which is an "authority" in its field, runs the same content management script, but that one never lost its rankings for very competitive keywords.
So yes, there is a sort of filter which does not hurt "trusted" sites (at least at the moment) for duplicate content issues. Just to be on the safe side, I am trying to de-index all the duplicates from both sites.
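For anyone wondering what "de-indexing the duplicates" can look like in practice, here is a minimal sketch (one standard approach, not necessarily exactly what I did on my sites): a robots meta tag on each duplicate page, or a robots.txt rule if the duplicates all live under one path. The /duplicate-path/ below is just a placeholder for whatever URL pattern your CMS generates.

<!-- on each duplicate page: keep it out of the index but let links be followed -->
<meta name="robots" content="noindex,follow">

# robots.txt -- stop crawling of a duplicate section (placeholder path)
User-agent: *
Disallow: /duplicate-path/

Keep in mind robots.txt only blocks crawling; for pages that are already indexed, the meta tag (or a removal request) is what actually gets them dropped, and the crawler has to be able to reach the page to see that tag.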
Msg#: 3131361 posted 7:49 pm on Oct 25, 2006 (gmt 0)
a sort of filter which does not hurt "trusted" sites
My experience is that it *does not hurt "trusted" sites AS MUCH* -- but fixing duplicate issues and such still helps the URLs perform a lot better in the SERPs.
The high PR (or trust or whatever combination of factors plays in here) may help the URLs not to be classed as Supplemental so easily, but they still may be "omitted results" or ranked lower than they could be -- because different versions of a URL are siphoning off Page Rank, for example.
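To make the "different versions of a URL" point concrete: a common fix (just a sketch of one standard approach, not something anyone above necessarily used) is a server-side 301 redirect so every variant resolves to a single address. For example, on Apache with mod_rewrite enabled, in .htaccess, with example.com as a placeholder domain:

# Redirect the non-www host to the www host with a 301 (permanent) redirect,
# so PageRank consolidates on one version of each page.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

The same idea applies to index.html vs. the bare directory and to trailing-slash variants: pick one canonical form and 301 everything else to it.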