| 6:09 am on Oct 24, 2006 (gmt 0)|
Anybody...?
| 8:24 am on Oct 24, 2006 (gmt 0)|
Is this case too hard for anyone to reply to?
| 8:29 am on Oct 24, 2006 (gmt 0)|
Sounds like you think you may have spam to report -- but that's not the function of our forums. Everything you mention sounds like a method that is perhaps not what Google wants to see working, but they are the only ones who can be the judge of that. So you should communicate directly with Google on this.
For spam reporting, use this form on the Google website: Google: Search Quality and Your Feedback [google.com]. Be precise but brief in your note to Google. Report both the actual search keywords and explain the problem you see.
| 9:08 am on Oct 24, 2006 (gmt 0)|
I have noticed that busy sites with dupe content do not go supplemental. If you have gained a degree of trust, then you can do what you want.
| 11:34 am on Oct 24, 2006 (gmt 0)|
thanks to both of you for the reply.
MY concern is not to report anybody but just curious about to know how this bull**** content is doing excellent job over the engine. And all sub domains r having excellent rankings beating high competition. Bcoz as far as the duplicate content is concern, google is not leaving behind anybody but still... strange.
Degree of the trust might be a case for not supplemental results. But since this website is born, they have the same content. and degree of trust would have come later on. Now the thing is, how google leave this website in initial stages when they all sub domain were child.
anyways.. thanks for your precious time an comment..
| 4:05 pm on Oct 24, 2006 (gmt 0)|
Google will probably get around to this dupe content. Sometimes I think these kinds of housekeeping activities occur when there's surplus capacity to deal with them.
But I may be wrong.
| 6:41 am on Oct 25, 2006 (gmt 0)|
Each subdomain has approximately 200 backlinks listed in Google. Maybe this is the case: links have built the degree of trust ;-) So the conclusion is that links are the only thing that is going to help...
Not good at all.
| 5:24 pm on Oct 25, 2006 (gmt 0)|
I've had one site severely hit by duplicate-content penalties because of the content management system it uses. I lost all my rankings, and Google indexed all of the duplicates.
My second site, which is an "authority" in its field, runs the same content management script, but it never lost its rankings for very competitive keywords.
So yes, there is a sort of filter that does not hurt "trusted" sites (at least at the moment) over duplicate content issues. For what it's worth, I am trying to de-index all the duplicates from both sites.
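For readers wondering how de-indexing duplicates is typically done: one common approach (in this era) is blocking the duplicate URL patterns in robots.txt. A minimal sketch, assuming hypothetical duplicate paths like a "print view" copy and session-ID URLs; the actual paths depend on the CMS in question. Python's standard robotparser can verify the rules do what you intend before you deploy them:

```python
# Hedged sketch: de-indexing duplicate URL patterns via robots.txt.
# The Disallow paths below are hypothetical examples, not from the thread.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /print/
Disallow: /index.php?sessionid=
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The duplicate "print view" copy is blocked from crawling...
print(rp.can_fetch("*", "http://www.example.com/print/article-1.html"))  # False
# ...while the canonical page stays crawlable.
print(rp.can_fetch("*", "http://www.example.com/article-1.html"))  # True
```

An alternative with the same effect per page is a `<meta name="robots" content="noindex">` tag in the duplicate templates, which lets the page be crawled but keeps it out of the index.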
| 7:49 pm on Oct 25, 2006 (gmt 0)|
|a sort of filter which does not hurt "trusted" sites |
My experience is that it *does not hurt "trusted" sites AS MUCH* -- but fixing duplicate issues and such still helps the URLs perform a lot better in the SERPs.
The high PR (or trust or whatever combination of factors plays in here) may help the URLs not to be classed as Supplemental so easily, but they still may be "omitted results" or ranked lower than they could be -- because different versions of a URL are siphoning off Page Rank, for example.
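The "siphoning" point above can be made concrete: if inbound links point at several variants of the same page (www vs. non-www, trailing `index.html`), each variant accumulates only part of the link equity. A minimal sketch, assuming hypothetical URLs, of how normalizing the variants onto one canonical form consolidates the count:

```python
# Hedged sketch: several URL variants of one page split the backlink
# count; normalizing them to one form consolidates it.
# Hostnames and paths are hypothetical examples.
from collections import Counter
from urllib.parse import urlsplit

def normalize(url):
    """Collapse common duplicate variants onto one canonical URL."""
    parts = urlsplit(url.lower())
    host = parts.netloc.removeprefix("www.")   # www and bare host -> one form
    path = parts.path or "/"
    if path.endswith("/index.html"):           # /index.html -> directory root
        path = path[: -len("index.html")]
    return f"http://{host}{path}"

backlinks = [
    "http://www.example.com/page/",
    "http://example.com/page/",
    "http://www.example.com/page/index.html",
]

raw = Counter(backlinks)                       # three variants, one link each
merged = Counter(normalize(u) for u in backlinks)
print(max(raw.values()), max(merged.values())) # 1 3
```

In practice the consolidation is done server-side with 301 redirects from the duplicate variants to the canonical URL, so that link equity actually flows to one address rather than being split.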