
Google News Archive Forum

Duplicate content penalty, what can I do?
Jeka
msg:66040
5:00 pm on Oct 9, 2004 (gmt 0)

Hi! A few months ago I put up an index.html file linking to 50 of my domains, and I was dumb enough to put that file up on all 50 of them. I wasn't using them, so I figured they wouldn't get spidered anyway, but I was wrong: I had a PR 5 domain in there, and now all the domains are PR 0.

Could it be that my whole server IP got banned / penalized?
A few days ago I moved a PR 4 domain to my server, and now, two days after the DNS change, it's PR 3. Is that because of the duplicate content penalty or because of the Google dance going on right now?

I would really appreciate some help, thanks.

ThomasB
msg:66041
7:56 am on Oct 12, 2004 (gmt 0)

Jeka, first of all, welcome to WebmasterWorld!

Avoiding the dupe content filters is easiest by adding unique content to every individual site. A few sentences should be enough.

The PR drop is unlikely to have been caused by the move from another server. I'd say it's just the normal PR update that took place last week, nothing else.

I'd avoid complete crosslinking of the domains; instead, link them in a circle, with 5-10 links on every domain. That way it's pretty hard to detect and the sites don't look too spammy. :)
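
As an illustration of the circle idea: each domain links only to the next few domains in a fixed circular order, instead of to all 49 others. A minimal sketch in Python, assuming 50 domains and 5 links per site (the domain names are placeholders, not anything from this thread):

    # Sketch: a ring-style link plan instead of full crosslinking.
    # Domain names and LINKS_PER_SITE are hypothetical placeholders.
    LINKS_PER_SITE = 5
    domains = [f"site{i}.example.com" for i in range(50)]

    link_plan = {}
    for i, domain in enumerate(domains):
        # Each domain links only to the next 5 domains in the circle,
        # wrapping around at the end of the list.
        targets = [domains[(i + k) % len(domains)]
                   for k in range(1, LINKS_PER_SITE + 1)]
        link_plan[domain] = targets

    print(link_plan["site0.example.com"])
    # ['site1.example.com', 'site2.example.com', ..., 'site5.example.com']

The point of the ring is that no single page carries links to the whole network, so no one page exposes the full set of 50 domains at once.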

sem4u
msg:66042
8:01 am on Oct 12, 2004 (gmt 0)

It's the crosslinking of 50 domains that caused the problem. 50 domains! :o

getvisibleuk
msg:66043
9:35 am on Oct 12, 2004 (gmt 0)

>Avoiding the dupe content filters is easiest by adding unique content to every individual site. A few sentences should be enough.

It won't be enough.

MHes
msg:66044
9:48 am on Oct 12, 2004 (gmt 0)

>...A few sentences should be enough.
>It won't be enough.

I agree, it won't be enough. Only those few sentences will be taken as unique; the rest will probably be ignored for ranking purposes. Google seems to have stepped up the duplication filter, and even if you avoid the page going PR 0, you will not rank well.

prairie
msg:66045
11:23 am on Oct 12, 2004 (gmt 0)

If duplicate content is ignored in ranking, how does Google determine which version to display? ...

MHes
msg:66046
11:38 am on Oct 12, 2004 (gmt 0)

>If duplicate content is ignored in ranking, how does Google determine which version to display?

I've been trying to figure this out for months; the conclusion so far is that the choice is made at run time and depends on supporting occurrences of the search phrase and the relevant PR. My theory is that all text is counted for an initial search; then all duplicate pages with duplicate text surrounding the relevant words are clumped together. Beyond that surrounding text, perhaps 4 or 5 words either side, overall word frequency is another factor, as are the sites' IP addresses. Once this clump of sites is established, perhaps PR, inbound anchor text, or many other factors determine which site is chosen. IP is probably a big factor, with Google taking only one site from each IP.

In short, if we knew how to get our duplicate content shown above everyone else's, it would be time to order the new Ferrari :)
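
MHes's notion of clumping pages by "duplicate text surrounding the relevant words" is essentially shingling: splitting each page into overlapping word windows and measuring the overlap between pages. A minimal sketch of that comparison, assuming a 5-word window and an arbitrary threshold (nothing here is Google's confirmed algorithm, just an illustration of the theory):

    # Sketch: near-duplicate "clumping" via overlapping word windows (shingles).
    # Window size and threshold are illustrative guesses, not Google's values.
    def shingles(text, size=5):
        """Return the set of overlapping `size`-word windows in `text`."""
        words = text.lower().split()
        return {tuple(words[i:i + size])
                for i in range(max(len(words) - size + 1, 0))}

    def similarity(page_a, page_b, size=5):
        """Jaccard similarity of the two pages' shingle sets."""
        a, b = shingles(page_a, size), shingles(page_b, size)
        return len(a & b) / len(a | b) if (a or b) else 0.0

    page_a = "widgets for sale here, the best blue widgets on the whole wide web"
    page_b = "widgets for sale here, the best blue widgets at unbeatable prices"
    # A high score would put the two pages in the same "clump", after
    # which only one representative gets shown in the results.
    print(similarity(page_a, page_b))

Under this model, a few unique sentences only help if they break up enough of the shared windows to pull the page's similarity below whatever the threshold is, which is why opinions above differ on whether "a few sentences" is enough.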

RobBroekhuis
msg:66047
11:46 am on Oct 12, 2004 (gmt 0)

Where can I read more about the duplication filter? For some months now, Google has not been returning my dynamic, database-driven pages. They all have lots of unique content, yet share a common layout, headers, etc. I'm not sure if the drop from the index is because of perceived duplicate content or because of the URL structure, a simple one-parameter query string (#*$!x.php?plantid=yyy). I'm tackling the second problem with a workaround that gives each page its own name (ccccc.php), but if Google's gripe is with duplicate content, that won't help.
Rob
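
A minimal sketch of the workaround Rob describes: mapping each database record to its own static-looking page name instead of serving everything through one script with a ?plantid= query string. The script name, record data, and slug scheme below are hypothetical, not Rob's actual setup:

    # Sketch: one page name per database record instead of ?plantid=N.
    # Table contents and the slug scheme are hypothetical.
    import re

    def slugify(name):
        """'Common Foxglove' -> 'common-foxglove.php'"""
        slug = re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")
        return slug + ".php"

    plants = [(1, "Common Foxglove"), (2, "Wood Anemone")]
    for plant_id, name in plants:
        old_url = f"/plants.php?plantid={plant_id}"
        new_url = "/" + slugify(name)
        print(f"{old_url} -> {new_url}")

As Rob notes, this only addresses the URL-structure concern; if the filter is triggered by the shared layout and boilerplate across pages, renaming the URLs changes nothing about the content itself.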

ThomasB
msg:66048
1:50 pm on Oct 12, 2004 (gmt 0)

>...A few sentences should be enough.
>It won't be enough.

Considering that you limit the links to 5, and that you have 5 sentences of unique content (at least 8 words each) describing what the domain will be about, it should be OK imho.

Jeka
msg:66049
4:58 pm on Oct 12, 2004 (gmt 0)

Thanks for all the answers :) But does anybody know what would be best to do now? I want them off the "duplicate content penalty list" as fast as possible.

Should I start linking to them from other sites, or won't that help?
Does anybody know how long a duplicate content penalty normally lasts?

jk3210
msg:66050
2:23 am on Oct 13, 2004 (gmt 0)

>but does anybody know what would be best to do now?

Remove all the cross-linking and any duplicate content from ALL pages. Then wait 30 days from the time Google next spiders those pages; your PR will be back.

Chad
msg:66051
8:16 am on Oct 13, 2004 (gmt 0)

This sounds to me like a cross-linking penalty, which, in my experience, is the kiss of death for all sites involved.

I had a dozen PR 7 sites receive this penalty. If you remove the cross-links and then write several groveling letters to Google, they will switch the sites' GoogleBar PR back on, just to make you think something was done, but the real penalty will remain unchanged. Googlebot will continue to ignore your sites forevermore.

My overly bitter conclusion ;) is that Google is evil and never forgives cross-linking, so don't risk it. And now that you've made this mistake, be ready to toss all 50 domains.
