Forum Moderators: Robert Charlton & goodroi

Duplicate Content Error - How fast and how hard will Google penalise?

Whitey

2:58 am on Aug 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



If you have a duplicate content error, caused by an admin breakdown or something, how fast will G pick up on it and what are the consequences?

Has anything changed or been noticed since this post?

http://www.webmasterworld.com/forum30/31430.htm

The fact that even within a single site, when pages are deemed too similar, G is not throwing out the dups - they're throwing out ALL the similar pages.

The result of this miscalculation is that high quality pages from leading/authoritative sites, some that also act as hubs, are lost in the SERP's. In most cases, these pages are not actually penalized or pushed into the Supplemental index. They are simply dampened so badly that they no longer appear anywhere in the SERP's.



Our problem is that we have just discovered some erroneous pages creeping into G's SERPs which are duplicates from within our own site. Those pages are being cached and are displacing the good ones with PR.

Although we don't know if it's connected, our site: query shows the number of indexed pages dropping from 120k to 83k overnight.

All of this occurred in the last 5 days and we're moving as fast as we can to find out how this occurred and how to rectify it.

The pages in question currently return a 200 server response, which means that Google sees them as "real" pages and will potentially treat them as duplicates of the /Widget/ pages. We are asking our webmaster to ensure that these pages return a 404 code.
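
To illustrate what we're asking the webmaster for, here's a minimal sketch in Python (standard library only) of the desired behaviour. The /old-dupes/ prefix is a made-up placeholder - on a real site you'd match whatever pattern identifies the erroneous URLs:

from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical URL prefix covering the erroneous duplicate pages.
RETIRED_PREFIXES = ("/old-dupes/",)

class DedupeHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith(RETIRED_PREFIXES):
            # Tell crawlers the page is gone, so Google can drop it
            # rather than index it as a duplicate of the /Widget/ pages.
            self.send_response(404)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>404 Not Found</h1>")
        else:
            # Legitimate pages keep returning 200 as normal.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>Real page</h1>")

if __name__ == "__main__":
    HTTPServer(("", 8000), DedupeHandler).serve_forever()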

My question is how fast and how hard is Google going to jump on us?

Is it immediate, and do we know how long the penalty will last?

Will Google respond to a re-inclusion request? Matt Cutts's recent video seemed to indicate that for "hard luck" stories the Sitemaps team does not have enough resources to respond to such requests.

That's a pity, because the average webmaster is sometimes going to be confronted with errors and circumstances beyond their control.

Quadrille

9:54 pm on Aug 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There is no 'penalty' for duplicate pages; Google simply (and sensibly) will list one, and the others will be ignored or relegated to 'supplemental' results.

The solution is simply to remove all duplicates - then the problem disappears.

tiori

9:59 pm on Aug 2, 2006 (gmt 0)

10+ Year Member



The solution is simply to remove all duplicates - then the problem disappears.

Sounds so easy. But in reality it isn't. I've ensured that I don't have duplicate content on several sites for almost two years and they are still supplemental. So go figure!

Bewenched

10:32 pm on Aug 2, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I wonder if it would be worth it, as an ecommerce site, not to show the full description on the item page. Maybe make the user click yet another link to show more details or something.

We have a lot of items that all have the same basic description, but may have small differences in them... kinda like different sizes.

It's almost like we are punishing the customer by not giving them all the available information on a product, but then again, without the search engine bringing in the customers, what good is the site?

Whitey

6:32 am on Aug 3, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Quadrille - Maybe I should have said: how quickly are we affected, and how long before we reappear, since Big Daddy?

http://www.webmasterworld.com/forum30/34365.htm

g1smd wrote: As long as the old site says "nothing here" to bots and the new site says "index this" then Google will eventually fix it.

You might be looking at 6 months though (hard to tell because we have no clue what has changed with the new BigDaddy infrastructure, so I base my comments on how the "old google" used to behave).

[edited by: Whitey at 6:32 am (utc) on Aug. 3, 2006]

Quadrille

8:23 am on Aug 3, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If the problem is duplication, then removing the duplicates will fix it - probably after the next 'cycle' of spidering; for most people, a couple of weeks, tops.

If that doesn't fix it, then it wasn't the problem (or was only part of the problem).

The next step would be to check the www/non-www issue, ensure meta tags are appropriate and unique, and so on.
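
For the www/non-www issue, the usual fix is a site-wide 301 redirect to a single canonical hostname. Here's a minimal sketch in Python, where "www.example.com" is just a stand-in for your real canonical host:

from http.server import BaseHTTPRequestHandler, HTTPServer

CANONICAL_HOST = "www.example.com"  # stand-in for your real canonical host

class CanonicalHostHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        host = self.headers.get("Host", "")
        if host and host != CANONICAL_HOST:
            # 301 = permanent redirect; consolidates the www and non-www
            # versions so Google only indexes one copy of each URL.
            self.send_response(301)
            self.send_header("Location", "http://%s%s" % (CANONICAL_HOST, self.path))
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>Canonical page</h1>")

if __name__ == "__main__":
    HTTPServer(("", 8000), CanonicalHostHandler).serve_forever()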

But also look at page content in the context of the whole code; if one page has one para on blue widgets, and another has one para on red widgets - in identical pages loaded with shared navigation, promos, etc., etc., then it will still look like duplicate content to Google.
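
To make that concrete, here's a toy similarity check of my own invention (Google's actual algorithm isn't public): break each page's full text, shared template included, into overlapping word shingles and take the Jaccard overlap. The "navitem" filler is an invented stand-in for shared navigation and promo text:

def shingles(text, k=4):
    """Return the set of k-word shingles (overlapping word windows)."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity: intersection size over union size."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# 100 invented filler words standing in for the navigation, promos
# and footers that both pages share.
BOILERPLATE = " ".join("navitem%d" % i for i in range(100))

blue_page = BOILERPLATE + " our blue widgets are sturdy and ship in one day"
red_page = BOILERPLATE + " our red widgets are shiny and ship in two days"

print("similarity: %.2f" % jaccard(shingles(blue_page), shingles(red_page)))
# Prints "similarity: 0.84" - the shared template swamps the one
# unique para on each page, so the pages look mostly like duplicates.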

I have heard of cases where it takes several months; but I strongly suspect there are more complex problems in those cases, such as duplication between sites.

If the duplication is within your site, and you have good navigation and regular spidering, and you fix the problem, it shouldn't take so long.

I'd recheck for other problems.

[edited by: Quadrille at 8:25 am (utc) on Aug. 3, 2006]

tiori

11:26 am on Aug 3, 2006 (gmt 0)

10+ Year Member



But also look at page content in the context of the whole code; if one page has one para on blue widgets, and another has one para on red widgets - in identical pages loaded with shared navigation, promos, etc., etc., then it will still look like duplicate content to Google.

If what you are saying is correct, that would affect thousands upon thousands of sites. Sites built on templates, with shared navigation and a common look and feel for the user, are used all over the place in all industries.

If Google is using that criterion to determine duplicate content, then very, very many websites are doomed.

mattg3

1:58 pm on Aug 3, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think the myth that Google selects for "user experience" pretty much died with the link posted in this thread.

[webmasterworld.com...]

If the site mentioned in the AdSense blog is a "user experience", well ...

You have to actually go to the site to, well, experience it ...

Quadrille

4:31 pm on Aug 3, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If what you are saying is correct, that would affect thousands upon thousands of sites. Sites built on templates, with shared navigation and a common look and feel for the user, are used all over the place in all industries. If Google is using that criterion to determine duplicate content, then very, very many websites are doomed.

It does affect many thousands of sites; I wouldn't say they are doomed - but many, many thousands of sites do suffer supplemental listings, especially template/database sites where the 'unique content' is very small.

A site that makes a user visit ten pages to see one para on each size of a red widget doesn't get the same listing as a site that lists 10 paras on one page, one for each widget size.

It's not just an SEO issue; it's an issue about providing a better website. Me, I'd never publish a page with just one unique para - let alone just a few words - but I see it all the time.

I don't like it - and nor does Google.