
Google SEO News and Discussion Forum

    
How to Fix Supplemental due to Duplicate Content
Now what?
Dougy




msg:3200806
 2:25 pm on Dec 27, 2006 (gmt 0)

Hi

I recently discovered that a large proportion of my site (600 out of 850 or so pages) is supplemental. My rankings have been slowly dropping.
After reading around, I have learnt that it is probably due to duplicate content on some of my pages.
What I can't seem to find is the best way to get these duplicate content pages out of the supplemental results.

Do I
A) Rewrite the content on these pages and wait for them to be re-crawled
B) Delete these pages and any internal links to them
C) Use robots.txt to block the spiders accessing these pages
D) Use the blank page method as discussed [webmasterworld.com...]
E) Forget that site and concentrate on another site

Any advice would be much appreciated.

 

ruchirasharma




msg:3201507
 6:44 am on Dec 28, 2006 (gmt 0)

A) Rewrite the content on these pages and wait for them to be re-crawled
- Certainly not; there is very little chance that Google will re-crawl the supplemental pages.

B) Delete these pages and any internal links to them
- Won't help; they will continue to sit in the Google index for quite some time. In fact, 404s are one of the major reasons why pages go supplemental.

C) Use robots.txt to block the spiders accessing these pages
- This is a good idea if none of these pages are ranking for any of your keywords and you can afford to lose them from the index completely (see the robots.txt sketch after this list).

D) Use the blank page method as discussed [webmasterworld.com...]
- Probably the best way; it works well if you want to retain the pages in Google's index.

E) Forget that site and concentrate on another site
- If your site isn't doing great in the Google ranks, why not :-) But then you will have to rewrite all the pages and meta tags for the new site.
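
For illustration, a minimal robots.txt sketch of option C; the directory and file names below are hypothetical placeholders, not paths from this thread:

# robots.txt - keep compliant crawlers away from the duplicate pages
# (the paths are placeholders; substitute your own URLs)
User-agent: *
Disallow: /duplicate-articles/
Disallow: /landing-variant.html

Keep in mind that a robots.txt block only stops crawling; it does not by itself remove URLs that Google has already indexed.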

shogun_ro




msg:3201647
 11:36 am on Dec 28, 2006 (gmt 0)

I recovered 3 sites from supplemental hell by doing nothing.
Just wait and all will be fine. It's a temporary problem on Google's end.

photopassjapan




msg:3201648
 11:39 am on Dec 28, 2006 (gmt 0)

G does recrawl supplemental pages.
Only at a much lower frequency.

Umbertide




msg:3201653
 11:53 am on Dec 28, 2006 (gmt 0)


>>> I recovered 3 sites from supplemental hell by doing nothing. Just wait and all will be fine. It's a temporary problem on Google's end.

Really? How do you know it's a Google problem? My site is bouncing back and forth in the rankings, in and out. It's driving me nuts!

Dougy




msg:3201726
 2:10 pm on Dec 28, 2006 (gmt 0)

I deleted a badly run forum from my site about a week ago, and now I'm seeing the number of pages indexed drop to around 400. The majority of those pages were supplemental.
I still have around 20 pages of duplicate content articles on my site. Leaving them there and hoping for the best isn't the best idea.

Having read further into the blank (nearly blank) page method, that doesn't seem like a good option.

I'm quite happy to lose these 20 pages, so would simply deleting them be a good option, or would they remain floating around in the supplemental results?

pageoneresults




msg:3201740
 2:27 pm on Dec 28, 2006 (gmt 0)

I recently discovered that a large proportion of my site (600 out of 850 or so pages) is supplemental. My rankings have been slowly dropping.

Did the drop in rankings equate to a drop in ROI?

After reading around, I have learnt that it is probably due to duplicate content on some of my pages.

Can you describe what you consider to be duplicate content?

What I can't seem to find is the best way to get these duplicate content pages out of the supplemental results.

The only people who know about supplemental pages are webmasters like us. The average consumer has no idea what the supplemental index is, nor should they.

The simplest and most effective solution is to remove those pages and serve a 410 Gone.

You could also do what I do in most instances like this and drop a robots directive in the <head></head> of those pages...

<meta name="robots" content="none">

Just cover your butt in regard to that content getting indexed again. Google probably indexed everything the first time around (that could have been months ago), and now it is purging duplicates found through its natural process.

I'm not too certain the "blank page method" is in your best interest, from a variety of viewpoints. For one, it creates a technical nightmare, and two, the maintenance just isn't worth it.

P.S. Don't block that stuff via the robots.txt file. Google already knows about it, so now you have to deal with it at the page level. If you block via robots.txt and also use the meta robots element, it won't work, because Google won't fetch the page and so will never see the meta robots directive. Instead, Google will list those robots.txt-blocked entries as URI-only results in site: searches. The average consumer will not see those unless they are doing site: searches themselves, which few do.
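
As an illustration of the 410 approach (a generic sketch, not pageoneresults' exact setup), a short Apache .htaccess example; the paths are placeholders:

# .htaccess - return "410 Gone" for duplicate pages that have been removed
# (paths are hypothetical; substitute your own URLs)
Redirect gone /duplicate-article.html
Redirect gone /ppc-landing-copy.html

For pages that stay live but should drop out of the index, the meta element above goes in the <head> of each such page; content="none" is generally treated as equivalent to "noindex, nofollow".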

tedster




msg:3201742
 2:28 pm on Dec 28, 2006 (gmt 0)

Lots of relevant information in this thread:
Supplemental Results: What exactly are they [webmasterworld.com]

Note that Google may keep a supplemental version of your URL cached long after you change or exclude the content, or remove/redirect the URL so that it now returns a 404 or 301. So I'd suggest that you fix what you need to fix and move on. There's no need to worry about when the supplemental result (which is a URL plus a cache date) vanishes from public view.
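
If you go the redirect route, a minimal .htaccess sketch of a 301; the paths and domain are placeholders, not URLs from this thread:

# .htaccess - permanently redirect a removed duplicate to the page that replaces it
Redirect permanent /old-duplicate-page.html http://www.example.com/original-page.html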

Dougy




msg:3201760
 2:50 pm on Dec 28, 2006 (gmt 0)

Thanks for the advice

In response to your questions, pageoneresults: the duplicate content is a mixture of articles that were added from article directories and pages that were created as landing pages for PPC campaigns, which were only slightly reworded (me being lazy).
About a year ago I was #1 for a very highly competitive term and ranked well for other search terms. Since then I have bounced around the rankings and have not been receiving much traffic. This has resulted in a loss of revenue.
To be honest, I had heard of supplemental results but never fully understood what they were about until 2 weeks ago. So I am presuming they have had an effect on my rankings. I am determined to get back to where I was a year ago.

pageoneresults




msg:3201768
 3:00 pm on Dec 28, 2006 (gmt 0)

The duplicate content is a mixture of articles that were added from article directories and pages that were created as landing pages for PPC campaigns which were only slightly reworded (me being lazy).

Were those from your own article directories or someone else's? I ask that because if they were someone else's, you have another level of duplicate content to contend with.

Landing pages for PPC typically don't have links to them from the main site. At least that is how I handle them. I don't want those pages indexed, as they typically contain a replication of content from other pages already indexed. Plus, I like to have some control over how those landing pages are tracked, etc. Blocking them from getting indexed from the start prevents some potential issues from arising down the road. ;)
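
A minimal sketch of keeping such landing pages out of the index from the start, assuming they share a template (illustrative markup, not pageoneresults' exact setup):

<!-- placed in the <head> of each PPC landing page template -->
<meta name="robots" content="noindex, nofollow">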

About a year ago I was #1 for a very highly competitive term and ranked well for other search terms. Since then I have bounced around the rankings and have not been receiving much traffic. This has resulted in a loss of revenue.

Being #1 for a highly competitive term returns unbelievable results. That #1 spot is where it happens! But, as the index grows and more competition comes into the space, that #1 spot becomes a target. At some point, you're going to begin to slide, not all the time, but it happens.

The difference in traffic from being #1 as opposed to #2 or #3 is fairly large. Even dropping to a #2 position will cause a slight loss in revenue. It would be nice to see an outline from someone who has watched their positions drop in that kind of order and how it impacted their ROI as each drop occurred.

I have a few pages like that. They've only dropped a couple of times when Google was doing their juggling bit but they bounced right back after all settled down. During the time they drop, there is a noticeable drop in sales.

I'd be concerned about those supplementals and correcting whatever caused them in the first place. Once you've corrected the issue, don't worry about them anymore. Those who are familiar with this topic have observed Google's behavior with supplementals, and the results can stick around for as long as a year or more.

I'd like to add that you should be double and triple checking the server header responses being returned by those Supplemental pages after you've made the corrections.

Dougy




msg:3201799
 3:31 pm on Dec 28, 2006 (gmt 0)

The articles I have are from some of the major article directories you see around. Some of them may still be unique; others are used by a few other sites too. These pages were built when I thought adding any content was good practice.

The landing pages I made were slight variations of the home page, intended to aid conversions for certain search terms. Some of these pages I foolishly linked to from my sitemap.
The #1 ranking I had slid at first to #4, then dropped to around page 3 or 4, and it has remained in that area since.

Once again, many thanks for the advice.

g1smd




msg:3201807
 3:40 pm on Dec 28, 2006 (gmt 0)

In order to know what to fix and how to fix it, you need to know what type of Supplemental Result you have.

Previous threads here should explain the different types and how they arise...

trinorthlighting




msg:3201833
 4:01 pm on Dec 28, 2006 (gmt 0)

One of our sites recently went supplemental due to a shopping cart upgrade. We suspect a sort-by-price-and-product function is causing it, and we have just blocked those URLs via our robots.txt.

The URLs looked like this:

[mysite.com...]

(The one we want to keep in the main index, and the URL that went supplemental on us)

[mysite.com...]
[mysite.com...]
[mysite.com...]
[mysite.com...]

(What we suspect made our URL go supplemental. All these URLs did was sort by product name and price.)

[edited by: trinorthlighting at 4:12 pm (utc) on Dec. 28, 2006]
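
A sketch of what that kind of robots.txt block might look like. The actual URLs are elided in the post, so the parameter name below is a hypothetical stand-in for whatever the cart appends when sorting:

# robots.txt - keep Googlebot away from sorted duplicates of the same listing
# Googlebot supports the * wildcard in Disallow patterns; not all crawlers do
User-agent: Googlebot
Disallow: /*sort=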

photopassjapan




msg:3201835
 4:03 pm on Dec 28, 2006 (gmt 0)

The articles I have are from some of the major article directories you see around.

You know, this is what I kind of figured from the first post.
This is duplicate content.
Those pages should be supplemental.

Write some articles of your own and upload them to replace those pages.
See what happens.

I did hear "I was too lazy to write content, so I copied and pasted it onto my site," didn't I? Wow... sometimes I wonder how many people on here are doing things like that... and then come here for an explanation of WHY their sites don't rank. Sorry if it was a good-will case of duplicated content, but... how many pages on your site are actually like that, percentage-wise?

Okay, I got carried away.
It could be anything... I mean canonical issues, low PR, whatever.
g1smd is right: first you need to know what's causing it, then address the problem. But if the pages are all PR1 and above, and are served on a single URL and in no other way, it's the articles.

shogun_ro




msg:3201850
 4:25 pm on Dec 28, 2006 (gmt 0)

Really? How do you know it's a Google problem? My site is bouncing back and forth in the rankings, in and out. It's driving me nuts!

Really!
I was going to concentrate on recovering the other two sites, which have AdSense on them, since I lost ranks on all five on 17.12.
On these 3 sites I didn't do anything, and yesterday all 3 recovered. These 3 sites don't run AdSense.
Now I'll roll back all the changes I've made on those 2 sites and wait.

atlrus




msg:3201923
 5:21 pm on Dec 28, 2006 (gmt 0)

Our website does rank well. However, about a month ago we translated about 30 pages from our website into 3 foreign languages, and all those pages (the translations only) went supplemental.

We usually translate every new page into those 3 languages. The exception was these 30 pages, which were about a year old; we had created them before we started the translations and thought we might as well have all the pages translated. Well, it was not worth it.

I didn't know that Google considers translations duplicates, even though they are from our own website...

pageoneresults




msg:3201928
 5:25 pm on Dec 28, 2006 (gmt 0)

Our website does rank well. However, about a month ago we translated about 30 pages from our website into 3 foreign languages, and all those pages (the translations only) went supplemental.

That is Google's normal behavior in some instances. Supplemental means many things. In this case, these are new pages that Googlebot has spidered but that don't yet meet the minimum criteria for being pushed into the main index. Give it some time; they will most likely come out of the supplemental index.

Well, it was not worth it.

I think it was. Your visitors didn't appreciate the translations?

I didn't know that Google considers translations duplicates, even though they are from our own website.

They don't consider translations duplicates; they are treated separately. There are some additional things you can do to make sure that they are treated that way. Language headers help to define the page as being in a particular language.
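
A minimal sketch of declaring the language, using a hypothetical French translation as the example (illustrative lines, not taken from the post):

<!-- on the translated page itself -->
<html lang="fr">
<meta http-equiv="content-language" content="fr">

The server can also send a Content-Language: fr HTTP header for the same purpose.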

Frederic1




msg:3202114
 8:03 pm on Dec 28, 2006 (gmt 0)

shogun_ro wrote :
I recovered 3 sites from supplementals hell doing nothing.

How long did you wait?

shogun_ro




msg:3202132
 8:22 pm on Dec 28, 2006 (gmt 0)

How long did you wait?

From 17.12 to 27.12.2006. Ten days.
All sites were listing supplemental results above the index page in a site: search.
Now everything is back to normal, and they rank as they did prior to 17.12.2006.

atlrus




msg:3202216
 10:04 pm on Dec 28, 2006 (gmt 0)

pageoneresults - I think you might be right. We had a new page once that went supplemental in the beginning, but now it's fine. I hope the same thing will happen now, as our foreign visitors come from Google only.

And we do make sure Google knows which pages are in which language, but thanks for the suggestion anyway ;)

Marcia




msg:3202339
 12:14 am on Dec 29, 2006 (gmt 0)

I've had a whole small site come out of supplemental in just a few months' time; in fact, the number of pages indexed has even gone up. It was strictly an IBL (inbound link) issue.

I'd like to add a section or two, but there aren't enough links and not enough PR to support more yet.

g1smd




msg:3202966
 4:41 pm on Dec 29, 2006 (gmt 0)

>> The URLs looked like this: <<

That looks like the standard "duplicate content nightmare" that comes free with every default installation of osCommerce and derivative products.

I have written about that topic several times before. Forums, carts, and CMS packages suffer from the same problems; vBulletin and many other scripted systems are affected as well.
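
One common source of those duplicates in a default osCommerce install is the session ID appended to URLs (the osCsid parameter), which gives every page an unlimited number of crawlable addresses. A sketch of one way to keep those variants out of the crawl; the pattern is illustrative only:

# robots.txt - keep Googlebot away from session-ID variants of each page
User-agent: Googlebot
Disallow: /*osCsid=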

atlrus




msg:3204374
 1:42 pm on Dec 31, 2006 (gmt 0)

Just to follow up: some of those pages have left the supplemental index already. Sweet.

atlrus




msg:3204377
 1:51 pm on Dec 31, 2006 (gmt 0)

Which makes me think, as in our case: could the supplemental index be the new sandbox for internal pages? As in, having a link from an old internal page to a new internal page?

If other people have experienced similar problems, there may be a pattern showing...

JoeHouse




msg:3205081
 6:20 pm on Jan 1, 2007 (gmt 0)

Hi All

I have a website, and about 2 months ago I discovered that ALL pages but the homepage were supplemental.

This was 100% due to duplicate content. I then proceeded to change everything, 2,500+ pages, from duplicate content to totally unique and original copy.

Then today I discovered that hundreds of pages are now coming out of supplemental and back into the main index.

Here is my question though.

I continue to get new products coming from my manufacturers. Literally hundreds of new items at a time.

The problem is that these manufacturers send me product descriptions for each item they sell. In the past I always used their descriptions, but I quickly found out this was causing duplicate content issues, because many of their other customers were using the same manufacturer descriptions.

Because I get so many new products at one time and need to get them displayed as quickly as possible, I was wondering if the idea below is a quick temporary fix to avoid my entire site going supplemental again.

Here is my idea.

Suppose I create a link on my site called "New Items", and every new item that still carries the manufacturer's description falls under this category link.

Seeing that every new item with a manufacturer description would fall under this one link/category page, would this protect the rest of my pages (the 2,500+ pages that now have their own unique content) from going supplemental?

Seeing that Google ranks pages and not websites, I think this would save those 2,500+ pages from going supplemental again, right?

In the meantime I can continue to use the duplicate content that falls under the New Items link, and I can slowly work on the new items to make each item's content unique and original.

Is this a good guideline to follow when you have hundreds of new items that need to be listed quickly?

How are you guys with ecommerce businesses handling this type of situation?

trinorthlighting




msg:3205084
 6:24 pm on Jan 1, 2007 (gmt 0)

Are you the only one selling online for those manufacturers?

JoeHouse




msg:3205087
 6:28 pm on Jan 1, 2007 (gmt 0)

No I am not...there are others.

tedster




msg:3205091
 6:36 pm on Jan 1, 2007 (gmt 0)

There's also some relevant discussion in another thread. It looks like something different may be happening with the supplemental index, but the question is: what is it?

[webmasterworld.com...]

Marcia




msg:3205102
 6:45 pm on Jan 1, 2007 (gmt 0)

The problem is that these manufacturers send me product descriptions for each item they sell. In the past I always used their descriptions, but I quickly found out this was causing duplicate content issues, because many of their other customers were using the same manufacturer descriptions.

There are many online merchants selling the same products using the same descriptions from the manufacturer. How much other text is on those pages, or are those descriptions the only content on the pages?

Also, in view of the problem being pages going supplemental, is there enough PR in the site to support that many pages?

[edited by: Marcia at 6:47 pm (utc) on Jan. 1, 2007]

JoeHouse




msg:3205167
 8:51 pm on Jan 1, 2007 (gmt 0)

Marcia

The site is a PR5, and on the product pages the description is basically the only content displayed on those individual pages.

However, since I redid these product pages with their own unique content, hundreds of pages have started to come out of supplemental and back into the main index.
