Forum Moderators: Robert Charlton & goodroi
I recently discovered that a large proportion of my site (600 out of 850 or so pages) is supplemental. My rankings have been slowly dropping.
After reading around, I have learned that it is probably due to the duplicate content I have on some of my pages.
What I can't seem to find is the best solution for getting these duplicate-content pages out of the supplemental results.
Do I:
A) Rewrite the content on these pages and wait for them to be re-crawled
B) Delete these pages and any internal links to them
C) Use robots.txt to block the spiders accessing these pages
D) Use the blank page method as discussed [webmasterworld.com...]
E) Forget that site and concentrate on another site
Any advice would be much appreciated.
B) Delete these pages and any internal links to them
- Won't help; they will continue to be in Google's index for quite some time. In fact, a 404 is one of the major reasons why pages go supplemental.
C) Use robots.txt to block the spiders accessing these pages
- This is a good idea if none of these pages are ranking for any of your keywords and you can afford to lose them from the index completely.
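If you do go the robots.txt route, the syntax is minimal. A sketch; the directory and filenames below are placeholders standing in for your own duplicate-content URLs:

```text
# robots.txt sketch: block all spiders from the duplicate pages.
# /articles/ and the two filenames are placeholder paths.
User-agent: *
Disallow: /articles/
Disallow: /landing-page-1.html
Disallow: /landing-page-2.html
```

Note that this only stops crawling going forward; it doesn't by itself remove what is already indexed.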
D) Use the blank page method as discussed [webmasterworld.com...]
- Probably the best way; works great if you want to retain the pages in Google's index.
E) Forget that site and concentrate on another site
- If your site isn't doing well in Google's rankings, why not? :-) But then you will have to rewrite all the pages and meta tags for the new site.
Having read further into the blank (nearly blank) page method, that doesn't seem like a good option.
I'm quite happy to lose these 20 pages, so would simply deleting them be a good option, or would they remain floating around in the supplemental results?
I recently discovered that a large proportion of my site (600 out of 850 or so pages) is supplemental. My rankings have been slowly dropping.
Did the drop in rankings equate to a drop in ROI?
After reading around, I have learned that it is probably due to the duplicate content I have on some of my pages.
Can you describe what you are considering duplicate content?
What I can't seem to find is the best solution for getting these duplicate-content pages out of the supplemental results.
The only people who know about Supplemental pages are us. The average consumer has no idea what the Supplemental index is and nor should they.
The simplest and most effective solution is to remove those pages and serve a 410 gone.
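On Apache, mod_alias can return that 410 with a one-line directive per page. A minimal sketch, assuming Apache with .htaccess overrides allowed; the paths are placeholders for the pages being removed:

```apache
# .htaccess sketch (assumes Apache with mod_alias and AllowOverride enabled).
# The paths below are placeholders for your removed duplicate pages.
Redirect gone /old-duplicate-article.html
RedirectMatch gone ^/ppc-landing/
```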
You could also do what I do in most instances like this and drop a robots directive in the <head></head> of those pages:
<meta name="robots" content="none">
Just to cover your butt in regards to that content getting indexed again. Google probably indexed everything the first time around, which could have been months ago, and now it is purging duplicates found through its natural process.
I'm not too certain the "blank page method" is in your best interest, from a variety of viewpoints. For one, it creates a technical nightmare, and two, the maintenance just isn't worth it.
P.S. Don't block that stuff via the robots.txt file. Google already knows about it, so now you have to deal with it at the page level. If you block via robots.txt and also use the meta robots element, it won't work, as Google won't see the meta robots directive. And Google will list those robots.txt entries as URI-only results in site: searches. The average consumer will not see those unless they are doing site: searches themselves, which few do.
Note that Google may keep a supplemental version of your URL cached long after you change or exclude the content, or remove/redirect the URL so that it now returns a 404 or 301. So I'd suggest that you fix what you see you need to fix and move on. No need to worry about when the supplemental result (which is a URL PLUS a cache date) vanishes from public view.
In response to your questions, PageOneResults: the duplicate content is a mixture of articles that were added from article directories and pages that were created as landing pages for PPC campaigns, which were only slightly reworded (me being lazy).
About a year ago I was #1 for a very highly competitive term and ranked well for other search terms. Since then I have bounced around the rankings and not been receiving much traffic. This has resulted in a loss of revenue.
To be honest, I had heard of supplemental results but never fully understood what they were about until 2 weeks back, so I am presuming they have had an effect on my rankings. I am determined to get back to where I was a year ago.
The duplicate content is a mixture of articles that were added from article directories and pages that were created as landing pages for PPC campaigns, which were only slightly reworded (me being lazy).
Were those from your own article directories or someone else's? I ask that because if they were someone else's, you have another level of duplicate content to contend with.
Landing pages for PPC typically don't have links to them from the main site; at least that is how I handle them. I don't want those pages indexed, as they typically contain a replication of content from other pages already indexed. Plus, I like to have some control over how those landing pages are tracked, etc. Blocking them from getting indexed from the start prevents some potential issues from arising down the road. ;)
About a year ago I was #1 for a very highly competitive term and ranked well for other search terms. Since then I have bounced around the rankings and not been receiving much traffic. This has resulted in a loss of revenue.
Being #1 for a highly competitive term returns unbelievable results. That #1 spot is where it happens! But, as the index grows and more competition comes into the space, that #1 spot becomes a target. At some point, you're going to begin to slide, not all the time, but it happens.
The difference in traffic from being #1 as opposed to #2 or #3 is fairly large. Even dropping to the #2 position will cause a slight loss in revenue. It would be nice to see an outline from someone who has watched their positions drop in that order and how it impacted their ROI as each drop occurred.
I have a few pages like that. They've only dropped a couple of times when Google was doing their juggling bit but they bounced right back after all settled down. During the time they drop, there is a noticeable drop in sales.
I'd be concerned about those Supplementals and correcting whatever caused them in the first place. Once you've corrected the issue, don't worry about them anymore. Those who are familiar with this topic have observed Google's behavior with Supplementals and they could be there for as long as a year or more.
I'd like to add that you should be double- and triple-checking the server header responses being returned by those supplemental pages after you've made the corrections.
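A quick way to do that checking in bulk is a small script. This is a sketch using only Python's standard library; the URL in the usage comment is a placeholder:

```python
# Sketch: report the HTTP status code a URL returns, so you can confirm
# that removed pages really serve 404/410 and kept pages serve 200.
# Standard library only; the URL in the usage comment is a placeholder.
import http.client
from urllib.parse import urlsplit

def status_of(url, timeout=10):
    """Return the HTTP status code for url, without following redirects."""
    parts = urlsplit(url)
    conn_class = (http.client.HTTPSConnection
                  if parts.scheme == "https"
                  else http.client.HTTPConnection)
    conn = conn_class(parts.netloc, timeout=timeout)
    try:
        # Preserve any query string (e.g. sort parameters) in the request.
        path = parts.path or "/"
        if parts.query:
            path += "?" + parts.query
        conn.request("HEAD", path)
        return conn.getresponse().status
    finally:
        conn.close()

# usage (placeholder URL):
#   status_of("http://www.example.com/removed-page.html")
```

Run it over the list of corrected URLs and anything not returning the status you expect is worth a second look.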
The landing pages I made were slight variations of the home page, to attempt to aid conversions for certain search terms. Some of these pages I foolishly linked to from my sitemap.
The #1 ranking I had at first slid to #4, then dropped to around page 3/4, and has remained in that area since.
Once again, many thanks for the advice.
The URLs looked like this:
[mysite.com...]
(The one we want to keep in the main index, and the URL that went supplemental on us)
[mysite.com...]
[mysite.com...]
[mysite.com...]
[mysite.com...]
(What we suspect made our URL go supplemental: all these URLs did was sort the products by name or price)
[edited by: trinorthlighting at 4:12 pm (utc) on Dec. 28, 2006]
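For what it's worth, URLs like those that differ only by sort parameters can be kept out of the crawl with wildcard patterns in robots.txt, which Googlebot supports. A sketch; "sort" is a placeholder for whatever parameter your cart actually appends:

```text
# robots.txt sketch: block crawling of sorted views of category pages.
# "sort" is a placeholder parameter name; adjust to match your URLs.
User-agent: Googlebot
Disallow: /*?sort=
Disallow: /*&sort=
```

Per the earlier caveat, this is only appropriate if you can afford to have those sorted views out of the index entirely.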
The articles I have are from some of the major article directories you see around.
You know, this is what I kind of figured from the first post.
This is duplicate content.
Those pages should be supplemental.
Write some articles and upload them to replace the same pages.
See what happens.
I did hear "I was too lazy, so I copy-pasted content onto my site," didn't I? Wow... sometimes I wonder how many people on here are doing things like that... then come here for an explanation of WHY their sites don't rank. Sorry if it was a good-faith case of duplicated content, but... how many pages on your site are actually like that, percentage-wise?
Okay, got carried away.
Could be anything... I mean canonicals, low PR, whatever.
g1smd is right: first you need to know what's causing it, then address the problem. But if they're all PR1 and above, and are served on a single URL and in no other way, it's the articles.
Really? How do you know it's a Google problem? My site is back and forth in the rankings, in and out. Driving me nuts!
Really!
I was going to concentrate on recovering the other two sites, which have AdSense on them, since I lost rankings on all five on 17.12.
On those 3 sites I didn't do anything, and yesterday all 3 recovered. Those 3 sites don't run AdSense.
Now I'll roll back all the changes I've made on those 2 sites and wait.
We usually translate every new page into those 3 languages. The exception was the 30 pages, which were about a year old; we created them before we started the translations, and we thought we might as well have all the pages translated. Well, it was not worth it.
I didn't know that Google considers translations a duplicate, even though they are from our own website...
Our website does rank well; however, about a month ago we translated about 30 pages from our website into 3 foreign languages, and all those pages (the translations only) went supplemental.
That is Google's normal behavior in some instances. Supplementals mean many things. In this case, they are new pages that Googlebot has spidered but that don't yet meet the minimum criteria for being pushed into the main index. Give it some time; they will most likely come out of Supplemental.
Well, it was not worth it.
I think it was. Your visitors didn't appreciate the translations?
I didn't know that Google considers translations a duplicate, even though they are from our own website.
They don't consider translations duplicate content; they are treated separately. There are some additional things you can do to make sure that they are. Language headers help to define the page as being in a particular language.
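For example, one place to declare the language is in the page itself. A sketch; "de" and the title are placeholders for one of your three translations:

```html
<!-- Sketch: declaring the language in the page itself.
     "de" is a placeholder language code for one of the translations. -->
<html lang="de">
  <head>
    <meta http-equiv="content-language" content="de">
    <title>Beispielseite</title>
  </head>
  <body>...</body>
</html>
```

The server can also send an equivalent Content-Language response header, which accomplishes the same thing at the HTTP level.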
And we do make sure G knows which pages are in what language, but thanks for the suggestion anyway ;)
That looks like the standard "duplicate content nightmare" that comes free with every default installation of osCommerce and derivative products.
I have written about that topic several times before. Forums, carts, and CMS platforms, vBulletin, and many other scripted systems all suffer the same problems.
I have a website, and about 2 months ago I discovered ALL but the homepage was supplemental.
This was 100% due to duplicate content. I then proceeded to change everything, 2500+ pages, from duplicate content to totally unique and original.
Then today I discovered hundreds of pages are now coming out of supplemental and back into the main index.
Here is my question though.
I continue to get new products coming from my manufacturers. Literally hundreds of new items at a time.
The problem is that these manufacturers send me product descriptions for each item they sell. In the past I always used their descriptions, but I quickly found out it was causing duplicate content issues, because many of their other customers were using these same manufacturer descriptions.
Because I get so many new products at one time and need to get them displayed as quickly as possible, I was wondering if the idea below is a temporary quick fix to avoid my entire site going supplemental again.
Here is my idea.
I could create a link on my site called "New Items", and every new item with the manufacturer description on it would fall under this category link.
Seeing that every new item with a manufacturer description falls under this one link/category page, would this protect the rest of my pages (the 2500+ pages that now have their own unique content) from going supplemental?
Seeing that Google ranks pages and not websites, I think this would save those 2500+ pages from going supplemental again, right?
In the meantime I can continue to use the duplicate content that falls under the New Items link, and I can work on the new items slowly to make each item's content unique and original.
Is this a good guideline to follow when having hundreds of new items that need to get listed quickly?
How are you guys with ecommerce businesses handling this type of situation?
[webmasterworld.com...]
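One way to implement that "New Items" idea without hiding the pages from visitors is a meta robots directive in the template for the not-yet-rewritten pages, removed once each description is unique. A sketch:

```html
<!-- Sketch: placed in the <head> of each "New Items" product page that
     still carries the stock manufacturer description; remove this tag
     once the description has been rewritten to be unique. -->
<meta name="robots" content="noindex,follow">
```

The "follow" keeps link value flowing through those pages while they sit out of the index.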
The problem is that these manufacturers send me product descriptions for each item they sell. In the past I always used their descriptions, but I quickly found out it was causing duplicate content issues, because many of their other customers were using these same manufacturer descriptions.
Also, in view of the problem being pages going Supplemental, is there enough PR in the site to support that many pages?
[edited by: Marcia at 6:47 pm (utc) on Jan. 1, 2007]