The fixes seem to be:
Remove duplicate content
Develop incoming links
Make pages W3C compliant
Does anyone know of any other ways to get out of this type of listing?
And can anyone remember the name of the elusive programme where you can register content to avoid people duplicating it?
One more important reason for pages being put into the supplemental index is low PageRank (not enough link juice passed to them). This can sometimes be helped simply through improvements in the site's internal link structure.
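If you want to put numbers on that, here's a rough Python sketch that crawls a site and counts how many internal links point at each page - the pages near the bottom of the list are the ones being starved of link juice. The start URL and page limit are placeholders, and a real crawl would also want politeness delays and robots.txt handling:

```python
import urllib.request
import urllib.parse
from collections import Counter, deque
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def audit(start_url, max_pages=200):
    host = urllib.parse.urlparse(start_url).netloc
    inlinks = Counter()
    seen, queue = {start_url}, deque([start_url])
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            # Resolve relative links and drop fragments; count same-host targets only
            target = urllib.parse.urljoin(url, href).split("#")[0]
            if urllib.parse.urlparse(target).netloc == host:
                inlinks[target] += 1
                if target not in seen:
                    seen.add(target)
                    queue.append(target)
    # Pages with the fewest internal inlinks are candidates for better linking
    for url, count in sorted(inlinks.items(), key=lambda kv: kv[1])[:20]:
        print(count, url)

audit("http://www.example.com/")  # hypothetical start URL
```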
Also, don't be fooled by the phrase "duplicate content" - this can also mean "duplicate urls" that resolve to the same content on your site. Getting supplemental urls from duplicates is more often a matter of how your own server is set up than the fault of plagiarists. One site that can help you with plagiarism is copyscape.com
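One quick way to catch that on your own server is to group urls by a hash of the returned page - any group with more than one url is the same content living at several addresses. A minimal sketch; the url list below is just a made-up example of the usual suspects (trailing slash, sort parameters):

```python
import hashlib
import urllib.request
from collections import defaultdict

# Hypothetical urls that may all resolve to the same page
urls = [
    "http://www.example.com/widgets",
    "http://www.example.com/widgets/",
    "http://www.example.com/widgets?sort=name",
]

by_hash = defaultdict(list)
for url in urls:
    body = urllib.request.urlopen(url, timeout=10).read()
    by_hash[hashlib.md5(body).hexdigest()].append(url)

for digest, group in by_hash.items():
    if len(group) > 1:
        print("Same content at:", ", ".join(group))
```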
Another type of duplicate issue, for pages with borderline PR (3 or so), is duplicate titles and/or meta descriptions. Google often doesn't give low PR pages the full indexing treatment, and instead tends to grab a quick indexing of just these two areas.
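A rough way to audit that across a site: pull the title and meta description from every page and flag any value that shows up more than once. A sketch, with a hypothetical url list:

```python
import urllib.request
from collections import defaultdict
from html.parser import HTMLParser

class HeadExtractor(HTMLParser):
    """Grabs the title text and the meta description from a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if (d.get("name") or "").lower() == "description":
                self.description = d.get("content") or ""
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

urls = [  # in practice, feed in every url on the site
    "http://www.example.com/page1.html",
    "http://www.example.com/page2.html",
]
titles, descriptions = defaultdict(list), defaultdict(list)
for url in urls:
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    parser = HeadExtractor()
    parser.feed(html)
    titles[parser.title.strip()].append(url)
    descriptions[parser.description.strip()].append(url)

for text, pages in list(titles.items()) + list(descriptions.items()):
    if len(pages) > 1:
        print("Shared by %d pages: %r" % (len(pages), text))
```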
There are several helpful threads in our Hot Topics [webmasterworld.com] area, which is always pinned to the top of this forum's index page.
And finally, supplemental pages are now being seen in search results more frequently than they were last year, and they're being spidered more frequently as well. So it's not the same "kiss of death" that it used to be to see pages get the supplemental tag.
I don't think Google takes that into account much, if at all. Only a minority of website maintainers are even aware of what the W3C is, let alone how to make their site compliant.
And as far as the average user is concerned, compliance makes no difference as long as the page renders in their browser.
Some of the Supplemental results are because of pagination and some are product pages that are only linked to from the sitemap because they are discontinued, but the vast majority deserve to be non-supplemental. After some digging around, I have two suspected causes.
1. The internal links pointing to them come from supplemental pages.
2. Maybe Google is REALLY sensitive about duplicate descriptions. All of my descriptions are written with this in mind:
The WIDGET_NAME looks DESCRIPTION.
Could that description format be triggering Google's duplicate filter? WIDGET_NAME and DESCRIPTION are both unique, but "The" and "looks" are always the same.
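One way to gauge how similar your templated descriptions actually are is Jaccard similarity on their word sets. A back-of-the-envelope sketch - both sample descriptions are invented, but they follow the template above, and a low score would suggest the template itself contributes only a few shared words:

```python
def jaccard(a, b):
    """Fraction of words the two texts have in common."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

d1 = "The Acme Sprocket looks sleek and modern with a brushed steel finish."
d2 = "The Zenith Gearbox looks rugged and industrial with a cast iron body."
print(round(jaccard(d1, d2), 2))  # the template contributes only a few shared words
```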
Once I fix the issue(s) causing the supplemental status, when can I expect the pages to be set free?
My home page has the most PR by far (3/10), so I'm considering adding a lot more internal links to the home page. I'll also write more content.
so I'm considering adding a lot more internal links to the home page
Do you mean links ON the home page linking to the supplemental pages? I know it may sound stupid, but I just wanted to clarify for myself as I have supplemental issues.
Cheers
Tedster probably means stuff like unclosed tags and so on.
Exactly -- unclosed tags, unclosed quotes (especially on attributes), missing one of the angle brackets around a tag, etc. Little niggling things that are very hard to see with the eye, but that can cause a problem if they occur near some critical content.
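If you'd rather not hunt for those by eye, here's a small Python sketch that walks the markup with the standard library's html.parser and flags tags that never get closed. It's only a crude heuristic - a real validator like the W3C's checks far more - and the sample markup is made up:

```python
from html.parser import HTMLParser

# Elements that never take a closing tag
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "param", "source", "track", "wbr"}

class TagChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []  # open tags with the line where they opened

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append((tag, self.getpos()))

    def handle_endtag(self, tag):
        if tag in VOID:
            return
        if tag in (t for t, _ in self.stack):
            # Pop back to the matching open tag, flagging anything left open
            while self.stack[-1][0] != tag:
                t, (line, _) = self.stack.pop()
                print("Unclosed <%s> opened at line %d" % (t, line))
            self.stack.pop()
        else:
            print("Stray </%s> at line %d" % (tag, self.getpos()[0]))

checker = TagChecker()
checker.feed("<html><body><p>Hello <b>world</p></body></html>")
for tag, (line, _) in checker.stack:
    print("Never closed: <%s> opened at line %d" % (tag, line))
```

Running it on the sample markup flags the <b> that never gets closed - exactly the kind of little niggling thing described above.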