Forum Moderators: Robert Charlton & goodroi


Supplemental Pages - revisiting the causes


pazang

3:59 pm on Jul 15, 2007 (gmt 0)

10+ Year Member



For some reason, over 100 of my pages have moved into the supplemental index.

The fixes seem to be

Remove duplicate content
Develop incoming links
make pages W3C compliant

Does anyone know of any other ways to get out of this type of listing?

And can anyone remember the name of the elusive programme where you can register content to avoid people duplicating it?

tedster

5:16 pm on Jul 15, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



W3C compliance is not really an issue here - but it is important to fix gross html errors that might block full spidering of the page.

One more important reason for pages being put into the supplemental index is low PageRank (not enough link juice passed to them). This can sometimes be helped simply through improvements in the site's internal link structure.
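The point about internal link structure and "link juice" can be illustrated with a toy PageRank calculation. This is only a sketch - the page names and link graph below are hypothetical, not from any real site - but it shows why a page reachable only through a deep chain of links ends up with far less PageRank than pages linked directly from the home page.

```python
# Minimal power-iteration PageRank over a hypothetical internal link graph.
# Page names and link structure are made up for illustration.

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            # Each page splits its damped rank evenly among its outlinks.
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# "deep" is linked from only one inner page, while "home" is linked
# from everywhere, so "deep" collects much less PageRank.
site = {
    "home": ["about", "products"],
    "about": ["home"],
    "products": ["home", "deep"],
    "deep": ["home"],
}
ranks = pagerank(site)
```

Adding a link from the home page directly to "deep" in this toy graph noticeably raises its rank, which is the kind of internal-linking fix tedster is describing.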

Also, don't be fooled by the phrase "duplicate content" - this can be "duplicate urls" that resolve to the same content on your site. Supplemental urls created by duplicates are more often a matter of how your own server is configured than the fault of plagiarists. One site that can help you with plagiarism is copyscape.com

Another type of duplicate issue, for pages with borderline PR (3 or so), is duplicate titles and/or meta descriptions. Google often doesn't give low PR pages full indexing treatment and tends to grab a quick indexing of just those two areas.
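Spotting duplicate titles across a site is easy to automate. Here is a quick sketch - the URLs and titles are invented for the example - that groups pages by title text so any title shared by more than one URL stands out:

```python
# Group hypothetical page URLs by their <title> text to spot duplicates.
# The URL/title data below is made up for illustration.
from collections import defaultdict

pages = {
    "/widgets/red": "Red Widget - Example Store",
    "/widgets/blue": "Blue Widget - Example Store",
    "/widgets/red?sort=price": "Red Widget - Example Store",
}

by_title = defaultdict(list)
for url, title in pages.items():
    by_title[title].append(url)

# Any title pointing at more than one URL is a duplicate-title problem.
duplicates = {t: urls for t, urls in by_title.items() if len(urls) > 1}
```

The same grouping works for meta descriptions - just swap in the description text as the key.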

There are several helpful threads in our Hot Topics [webmasterworld.com] area, which is always pinned to the top of this forum's index page.

And finally, supplemental pages are now being seen in search results more frequently than they were last year, and they're being spidered more frequently as well. So it's not the same "kiss of death" that it used to be to see pages get the supplemental tag.

g1smd

7:17 pm on Jul 15, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That about sums most of it up, as I see it.

tedster

10:32 pm on Jul 15, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I do have one other suspicion about causes for Supplemental status -- it's not proven, and possibly not applicable to many sites. I discussed it here: Supplemental status - from "low impressions" in search results? [webmasterworld.com]

new_seo

5:27 am on Jul 16, 2007 (gmt 0)

10+ Year Member



1. Same Title and Heading tags on different pages.
2. Canonical issues (like www.example.com, example.com/, www.example.com/index.html, example.com/home.asp)
These can also be reasons for the supplemental index.
Verify these as well. :)
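The canonical variants listed above all resolve to the same content, and a normalization step shows how they collapse into one URL. This is only a sketch of the idea (in practice the fix is a server-side 301 redirect to the preferred hostname); the host name is assumed to be www.example.com:

```python
# Collapse the canonical-variant URLs from the post above into one form.
# A sketch of the normalization only; real sites should 301-redirect instead.
from urllib.parse import urlsplit

def canonicalize(url, host="www.example.com"):
    parts = urlsplit(url if "://" in url else "http://" + url)
    path = parts.path
    # Treat the bare root and default-document names as the same page.
    if path.rstrip("/") in ("", "/index.html", "/home.asp"):
        path = "/"
    return "http://%s%s" % (host, path)

variants = [
    "www.example.com",
    "example.com/",
    "www.example.com/index.html",
    "example.com/home.asp",
]
canonical = {canonicalize(v) for v in variants}
```

All four variants map to a single canonical URL, which is exactly what the redirect fix achieves for the search engine.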

zaneta

12:55 pm on Jul 19, 2007 (gmt 0)

10+ Year Member



Tedster,

what exactly do you mean when you say

"fix gross html errors that might block full spidering of the page"

Can you give an example of serious html errors? (putting javascript and .swf flash objects aside)

Thank you.

g1smd

1:40 pm on Jul 19, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Tedster probably means stuff like unclosed tags and so on.

gibbergibber

1:49 pm on Jul 19, 2007 (gmt 0)

10+ Year Member



--make pages W3C compliant--

I don't think Google takes that into account much, if at all. Only a minority of website maintainers are even aware of what the W3C is, let alone how to make their site compliant.

And as far as the average user is concerned compliance makes no difference to them, as long as the page renders in their browser.

g1smd

2:12 pm on Jul 19, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



They do not "take it into account" as far as giving the page a score for W3C-ness goes. But if you have an HTML coding error that stops the bot from spidering the whole page, then you cannot rank for any of the content that the bot didn't spider and index.

Tonearm

3:30 pm on Jul 19, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've been having some Google problems and I just traced them to Supplemental Pages. Google says I have 875 pages, and only about 190 are not tagged Supplemental.

Some of the Supplemental results are because of pagination and some are product pages that are only linked to from the sitemap because they are discontinued, but the vast majority deserve to be non-supplemental. After some digging around, I have two suspected causes.

1. The internal links pointing to them come from supplemental pages.

2. Maybe Google is REALLY sensitive about duplicate descriptions. All of my descriptions are written with this in mind:

The WIDGET_NAME looks DESCRIPTION.

Could that description format be triggering Google's duplicate filter? WIDGET_NAME and DESCRIPTION are both unique, but "The" and "looks" are always the same.
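One way to sanity-check that worry is to measure how much two templated descriptions actually overlap. The sketch below (with made-up widget descriptions) computes word-level Jaccard similarity between two descriptions that share only the template words "The" and "looks" - the overlap comes out very low, since the unique parts dominate:

```python
# Word-level Jaccard similarity between two templated descriptions.
# The widget names and descriptions are invented for illustration.

def jaccard(a, b):
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

d1 = "The Acme Sprocket looks sturdy and finished in brushed steel."
d2 = "The Zenith Flange looks delicate with a polished copper rim."

similarity = jaccard(d1, d2)
```

With only two shared words out of eighteen distinct ones, the similarity is around 0.11 - far from what any reasonable duplicate filter would flag.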

Once I fix the issue(s) causing the supplemental status, when can I expect the pages to be set free?

Tonearm

8:53 pm on Jul 19, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I read the sticky Supplemental thread, and it sounds like my problem is most likely based on insufficient PR making its way to the Supplemental pages, and insufficient content on the Supplemental pages.

My home page has the most PR by far (3/10), so I'm considering adding a lot more internal links to the home page. I'll also write more content.

johnblack

9:44 pm on Jul 19, 2007 (gmt 0)



Tonearm when you say

so I'm considering adding a lot more internal links to the home page

Do you mean links ON the home page linking to the supplemental pages? I know it may sound stupid, but I just wanted to clarify for myself as I have supplemental issues.

Cheers

Tonearm

9:53 pm on Jul 19, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, that really wasn't clear. I'm thinking of adding links on the home page that point to the supplemental pages.

I also wonder if my supplemental issues could have to do with adding too many pages too quickly. My pages increased from about 250 to about 500 in about a month.

tedster

4:16 am on Jul 20, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Tedster probably means stuff like unclosed tags and so on.

Exactly -- unclosed tags, unclosed quotes (especially on attributes), missing one of the angle brackets around a tag, etc. Little niggling things that are very hard to see with the eye, but that can cause a problem if they occur near some critical content.
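Errors like these can be caught programmatically. Here is a rough sketch using Python's standard-library HTMLParser that keeps a stack of open tags and reports closes that don't match - the sample markup, with its unclosed div and p, is invented for the example, and the checker is deliberately simple (it only handles properly nested closes and a short list of void elements):

```python
# Flag tags that are opened but never properly closed -- the kind of
# "gross HTML error" described above. Sample markup is illustrative.
from html.parser import HTMLParser

# Void elements legally have no closing tag.
VOID = {"br", "hr", "img", "input", "meta", "link"}

class UnclosedTagChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack = []     # currently open tags
        self.problems = []  # mismatched close tags seen

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.problems.append("</%s> does not match open <%s>"
                                 % (tag, self.stack[-1] if self.stack else "nothing"))

checker = UnclosedTagChecker()
checker.feed("<html><body><div><p>Some critical content</body></html>")
# After feeding, anything left on the stack was never closed in order.
unclosed = checker.stack
```

Running it on the sample leaves the unclosed div and p on the stack and records the two mismatched closes - exactly the kind of niggling error that is hard to spot by eye.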