Forum Moderators: Robert Charlton & goodroi

Effects of Supplemental Results in the Index

Limbo or Purgatory?

cws3di

7:42 pm on Sep 4, 2005 (gmt 0)

I have been looking around and found a lot of posts scattered across different forum topics that bring up the recurring issue of Supplemental Results in Google, so I am hoping to get some direct discussion going that is focused on the effects people are seeing from Supplemental Results.

As far as I know, pages become listed as Supplemental Results for several different reasons:
1. They return 6 months after the site owner has used the Google Removal Tool
2. They have little or no content, or might even be seen as having some duplicate content
3. They are no longer linked from the top of the site (either the links are broken, or a new website has been launched and the old pages were not deleted from the server).

For Case 1:
I do not want to use the Google Removal Tool again - it seems logical that it would just start a weird three-to-six-month roller-coaster: remove the pages, they come back; remove them again, they come back again.

It seems to me that once pages "return" as Supplemental Results, the entire website goes into a sort of Limbo state - as if Googlebot is stuck and no longer spidering the site. After a lengthy period without a fresh cache, the "good" pages seem to drop off to URL-only listings in a site: search.

Another effect I see is that the Supplemental Results sometimes show up in the serps even though those pages may no longer exist (they return 404). They also seem to rank higher in the serps than the "good" pages - maybe because they have been in the Google index longer, and so have a better "historical" rank component?

I have had some luck kick-starting Googlebot into spidering again by putting backlinks on my other websites that point to the OLD pages - it seems that once Googlebot starts spidering the Supplemental Results and finds true 404s, it will go ahead and start spidering the "good" part of the site again too.

This, of course, takes a long time to get going.

For Case 2:
It is probably a good idea to get rid of (return 404 for) any pages that might have duplicate content, and pages with little or no content seem pretty useless too.
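One thing worth double-checking when you retire pages this way: that the server actually answers with a real 404 status code, not a "soft 404" (a custom error page served with HTTP 200), since a 200 response could keep the page in the index. Here is a minimal sketch using only Python's standard library; the URL below is a placeholder, not a real page:

```python
# Check that retired pages answer with a genuine 404 status code,
# rather than a "soft 404" (an error page served with HTTP 200).
from urllib.request import urlopen
from urllib.error import HTTPError

def http_status(url):
    """Return the HTTP status code the server sends for url."""
    try:
        return urlopen(url, timeout=10).getcode()
    except HTTPError as err:
        # urllib raises for 4xx/5xx responses; the code is on the exception
        return err.code

if __name__ == "__main__":
    # Placeholder URL - substitute your own retired pages here.
    for url in ["http://www.example.com/old-page.html"]:
        code = http_status(url)
        flag = "OK" if code == 404 else "CHECK"
        print(f"{flag}: {url} -> {code}")
```

Any page that prints something other than 404 (or 410) is one the server is still "serving", and presumably one Googlebot can still index.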

For Case 3:
Once again, leaving dead-end pages on the server gets them demoted to Supplemental Results. But after another index update cycle, they seem to remain Supplemental Results yet start showing up in the serps again, often at higher positions than the "good" pages of the new website. My guess is that they are given a higher "historical" component because they are older (have been in the Google index longer).

I don't think anybody wants their dead-end pages to be the first thing their users are presented with, right?

Anybody have any better ideas or experiences with these Supplemental Results? I think very few of us want them there - we need a clean way to get rid of them.