Forum Moderators: Robert Charlton & goodroi

Message Too Old, No Replies

Supplemental Results

Google Search "Supplemental Results"


ak47mars

7:16 am on Nov 4, 2005 (gmt 0)

10+ Year Member



Does anyone know what "Supplemental Result" means when you do a search on google?

g1smd

11:24 pm on Nov 5, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It is a page with an old cache, a page where they froze the snippet and cache for all time. You see them a lot when a site has duplicate content, or the site is no longer online.

If the page is still online, the supplemental result may represent an older version of it; the page can still appear as a normal result when you search for content that is actually on the live page at that time.

ak47mars

3:31 am on Nov 7, 2005 (gmt 0)

10+ Year Member



But I'm still puzzled.
The old pages were deleted a long time ago, so why hasn't Google indexed my new pages? The old ones still show up when you search site:www.domain.com in Google.

g1smd

8:17 pm on Nov 7, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Good question. See posts #25, #37 and #400 in this thread:

[webmasterworld.com...]

dgdclynx

7:51 am on Nov 8, 2005 (gmt 0)

10+ Year Member



I, too, had been puzzled about Supplemental Result being attached to my files. But now I see it is a consequence of my having duplicate text, this being a literary site, which caused my downfall under Bourbon. Jagger hasn't affected me at all yet. The Googlebots still visit, but not as much as before.

cleanup

10:07 am on Nov 8, 2005 (gmt 0)

10+ Year Member



My site went Supplemental some time at the end of Sept.

Still no idea why. There are snippets copied all over the net, but the site has been around for years and nothing has changed recently for this to happen now.

So, anyway, as I was saying: supplemental... and all the pages are listed as non-www.

So what to do? I have added a 301 redirect, as everyone says should be done.

What else must I do now? The site only has about 50 pages; do I have to tell Google or something?

Do people ever recover from these situations, or do I just block Googlebot and forget about the site ever appearing in Google again!

Would dearly like to nurse this inoffensive, useful little site back to life if at all possible. :/
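For anyone wondering what "the 301 redirect everyone says should be done" looks like: a minimal .htaccess sketch, assuming Apache with mod_rewrite enabled ("mysite.com" is a placeholder; substitute the real domain):

```apache
# 301-redirect every request for the bare (non-www) host
# to the canonical www host, preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mysite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mysite.com/$1 [R=301,L]
```

The permanent (301) status is the important part: it tells Google the non-www URLs have moved for good, so the two hostnames stop being treated as duplicate sites.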

bumpski

3:11 pm on Nov 8, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've tried to post this as a new topic, but to no avail.

Does anyone know how to filter results so that only "supplemental" results show? Or only non-supplemental results?

g1smd

11:54 pm on Nov 8, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



There is no such filter, but the new Google results for site:domain.com searches seem to list fully indexed results first and supplemental results last. That only helps if your site has fewer than 1000 pages, though...

annej

11:57 pm on Nov 9, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The way I check whether a page has gone supplemental is to take a sentence from the page; I try to find one that wouldn't likely appear anywhere else. I put it in quotation marks and search on Google. Usually that gets it down to just a few pages; then I click on "repeat the search with the omitted results included" to get the supplemental ones.

world3d

12:17 am on Nov 10, 2005 (gmt 0)

10+ Year Member



I'm not sure if this is off-topic or not...
In searching for supplemental results on my own site, I have found that a good dozen other sites have copied some of my text word for word. Could this cause a duplicate-content penalty for me? Does Google likely recognize that I wrote it first? Do I need to change my text?
I was the only site on page one of the SERPs for my keyword to drop (from 3 to 10) in Jagger. Any insight into this would be greatly appreciated!

ak47mars

12:47 am on Nov 10, 2005 (gmt 0)

10+ Year Member



only helps if your site has less than 1000 pages though...

I can't agree with you, because my website is dynamic.
It has over 70,000 pages, and the whole site was updated three months ago. But Google hasn't indexed the other new pages; please tell me why?
Thank you.

g1smd

1:18 am on Nov 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> only helps if your site has less than 1000 pages though <<

My response was answering someone who was asking if there is a way to find ONLY supplemental results. There isn't, but the new SERPs now order normal results first and supplemental results last. For a site with 10 000 pages and 5 000 supplemental results, you will therefore never see the supplemental results in a site: search. So, this ordering, while useful to group the results, only helps if your site has less than 1000 pages (if it is just the supplemental results that you want to see).

ak47mars

1:26 am on Nov 10, 2005 (gmt 0)

10+ Year Member



Sorry, I'm getting very impatient now. Of my 70,000+ pages, three months later Google has indexed fewer than 5,000, and, more importantly, over 4,000 of those are supplemental results.
Is there a way to remove them?
How do I get Google to index my new pages?

g1smd

1:28 am on Nov 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I would assume they think the pages are far too similar to all be listed.

Are you sure that each title tag and each meta description is different for every page of the site?

ak47mars

1:54 am on Nov 10, 2005 (gmt 0)

10+ Year Member



Because my website is dynamic, the titles and descriptions differ only in the important keywords. Similar on-page layout is unavoidable, but the content is not the same.

annej

4:02 am on Nov 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



a good dozen other sites have copied some of my text word for word

I find that several scraper sites have copied either the first line of my articles or the meta description. In fact, searching for my meta descriptions is a good way to find scrapers that have linked to my pages. Some don't even link; they just copy text.

I don't think Google will penalize this as duplicate content. A much greater percentage of the page would have to be the same.

walkman

4:33 am on Nov 10, 2005 (gmt 0)



>> 70000 pages

how different are those pages between each other, and between other sites online?

ak47mars

5:34 am on Nov 10, 2005 (gmt 0)

10+ Year Member



For example: laptop batteries come in many brands. On each product page the layout is similar, but the product description, compatible models, and part numbers are different.
The key point is that the website was corrected three months ago.
Why isn't Google indexing my new pages?

g1smd

11:08 am on Nov 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I didn't understand your comment about title and description: if they are the same for multiple pages, then Google will not list them all.

Can you confirm what you are actually doing?

Marcia

11:31 am on Nov 10, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



ak47mars, is it a datafeed site?

Nuttakorn

12:18 pm on Nov 10, 2005 (gmt 0)

10+ Year Member



Supplemental results are mostly found on dynamic websites. You can reduce this problem by using a Google Sitemap to feed your dynamic pages to Google.
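For reference, a Google Sitemap is just an XML file listing your URLs, which a dynamic site can generate straight from its database. A minimal sketch (the domain, path, date, and frequency are placeholders, and the namespace shown is the sitemaps.org protocol one):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/products/laptop-battery-123.html</loc>
    <lastmod>2005-11-10</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <!-- one <url> entry per dynamic page, generated from the database -->
</urlset>
```

You then submit the sitemap URL through Google's Sitemaps interface so the crawler discovers pages it might never reach by following links alone.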

ak47mars

12:18 am on Nov 11, 2005 (gmt 0)

10+ Year Member



Marcia, yes, it's dynamic; the pages are created automatically from a database.

ak47mars

6:06 am on Nov 12, 2005 (gmt 0)

10+ Year Member



I'm in a real hurry now; please help me. Thank you.

cleanup

11:24 am on Nov 12, 2005 (gmt 0)

10+ Year Member



Can someone help me out here?

I am not very clear on whether I have a duplicate-content penalty or a canonical problem; can it be seen from the following results?

site:www.mysite.com gives ->

www.mysite.com/ - 20k - Cached - Similar pages
www.mysite.com/index.html - 17k - Supplemental Result - Cached - Similar pages

For site:mysite.com I get back ->

www.mysite.com/ - 20k - Cached - Similar pages
plus all the pages in the site listed without www

Does this show duplicate or canonical symptoms?

Marcia

11:32 am on Nov 12, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> Marcia, yes, it's dynamic; the pages are created automatically from a database. <<

How different are the pages from other sites' pages that are using the same datafeed?

g1smd

9:09 pm on Nov 12, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



cleanup: you have the "Canonical HomePage Problem".

I answered your question in whatever other thread you asked a duplicate question.

cleanup

12:23 am on Nov 13, 2005 (gmt 0)

10+ Year Member



G1smd,
Thanks for your verdict, here and at the "other" forum; I really appreciate someone more experienced taking a look.

I see a lot of people confused about these supplemental results and their interpretation. I was pretty sure that duplicates were not the issue, as the site was written by me and only scraped by others about the same amount as any other (previously) successful site.

I have had the 301s in place since the beginning of the week. So I guess there is not much more I can do except monitor.

cleanup

1:09 pm on Nov 13, 2005 (gmt 0)

10+ Year Member



Just when I thought I had all my non-www 301s sorted and my href references in place, I read that we should redirect all references from /index.html to "/".

I suppose there must be lots of references, both in backlinks and within my own site,
both to [mysite.com...] and [mysite.com...]

Geeezz, I wish Google would sort this out; it seems crazy that we all have to become .htaccess geeks just to keep our sites from being penalised!

So, is this extra redirect from /index.html to "/" really necessary?

g1smd

5:51 pm on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Not so much a redirect: simply make sure that all of your internal links (and as many external incoming links as possible) point to the domain, or to the folder name followed by a trailing / on the URL. Omit the index filename itself.

Link to www.domain.com/ or to /folder/ (with a <base> tag to set the default domain) or to www.domain.com/folder/ each time.
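For anyone who wants the belt-and-braces server-side redirect as well as clean internal links, a minimal .htaccess sketch (assuming Apache with mod_rewrite; "www.mysite.com" and "index.html" are placeholders for your canonical host and index filename):

```apache
# Send any direct request for ".../index.html" back to the bare
# directory URL, so only one canonical URL per directory gets indexed.
# THE_REQUEST is matched (not the rewritten URI) to avoid redirect loops
# when Apache internally serves index.html for "/".
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^?\ ]*/)?index\.html[?\ ] [NC]
RewriteRule ^ http://www.mysite.com/%1 [R=301,L]
```

With this in place, both /index.html and /folder/index.html 301 to / and /folder/ respectively, while "/" itself is still served by the index file internally.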

blend27

7:17 pm on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I am currently facing a similar issue, where someone pointed a link at my site with four Ws in the URI:
*ttp://WWWW.mydomain.tld/
In reality this would be a subdomain; it's not, and I never set it up that way, but the URI resolves 100%. So when Googlebot came to visit, it picked up almost the entire site under URIs that did not use *ttp://WWW.mydomain.tld/, causing 100% duplicate content. The pages with WWWW are in the supplemental index now, and the normal ones dropped 50 to 60 positions in the rankings, from page 1 to nowhere.
The question is: how do I get the improper URIs crawled again? And is all my work of the past year down the drain?
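The usual fix for stray hostnames like this is a catch-all canonical-host rule rather than one rule per bad variant. A minimal .htaccess sketch, assuming Apache with mod_rewrite and a wildcard DNS/vhost that resolves any subdomain ("www.mydomain.tld" is a placeholder):

```apache
# 301-redirect ANY host that is not exactly www.mydomain.tld
# (including the accidental WWWW variant) to the canonical host.
RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.mydomain\.tld$ [NC]
RewriteRule ^(.*)$ http://www.mydomain.tld/$1 [R=301,L]
```

When Googlebot next follows the WWWW link, it receives a 301 to the canonical URL, and the duplicate supplemental entries should eventually drop out.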