Forum Moderators: Robert Charlton & goodroi
Despite the lack of an indicator, the supplemental index still exists with all of the resulting problems for the pages stuck in it. My pages in the supplemental index don't rank for anything and have had their images removed from Google image search, so being labeled supplemental, whether it's visible or not, still does matter.
site:example.com/&
if I do a site:example.com
I get exactly 100 FEWER results than if I include the ampersand.
For example's sake... let's say I have 5,000 total pages.
if I do
site:example.com/&
I get 5100
if I do
site:example.com/
I get 5000
It's showing more pages than I actually have, and we do not have duplication issues.
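The arithmetic in the example above can be turned into a quick check: the gap between the two site: query counts gives a rough estimate of how many URLs sit in the supplemental index. A minimal sketch, using the hypothetical figures from this post (Google's reported totals are themselves estimates, so treat the result as a hint, not an exact count):

```python
def supplemental_estimate(count_with_amp: int, count_plain: int) -> int:
    """Rough supplemental-page estimate: the difference between the
    result counts of site:example.com/& and site:example.com."""
    return count_with_amp - count_plain

# Figures from the example in this thread (hypothetical):
print(supplemental_estimate(5100, 5000))  # -> 100
```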
...Supplemental Results are fresher and more comprehensive than ever. We're also working towards showing more Supplemental Results by ensuring that every query is able to search the supplemental index, and expect to roll this out over the course of the summer. The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we've been able to make so far, and thinking ahead to future improvements, we've decided to stop labeling these URLs as "Supplemental Results."
What is a supplemental index?
How do you differentiate between supplemental and non-supplemental results in a Google search?
site:www.yoursite.com
site:www.yoursite.com *** -sljktf
I used both of the queries above, and each returns all the pages of my website. Does this mean my entire site is in supplemental?
The alternative search mentioned later on in that thread, site:example.com/&, still returns the same URLs that it did before -- but now the Supplemental label is gone. It still looks to me like it returns supplemental URLs, at least for now. Maybe not all, and maybe some of those URLs have both supplemental and regular spots with different cache dates or whatever -- but the URLs returned by the search do seem to be the weakest on the domain. And it still seems to work for subdirectories as well.
The site:example.com query works well. It really wouldn't be that hard to write a program that, when run, checks Google's results against a sitemap and returns a list of pages Google didn't include.
Would this be of value to anyone? More importantly perhaps, would Google grant an api for such a program?
Edit: this is assuming the supplemental pages are omitted; if they are included but not labeled... that's a whole new problem. I'll admit, this would stop webmasters from upgrading pages based on supplemental result listings.
[edited by: JS_Harris at 9:10 am (utc) on Aug. 1, 2007]
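The comparison program described above can be sketched in a few lines. This assumes you have already saved the URLs Google returns for a site: query into a plain-text file, one URL per line -- Google offers no official API for dumping those results, so gathering that list is left to you. The function and file names here are illustrative, not from any existing tool:

```python
# Sketch: compare a sitemap.xml against a saved list of indexed URLs
# and report the sitemap pages Google's results did not include.
import xml.etree.ElementTree as ET

# Standard sitemap protocol namespace (sitemaps.org).
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_path: str) -> set[str]:
    """Extract every <loc> URL from a sitemap.xml file."""
    tree = ET.parse(sitemap_path)
    return {loc.text.strip() for loc in tree.iter(SITEMAP_NS + "loc")}

def missing_from_index(sitemap_path: str, indexed_path: str) -> set[str]:
    """Return sitemap URLs absent from the saved Google results file."""
    with open(indexed_path) as f:
        indexed = {line.strip() for line in f if line.strip()}
    return sitemap_urls(sitemap_path) - indexed
```

Whether those missing pages are supplemental or simply unindexed is exactly the ambiguity the edit above points out.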
Google has a reason for doing this, and I would really like to know the reason for this change.
As far as being an SEO goes, I guess I'm not one, as I can't quite find a way of knowing which pages are supplemental and which are not.
Sure wish the SEO Man here would let us all know as I assume he knows...