You may find that product pages are relegated to the supplemental index; they will still come up in searches, but only when they are very relevant (above other websites whose pages aren't in supplementals).
Look at those pages in the supp index and see how you can add to them and make them unique, such as by adding unique copy. Also look at how your site links internally; you may find more internal links point to pages that aren't in the supp index than to the pages that are.
good luck :)
They are blog entries, so each has its own individual page as well as being part of the archive and the front page until they get pushed out by other entries.
I know there was an email address for those of us, me included, who lost pages a while back to send a URL to, but I'm wondering if there isn't something similar for supplementals.
I'm also going to check whether these have PR yet, as a matter of interest.
What's more interesting is that I have pages that are 3-4 clicks from home that are not supplemental, while pages 2 clicks from home are.
For a page that goes 404, or a domain that expires, Google keeps a copy of the very last version of the page they saw as a Supplemental Result and shows it in the index when the number of other pages returned is low. The cached copy will be quite old.
For a normal site, the current version of the page should be in the normal index, and the previous version of the page is held in the Supplemental index.
If you use search terms that match the current content, then you see that current content in the title and snippet, in the cache, and on the live page.
If you search for terms that were only on the old version of the page, then you see those old search terms in the title and snippet, even though they are not in the cache, nor found on the live page. That result will be marked as Supplemental.
There are also supplemental results where the result is for duplicate content of whatever Google considers to be the "main" site. These results seemingly hang around forever, with an old cache that often no longer reflects what is really on the page right now. Usually there is no "normal" result for that duplicate URL - just the old Supplemental, based on the old data. On the other hand, the "main" URL will usually have both a normal result and a Supplemental result (but not always).
Right now I see some interesting bugs in the Supplemental logic.
site:domain.com inurl:www brings 98,000 www pages, all with a recent cache.
site:domain.com -inurl:www brings 24,000 www pages (even though the search says to exclude all www pages), all of them marked as Supplemental and all showing a cache date of almost a year ago.
That should not be happening.
Add to that the pages with meta robots noindex tags on them that have been indexed and cached anyway, and are showing as Supplemental Results with a cache from June or July 2005, and Google has a bit of a problem on their hands right now.
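For reference, the tag being ignored is the standard robots meta element in the page head - this is a generic illustration, not the actual markup from those pages:

<head>
  <!-- tells compliant crawlers not to index this page -->
  <meta name="robots" content="noindex">
</head>

A page carrying that tag shouldn't appear in the index at all, cached or otherwise, which is what makes those Supplemental Results a genuine bug.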
Oh, and searches with a hyphen in them are not fixed either. Search for an email address with a hyphen in it and see what results you get. Search again, replacing the hyphen with a space, and see that thousands of supplemental pages appear from nowhere - all for pages that have (or had) the email address printed on them at some time.
It makes me wonder if they're doing two different updates. Plus, my site (PR7, 3 years old) got whacked June 21 (not June 27)?
site:domain.com -inurl:www shows thousands of www pages that are Supplemental.
Huh? The -inurl: parameter breaks the search.
Oh, and site:www.domain.com -inurl:www shows Supplemental www pages too! Errrr.....
This is kind of becoming a joke, really. If it's not a 301 redirect, it's a bad link. If it's not that, then it's that your title tag is too long; if it's not that, then it's because your site is too young; if it's not that, it's because you use a redirect from an old page to a new one.
Whatever, guys. Sort yourselves out, Google. KISS - Keep It Simple, Stupid.
My thoughts exactly, but... why should they? What is their reward for sorting out a free search to work properly?
They are nothing but a brand name built on the back of what used to be good technology. It is now broken and has been for some two years. They are trying to do too much with the limited resources they have and software that does not work properly.
The Berkeley guys could not give a hoot because they have made their billions.
Sucks but there we have it.
I had always assumed that supplementals were the residue of old pages that a) were no longer listed within my site and b) were not listed on any other site online. This led me to think that maybe I need to reawaken some links and see if I can get Google to re-visit just one more time.
I built a hefty robots.txt file containing a Disallow for every supplemental page that Google lists for my site, instructing googlebot not to access them (sketch below). Then I added links to these non-existent pages to my sitemap file, and also one to my robots file as well.
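A minimal sketch of what that robots.txt looks like, with made-up paths standing in for the real supplemental URLs:

User-agent: Googlebot
# one Disallow per supplemental URL (hypothetical paths)
Disallow: /old-product-page.html
Disallow: /2004-archive/

Whether blocking googlebot actually flushes pages from the supplemental index is exactly what this experiment tests, so treat it as a tactic under trial rather than a documented fix.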
I then waited, and about two weeks later I noticed I have around 50-60% fewer supplementals in the index. I don't know if this is just a fluke or a direct result of the new tactic, but something's definitely happening.
All the best
I have achieved this on one site by adding unique text to a number of pages. The site is a very small directory for a specific field. I believe these pages originally contained only a small amount of unique content, thereby tripping a duplicate content filter.
The updated pages returned to normal listings after a few weeks.
The remaining unaltered pages are still supplemental and will be addressed at a later date.
Yes, they can be recovered, from what I am seeing. We had 10 pages in the supplemental index last week and now we only have 4, with two removed since yesterday. That said, I think anything can trigger a page to go supplemental, so we'll probably see another change soon.
Here's a little personal note: don't change too much on your site during these bad Google days. They have a lot of trouble with everything, and MSN on Vista is due early next year.
1. Made the pages W3C compliant.
2. The pages that were supplemental had ProductA-Mycompany, ProductB-Mycompany in many of the title tags. I removed the -Mycompany. (I think Google was looking at that as being duplicate - see the sketch after this list.)
Now my pages are in the regular index.
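For illustration, the title change in step 2 amounts to something like this (hypothetical markup based on the description, not the poster's actual pages):

<!-- before: the repeated -Mycompany suffix made every title look near-duplicate -->
<title>ProductA-Mycompany</title>

<!-- after: suffix removed, each title stands on its own -->
<title>ProductA</title>

Whether the shared suffix really tripped a duplicate filter is the poster's guess, but shorter, more distinct titles are a safe change either way.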