Google is doing this all over: it is putting pages into supplementals that 'appear' to offer the same content as another page, or that are considered less important (by your own internal linking structure) than other pages on your site.
You may find that product pages are relegated to the supplemental index. They will still come up in searches, but only when they are very relevant (above other websites whose pages aren't in supplementals).
Look at those pages in the supp index and see how you can add to them and make them unique, such as adding unique copy. Also look at how your site links internally - you may find more internal links point to pages that aren't in the supp index than to the pages that are.
good luck :)
If pages go supp, have you had any luck getting them back into the normal index following the guidelines you lay down - looking at the internal linking structure and trying to get more unique content onto the pages?
If you look at the sort of content that appears in the supp index compared to what doesn't, it does make sense.
I'm applying my theories and pages are coming out of supp, so I'm assuming I'm on the right track ;)
These supp pages are still causing a lot of problems for me. To a certain extent I can understand why some of the pages on my site have gone this way, but others have completely unique content and aren't even similar in layout to the rest of the site, yet have still gone supp - odd.
These pages all have unique content.
They are blog entries, so each has its own individual page as well as being part of the archive and the front page until it gets pushed out by other entries.
I know there was an email address for those of us, me included, who lost pages a while back to send a URL to, but I'm wondering if there isn't something similar for supplementals.
I'm also going to check to see if these have PR or not yet as a matter of interest.
The pages in question have no PR.
On top of this, a search for "quoted text" that is clearly on the site and quite unique doesn't return any results.
And the saga continues...
One of my sites went supplemental and I think it was due to on site spam. The guy before me had keyword stuffing and the site was using co-op so there were irrelevant footer links which, according to Matt Cutts, can lead to these types of problems.
A site: search with -inurl:www shows all supplementals for my site, while a plain site: search shows all active URLs. What's really interesting is that the page titles returned from a site: search show titles that haven't been used in over 7 months, as if Google is returning ancient data for each one of those pages...
What's more interesting is that I have pages 3-4 clicks from home that are not supplemental, while pages 2 clicks from home are.
Once a page URL has gone supplemental can it be recovered?
After the Jagger farce, I built a completely new site to replace an old one which was defunct because all of its pages had gone supplemental due to a session ID problem. The new site was fine, and then more and more pages were dropped until only the index page was left. I removed the site from Google with the removal tool and replaced it. The new site was duly crawled and indexed fine. Within weeks most pages have gone supplemental - the problem not mine this time. Instead of showing the correct meta title and description tags, Google has merged together page content to make title tags, including the home page alt tag on every page. I give up.
My personal experience thus far with several sites is that once a page has gone supplemental, googlebot never touches it again, however much you update it (cache on pages of the last site was over a year old and never updated).
Remember that there are several types of Supplemental Results.
For a page that goes 404 or the domain expires, Google keeps a copy of the very last version of the page that they saw, as a Supplemental Result and show it in the index when the number of other pages returned is low. The cached copy will be quite old.
For a normal site, the current version of the page should be in the normal index, and the previous version of the page is held in the Supplemental index.
If you use search terms that match the current content, then you see that current content in the title and snippet, in the cache, and on the live page.
If you search for terms that were only on the old version of the page, then you see those old search terms in the title and snippet, even though they are not in the cache, nor found on the live page. That result will be marked as Supplemental.
There are also supplemental results where the result is for duplicate content of whatever Google considers to be the "main" site. These results seemingly hang around forever, with an old cache, a cache that often no longer reflects what is really on the page right now. Usually there is no "normal" result for that duplicate URL - just the old Supplemental, based on the old data. On the other hand, the "main" URL will usually have both a normal result and a Supplemental result (but not always).
Right now I see some interesting bugs in the Supplemental logic.
site:domain.com inurl:www brings 98000 www pages all with a recent cache.
site:domain.com -inurl:www brings 24000 www pages (even though the search says to exclude all www pages) all of them marked as Supplemental and all showing a cache date of almost a year ago.
That should not be happening.
Add to that the pages with meta robots noindex tags on them that have been indexed and cached, and are showing as Supplemental Results with a cache from 2005 June or July, and Google has a bit of a problem on their hands right now.
Oh, and searches with a hyphen in them are not fixed either. Search for an email address with a hyphen in it. See what results you get. Search again, replacing the hyphen with a space and see that thousands of supplemental pages appear from nowhere - all for pages that have (or had) the email address printed on them at some time.
I'm thinking about redirecting my whole site to subdomains, page by page. Google likes subdomains. Maybe then I can get out of supplemental hell.
It's weird that one of my sites has totally different symptoms from most of you guys here. Most are reporting their entire site has gone supplemental, whereas NONE of my pages are supplemental. All intact and indexed in Google, just not showing in the SERPs (for any keyword at all).
It makes me wonder if they're doing two different updates? Plus my site (PR7, 3 years old) got whacked June 21 (not June 27)?
site:domain.com - shows normal www pages.
site:domain.com -inurl:www - shows thousands of www pages that are Supplemental.
Huh? The -inurl: parameter breaks the search.
Oh, and site:www.domain.com -inurl:www shows Supplemental www pages too! Errrr.....
|The -inurl: parameter breaks the search. |
Hmmm... -com seems to do almost the same thing for a dot com, and -org for a dot org.
This is kind of becoming a joke, really. If it's not a 301 redirect, it's a bad link. If it's not that, then your title tag is too long; if it's not that, then your site is too young; if it's not that, it's because you use a redirect from an old page to a new one.
Whatever, guys - sort yourselves out, Google. KISS - keep it simple, stupid.
My thoughts exactly, but... why should they? What is their reward for sorting out a free search to work properly?
They are nothing but a brand name built on the back of what used to be good technology. It is now broken and has been for some 2 years. They are trying to do too much with the limited resources they have and software that does not work properly.
The Berkeley guys couldn't give a hoot because they have made their billions.
Sucks, but there we have it.
Please remember that this "search engine" is still their core business; the rest is built around it, and most of that depends on people using the Google search engine. Right now the search results are lame; after two tries getting crap, users will move to other search engines to get better results. Since they are now on the stock market they should be careful. That is maybe also the reason why nobody from Google is mentioning anything about this. I don't agree that they shouldn't care because they already made billions - Google has stockholders to take care of now.
Here's what I did to clear my supplemental issue:
I had always assumed that supplementals were the residue of old pages that a) were no longer listed within my site and b) were not listed in any other site online. This led me to thinking that maybe I need to reawaken some links and see if I can get Google to re-visit just one more time.
I built a hefty robots.txt file containing disallows for every supplemental page that Google lists for my site, instructing googlebot not to have access. Then I added links to these non-existent pages into my sitemap file, and one to my robots file as well.
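For anyone wondering what that file looks like, here is a minimal sketch - the paths below are made up for illustration, not from the actual site; you would list your own supplemental URLs instead:

```
# Hypothetical example: one Disallow line per supplemental URL
User-agent: Googlebot
Disallow: /old-product-page.html
Disallow: /archive/expired-offer.html
Disallow: /category/duplicate-listing.html
```

Each Disallow path is matched as a prefix against the URL, so listing exact page paths like this blocks only those pages rather than whole directories.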
I then waited, and about 2 weeks later I noticed that I have around 50-60% fewer supplementals in the index. I don't know if this is just a fluke or a direct result of the new tactic, but something's definitely happening.
All the best
Just sent a nice email to Google, let's see what comes of it...
Today, most of my site's pages are out of supplemental and showing a recent cache date, i.e. 18 July 2006. Last week the home page PR dropped from 5 to 0. I think it's because of the reinclusion request which I sent 2 weeks back.
So can you send a reinclusion request to get a site out of supplementals? (Assuming changes have been made to the site to get rid of the reason for being in supplementals in the first place!)
Once a page URL has gone supplemental can it be recovered?
I have achieved this on one site by adding unique text to a number of pages. The site is a very small directory for a specific field. I believe these pages originally contained only a small amount of unique content thereby tripping a duplicate content filter.
The updated pages returned to normal listings after a few weeks
The remaining unaltered pages are still supplemental and will be addressed at a later date.
Once a page URL has gone supplemental can it be recovered?
Yes, they can be recovered from what I am seeing. We had 10 pages in the supplementals last week and now we only have 4, with two removed since yesterday. That said, I think anything can trigger a page to go supplemental, so we'll probably see another change soon.
If you want to get out of supplemental results, just change your meta description so it's unique on every page. But I still think it sucks, because two pages can have totally different content, yet if the description is the same - smack, they go supp.
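As a quick sketch of what "unique on every page" means in practice - the page names and wording here are invented examples, not from any real site:

```
<!-- Page one: description written for this page's actual content -->
<head>
  <title>Blue Widgets - Acme Store</title>
  <meta name="description" content="Hand-built blue widgets in three sizes, shipped worldwide.">
</head>

<!-- Page two: a different description, not the same boilerplate -->
<head>
  <title>Red Widgets - Acme Store</title>
  <meta name="description" content="Our red widget range, with care instructions and bulk pricing.">
</head>
```

The point is simply that a single site-wide description repeated across every page looks like duplication, while a per-page description gives Google distinct snippet text for each URL.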
Here's a little personal note: don't change too much on your site during these bad Google days. They have a lot of trouble with everything, and MSN on Vista is due early next year.
Yes, supps can revert back to real pages. As of last week, I've completely recovered from supp land, and traffic has quadrupled! In my communication with Matt Cutts I was told that my issue was simply a matter of PR - and it looks as though this was the case. I simply went on a two-week link acquisition campaign, and 2 months later... back in the game.
My pages went from supplemental to the regular index. I did two things.
1. Made the pages w3c compliant
2. The pages that were supplemental had ProductA-Mycompany, ProductB-Mycompany in many of the title tags. I removed the "-Mycompany" part. (I think Google was seeing that as duplicate content.)
Now my pages are in the regular index.
To repeat: you can't do anything about supplementals. They cannot be "recovered". What you can do is have both a normal listing and a supplemental listing for the same URL. That has always been easy to do, especially for sites without dupe content issues. Millions of pages that have normal listings also have supplemental listings lurking in the shadows, which appear if a normal page drops out of the index for some reason.
Does a site suffer from ranking issues at the root (index) page if it has too many duplicate content supps on internal pages?