Supplemental and Omitted Results
Why do they even exist, and why are they updated so slowly?
OK, we all know Google must have some HUGE problems these days. We have NEVER seen so many posts about weird Google behaviour as in the last 1-2 years, and it all looks to be related to index problems.
Omitted results: why? Why not just list everything that is in the DB, instead of making the user click the "omitted results" link after 4 pages and start the search again?
Supplemental results: what do we need those for? If Google thinks a page is a duplicate, don't index it. But the problem is that those "supplemental results" often just share the same meta description with other pages, and that alone makes them supplemental, EVEN if the body of the page is 100% different. Also, IF a duplicate page is included by mistake and the webmaster then changes it (even to a 404), the supplemental result stays in the index even though the page no longer exists. That is simply not an up-to-date search engine.
Look at MSN and Wisenut: they don't have these weird problems, and their index is up to date every day.
I get the odd result that I cannot account for (yet!), but 99% of the Supplemental issue is completely logical.
There are just a handful of reasons why a URL is tagged as Supplemental. There is a fixed timescale that they get fixed in. There is a fixed procedure for investigating why a URL is tagged Supplemental, and an easy way to fix those that are there because of Duplicate Content issues.
TOTALLY NUTS, because a normal person can see the unique content. We are told NOT to make pages for search engines, but lately it seems WE HAVE TO.
If that wasn't the case, spending time on SEO wouldn't make any difference at all.
Before my website went online, I spent my time roughly:
90% on making useful content for people.
10% on SEO for the search engines.
Since then I have changed my focus and now spend approximately:
60-70% on SEO for the search engines (especially Google).
30-40% on making useful content for people.
Google does encourage webmasters not to make pages and content for the search engines. Yes, but it seems to me that a webmaster who spends his time and energy building useful content for people pays the penalty in the SERPs.
Who will appreciate a website built with people in mind if no one can find it in the SERPs?
In the last week or so I have also seen, on a site:domain.com search, that the description under each result runs to about 7-8 lines of text instead of the usual 2 lines.
I have also noticed another site of mine that had a URL rewrite done: I changed the titles and meta descriptions to unique ones, the site was spidered a week later, and within that week it came out of the supplemental results, with only the rewritten, search-engine-friendly URLs in the index. But a few days ago I saw the whole site go supplemental again, and I now see only old cache entries with the old URLs. Once again this is getting on my nerves.
Why do this site's supplemental issues get fixed in a few days while other sites take years? And why switch the SERPs back; what is the point? I am getting tired of typing about Google bugs and hiccups; I would rather be writing content, as I was two years ago.
I really hope MSN takes over next year; they don't have ANY of these weird problems.
For URLs that now redirect, or are 404, Google marks them as Supplemental and shows them in the index for one year.
They will still bring you traffic via your redirect. Once they are redirecting, they are no longer classed as duplicates.
The problem IS fixed. You now only need to investigate URLS that return "200 OK" and which are marked as Supplemental.
Supplemental Results are:
- URLs that used to show content but are now redirecting. These are dropped after one year.
- URLs that used to show content but are now 404. These are dropped after one year.
- URLs that are Duplicate Content and still return that content as "200 OK". These are the only ones that need any more fixing.
- URLs that are relegated there due to very low PageRank for the site as a whole. They are not "good enough" for the main index. This accounts for perhaps 1% of the Supplemental Results that I see.
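The four buckets above can be turned into a small triage rule. This is a hypothetical sketch only: the `status` code and `is_duplicate` flag are stand-ins for what you would observe in your own server logs and crawl data, not anything Google exposes.

```python
# Hypothetical triage of a Supplemental URL into the four buckets
# described above. Inputs are illustrative, not a real Google API.

def triage_supplemental(status, is_duplicate):
    """Return (bucket, needs_fixing) for a URL marked Supplemental."""
    if status in (301, 302):
        return ("redirecting", False)   # dropped after about a year; harmless
    if status == 404:
        return ("gone", False)          # dropped after about a year; harmless
    if status == 200 and is_duplicate:
        return ("duplicate", True)      # the only bucket that needs fixing
    return ("low-pagerank", False)      # rare; a site-wide PageRank issue

print(triage_supplemental(200, True))   # ('duplicate', True)
print(triage_supplemental(301, False))  # ('redirecting', False)
```

The point of the sketch is the asymmetry: only "200 OK" duplicates demand action from the webmaster; everything else simply ages out.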
There is also some interaction of Supplemental Results caused by pages that simply have duplicate title tags and duplicate meta descriptions for pages that are actually very different in their on-page visible content. Usually the results are just "omitted" from the initial SERPs, needing another click to reveal them. However, some do appear to slide into Supplemental. The "omitted results" message is, however, yet another warning that something needs fixing [threadwatch.org].
g1smd - the weird thing is that just now I see the rewritten results are not supplemental, so it is switching between servers; I would not call that a supplemental DB update (which is supposed to happen every 3 months).
What about pages that are in the supplemental index that are still active pages in the site? Let's say they are supplemental due to duplicate meta tags or a 301 redirect from www to non-www. If those pages are doomed to spending a year in supplemental hell, will they then get restored to the regular index after a year? This is assuming all the duplicate issues that got them there in the first place are taken care of.
I thought it was interesting in Matt Cutts' post (in the "fixed" link above) that the duplicate tags could be either deleted or fixed.
Supplemental Results for URLs that return "200 OK" are the only ones that need fixing. That is usually a sign of duplicate content of some sort.
Duplicate Content is simply that where two different URLs (might only be different by one character, or be on different sub-domains, or different domains, or have URL parameters in a different order, etc) return content that is deemed to be the same.
The fixes for "exact duplicates" are to redirect non-www to www, redirect multiple domains to one domain, fix multiple parameter orders for the same content (by using redirects, or by meta robots noindex on the alternative URLs), fix URL capitalisation problems (IIS only), and fix http and https issues. Additionally, fix all of the "index" vs. "/" issues too.
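The "exact duplicate" problem boils down to many URL spellings mapping to one piece of content. In practice the fix is done with server-side 301 redirects, but the normalisation itself can be sketched as a function. This is an illustrative sketch, assuming a site that wants the www form; the helper name and the chosen rules are mine, not from the post.

```python
# Sketch of the canonicalisation the post describes: force www, lowercase
# the host, collapse "index" pages to "/", and sort query parameters so
# that different orderings map to one URL.

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_url(url):
    scheme, host, path, query, _fragment = urlsplit(url)
    host = host.lower()                       # capitalisation issues
    if not host.startswith("www."):
        host = "www." + host                  # non-www vs www
    if path.lower() in ("/index.html", "/index.htm", "/index.php"):
        path = "/"                            # "index" vs "/" issue
    query = urlencode(sorted(parse_qsl(query)))  # parameter-order issue
    return urlunsplit((scheme, host, path, query, ""))

print(canonical_url("http://Example.com/index.html"))
# http://www.example.com/
print(canonical_url("http://www.example.com/p?b=2&a=1"))
# http://www.example.com/p?a=1&b=2
```

Every alternative spelling should 301 to the canonical form this function produces, so the engine only ever sees one URL per piece of content.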
The fixes for "pseudo-duplicates" are to make sure that every remaining active "200 OK" page has a unique title tag and a unique meta description.
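Auditing for those "pseudo-duplicates" is mechanical: group pages by (title, meta description) and flag any group with more than one URL. A minimal sketch, with invented page data; in practice you would feed it the titles and descriptions from your own crawl.

```python
# Minimal audit for "pseudo-duplicates": pages that share a title tag
# and meta description even though their body content differs.

from collections import defaultdict

pages = {
    "/widgets/green.html": ("Widgets", "Buy widgets online"),
    "/widgets/blue.html":  ("Widgets", "Buy widgets online"),  # shares both
    "/about.html":         ("About Us", "Who we are"),
}

def find_pseudo_duplicates(pages):
    by_meta = defaultdict(list)
    for url, (title, description) in pages.items():
        by_meta[(title, description)].append(url)
    # any group with more than one URL needs unique titles/descriptions
    return [urls for urls in by_meta.values() if len(urls) > 1]

print(find_pseudo_duplicates(pages))
# [['/widgets/green.html', '/widgets/blue.html']]
```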
Once those are done, the "wanted" "200 OK" URLs should return to the normal index within weeks (when you search for current content), but the same URL will still show up as Supplemental when you make searches for content that used to be on the previous version of the page but is no longer on the new version page. These I call "historical" supplemental results. They are harmless.
The redirected URLs (the ones that return "301" that is) will continue to show up as Supplemental Results for one year after the redirect is put in place. Those URLs are NOT treated as being Duplicate Content. In fact, after the fixes are put in place, you will see that some redirected URLs that were dropped out of the index early on, reappear a few months later as Supplemental Results, with old cache dates, and then stay visible for a year. Those, too, are harmless. They are not treated as Duplicate Content. You cannot control them.
Again, only Supplemental Results for URLs that return "200 OK" are those that you should concern yourself about in any way. Even then, "historical" Supplementals can be ignored, as above.
Are you sure they're harmless? I'm not convinced that deleted content isn't used to rate the current page. For example, suppose you went over a keyword threshold and cleaned it up. After the cleanup, the penalised page still affects the new version's rankings, even after it no longer appears in the supplemental results. Google still keeps a history of that page to rank the current version.
Yes, I am sure they are harmless once the URL returns 301 or 404 to the bots.
For a green widget search I saw:
- www.domain.com/green.widget.html, returning "200 OK", at #2 as a normal result, and
- www.domain.co.uk/green.widget.html returning "301", at #8 as a Supplemental Result, with the cache dated 6 months ago, and the snippet showing old content from before the redirect was applied.
>>>There is also some interaction of Supplemental Results caused by pages that simply have duplicate title tags and duplicate meta descriptions for pages that are actually very different in their on-page visible content. Usually the results are just "omitted" from the initial SERPs, needing another click to reveal them.<<<
I have this exact problem; it's the "pseudo-duplication" issue. I rewrote the titles and meta descriptions for a bunch of the pages a month ago, but there has been no change in the Google search results.
Depending on the type of site: query, Google returns different total page counts, and worst of all, it never lets you see all of the pages that it claims are in the search results.
When I put up new content with the correct tags/titles, it gets ignored completely. The new pages have naturally-generated incoming links, but the pages still never appear in the search results.
Will a Google sitemap get the rewritten pages re-crawled and back in the index correctly, along with the new pages?
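The thread doesn't settle whether a sitemap forces re-crawling, but for reference, a minimal Google Sitemap file at the time used the 0.84 schema and looks like this (the URL and date are invented for the example):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/rewritten-page.html</loc>
    <lastmod>2006-10-01</lastmod>
  </url>
</urlset>
```

Listing only the canonical (rewritten) URLs here at least tells Googlebot which spellings you consider current.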
>> The other problem is that once a page goes supplemental it's nigh on impossible for it to be in the main index again and rank for even secondary or minor keywords. <<
I have now seen enough examples to know that what you state is totally untrue. That includes a site with 180 pages, and one with 50,000 pages. Both were a mess 18 months ago. Today, and for at least the last several months, both have had "perfect" listings.
g1smd - have you noticed an increase in new pages indexed in Google once those fixes are made?
After these issues are fixed entirely, using the tools you have given (check each listing and make sure any supplementals listed return 404 or 301, or fix the title and description content for each), how long can one expect before recovery begins?
How far does one go in terms of fixing issues, such as:
These are not listed in searches for site:mysite.com or site:www.mysite.com, with or without -inurl:www
Do we need to be concerned or is that fine?
My yardstick site [webmasterworld.com] now shows "1 to 7 of about 184" and you can then "click to see omitted results" and you'll only then get to see just "1 to 24 of about 184" instead.
The site: command is totally broken. Nothing has been changed on the site for at least several weeks.
I'm not closely looking at the speed of indexing of new pages at all... so I can't really say too much about that.
However, some of the sites that I have looked at are forums that have fixed these issues, and I do often see their content in the SERPs within a few days or so. I haven't compared speed of indexing against non-fixed forums though.
I have noticed that some WebmasterWorld threads take weeks to be indexed. A few days ago I used Google to search for some exact words that I wrote a week or more ago, and Google did not return a WebmasterWorld result for the query. That surprised me.
OK, you've used all the techniques for getting rid of and changing supplementals; now what is the secret to getting Googlebot to revisit them so that they can be reindexed?
This is the first time I have posted here; I usually just read your advice and news of changes at Google. Honestly, I have never reached this level of concern before. We run one of the biggest tech sites and forums in Latin America. Lately we dropped PageRank (just a little bit) and suddenly lots of our pages (forum pages) went supplemental. Because of that, yesterday I changed the rewrite rules for the forum URLs; today all of our site's indexed pages disappeared, which really worried me, because we are not an MFA site.
When I check the live PageRank site it seems to be updating, but for the last two days this has stopped. What do you think, guys? Is this just a momentary problem, or has Google really messed things up this time?
We actually make a living from the website, and we are not going to lose all of our users over this; indexation is really important to us. I am really worried about it.
>> how long can one expect to begin to recover? <<
After fitting the redirects, the wanted pages, say the www pages on the .com rather than the non-www or the .co.uk (or whatever), will be more numerous within a few weeks.
Those www .com entries that were URL-only in the SERPs will gain a title and description, and those that were Supplemental will partly return to the normal index: many will show in both indexes for a while depending on the query involved: "historical supplemental".
The other URLs, like the non-www and the .co.uk, will decline in number. Those that are normal results will mainly drop out of view, but those that are already Supplemental will hang around for a year. A few months after the fixes are put in place, some of the normal URLs that were dropped will then return as Supplemental Results and they will remain listed for a year or so too. That is NOT a problem.
>> suddenly lots of our pages went supplemental <<
This is now widely reported. Google is truly broken at the moment.
I don't know what the hell they're doing but only the first page of site:mysite.com search is not supplemental.
This has been going on for a couple of days:
Thanks g1smd for helping many webmasters clarify the issue regarding the supps.
I see my web page in the supplemental results.
It is 100% unique compared to all other pages.
So I guess the reason it is supplemental is that it is a duplicate (from Google's point of view) of some page on another site. How can I find out which page mine duplicates?
I still don't see what good such a system (supplemental/omitted results) does.
There is also the situation where one site whose meta and title were changed clears out of the SERPs 3-4 days later, while another site with the same problem stays in supplemental hell.
A few colleagues of ours are in supplemental hell too; those small sites are also supplemental.
Hmm, now, 5 minutes after my post, those sites don't have a supplemental results problem, but I bet in 30 minutes they will be back in supp.
The "omitted results" section hides extra results that return the same title and/or meta description over and over again: [webmasterworld.com...]
The "Supplemental Results" are URLs that are now redirects, are 404, the domain has expired, or the URLs are still active duplicate content, or the pages are deemed not important due to some Pagerank considerations.
See the second of three blocks of text here: [webmasterworld.com...] begins "Supplemental Results are...":
I'm not sure what is going on and if it is across all datacenters but some domains are nearly all supplemental. Have a look at Yahoo & Amazon.
Yes. See the posts linked to above. This has been going on for several days or more.
Something is very broken. Again. It is on most datacentres now.