I understand it might be broken but there is still something going on. I guess I can chalk it up to just one of those days but traffic this bad has only happened once or twice in the past 6 years.
It makes me a little nervous. Complete sections of our site got ZERO hits today and that has never happened. Saying the site: command is "broken" is fine as long as those results do appear in the search engines. I am having a hard time finding those pages on the data centers without using the site: command. We all thought things were broken before, but it turned out to be a new algo.
Same problem - large portions of the site going supplemental. Things seemed to be going well - and then, bang: the site: command returning tons of supplementals. I did find a few issues to fix. Now, just the waiting game.
The site: operator is not currently returning the kind of results we are used to -- not for any domain that I try, including WebmasterWorld. Do not panic, and do not attempt to adjust your horizontal and vertical.
If you are seeing problems with your Google traffic right now, then the site: operator is not the useful diagnostic tool that it once was. It may come back from its current AWOL condition, or it may not. But I think it is wise in moments like this to stay grounded in the most essential metrics of website health you can find: conversions, total traffic, search traffic, and then rankings. Rankings have got to be down the chain because today, they are often different for different people, and even different one second apart for the same person, same computer, same browser.
If you are not regularly looking at server logs, it's time to start putting those analytics into your regular routine.
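If you want a quick way to pull search-engine activity out of raw logs, something like this works (a minimal sketch in Python, assuming an Apache "combined" format access log; the sample lines and the Googlebot user-agent check are just illustrations - adjust for your own server):

```python
import re
from collections import Counter

# Sketch: count Googlebot hits per URL from Apache combined-format
# log lines. The user-agent substring check is a simplification.
LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(lines):
    """Return a Counter mapping URL -> number of Googlebot requests."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("url")] += 1
    return hits

# Two made-up log lines: one Googlebot visit, one ordinary browser.
sample = [
    '66.249.66.1 - - [05/Oct/2006:12:00:00 +0000] "GET /page-a.html HTTP/1.1" '
    '200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.1 - - [05/Oct/2006:12:00:01 +0000] "GET /page-a.html HTTP/1.1" '
    '200 512 "-" "Mozilla/4.0"',
]
print(googlebot_hits(sample))
```

Run that against yesterday's log and today's and you'll see in minutes whether the crawler actually stopped visiting, regardless of what site: claims.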
I agree with uptil7000 that "it might be broken but there is still something going on". We just don't know what that something is at the moment.
Even my "yardstick site [webmasterworld.com]" now shows "1 to 7 of about 184" and you can then "click to see omitted results" and you'll only then get to see just "1 to 24 of about 184" instead. On those affected datacentres, I can't find a site that does return a proper and correct result anywhere at all now.
The preferred domain setting just kicked in on one of our sites; all our pages that used to be in the regular index just went supplemental.....
>>site: operator is not the useful diagnostic tool that it once was
They foobarred it; they can't intentionally break this tool and blame it on a WW member's recommendation like they did w/ the link: operator! Adsense for Search is useless now except for the ads.
I just checked out my site
333 results returned, only the first 2 aren't supplemental, but after clicking on "show omitted results" I get up to 290 pages, then it dead-ends.
There's been a great disturbance in the Force....
lol i crack myself up.
[edited by: youfoundjake at 1:31 am (utc) on Oct. 6, 2006]
Something is going on that is very strange...
I was writing some code for google... UPDATE webpages SET suppresult = '1' WHERE content IN (SELECT content FROM webpages GROUP BY content HAVING COUNT(*) > 1);
I awoke in a cold sweat with the thought... Oh my gosh, how could I undo what I just did?
|More than six of Google's existing 44 Class-C IP blocks are showing truly whacky results. The site: command is truly broken on those. |
I agree that the results are very whacky, and that the site: command is showing varied results, but the "broken" results reflect the decrease in traffic. As soon as the majority of my pages went supplemental (and my most popular page went missing altogether) using the site: command across multiple data centers, my traffic dried up immediately. They are definitely connected, not just broken.
Whacky site: results = whacky traffic patterns.
I had 1 page that performed well above the rest of my pages, was indexed well, getting lots of traffic, and many good inbound links. After the Sept 30th refresh, the page is GONE, MIA, NOWHERE TO BE FOUND on any of the google data centers, and traffic from google has disappeared 100%.
All summer I have seen these refreshes treat my site like a roller coaster. 1 goes up, then next down, then back up, then back down.
You would at least expect SOME kind of consistency from Google after all this time in the search business, but the only thing that has been constant is their inconsistency.
internetheaven should resolve all duplicate content issues and make sure to take care of the so called canonical domain 301 www/non-www and wait at least 2 weeks before thinking again about using the removal tool.
I can confirm that using the site: command is now likely to cause more issues and will lead to lots more unfounded and misleading speculation.
A quick example for you. I rank on the first page of google for lots of fairly competitive two word phrases and have done for some years. The site command shows that I have 3 non-supplemental pages for my site, and if I repeat the search with the omitted results included then the number of visible pages falls well short of the number of actual pages. The page that ranks is not visible using a site command and must therefore have been removed - wrong! Run the search on google and it is visible on the first page. Next phrase - the site command shows the page is supplemental, and a search for the phrase shows me on the first page with the page being non-supplemental.
In other cases the page that ranks is shown as supplemental using the site: command BUT shows as supplemental when running a search to retrieve the page, and in other cases as non-supplemental. Visible and not visible, etc., and every combo that you can test. Completely screwed up, basically, and useless.
The consequence is that my traffic is not affected one iota even though I have supposedly gone completely supplemental and 95% of my website has been removed! The fact is that it has not gone supplemental or been removed at all; the site command just misleads you to this conclusion.
My conclusion, you can work out for yourself how I arrived at it, is that there is no such thing as a supplemental index, sandbox, etc. What you are seeing is Google applying a series of "visibility" filters to a SINGLE database and integration with the site: view just got screwed.
As for the Google crew, no doubt they will continue to perpetuate the various myths that there are. As for the rest of us, if you want a fairly easy way to check whether your pages are in the index and visible via search, then run a search something like - "phrase one" "phrase two" - where you know that the two phrases are on your webpage, and you will easily prove to yourself that the site: command is so useless that it should be disposed of as soon as possible. DO NOT USE IT, OR IF YOU DO, DO NOT DRAW ANY CONCLUSIONS.
Might not relate to what looks like other issues, but on the general supplemental issue...
|I have a theory that any page on a site that doesn't have a significant backlink(s) is going supplemental |
Phil C at W3bW0rksh0p was among others who spotted this at the end of August and quoted Matt Cutts as saying (edited)
|having supplemental results these days is not such a bad thing. In your case, I think it just reflects a lack of PageRank/links. We’ve got your home page in the main index, but if you look at your site ... you’ll see not a ton of links ... So I think your site is fine ... it's just a matter of we have to select a smaller number of documents for the web index. If more people were linking to your site, for example, I’d expect more of your pages to be in the main web index. |
It is a good thread to save to point out how looking at "my site" can lead you to draw bad conclusions and consider all sorts of wrong things.
This is a global Google problem...
I wouldn't try to touch anything on any site that was working just fine before this bug...
Many datacentres no longer seem to have this bug in the site search. Last night more than 80% of all DCs had the bug. Today, I tried 10 (out of 44) and none of them had it.
I'm not seeing the bug anymore either; looks like they fixed it.
How do you check if your site is supplemental or no?
Would you please let me know how do you check and what do you look for?
if site: is producing supplementals, you either have insufficient PR (I bet the pages don't even have a decent link to them - as in, it's not the case that a majority of the pages have at least one decent link to their URLs), or you have duplicate content on pages within your site. Might be a tiny footprint, but it will have an effect.
- ouch, time to build something worthwhile I say - you filthy spammer. LMAO.
(also g1smd do you ever work?)
"Might be a tiny footprint"
What do you mean by this?
Has anyone else heard anything about search operators being removed from public use and made exclusively available via Webmaster Tools? I have heard whispers about this for a while now, with some suggesting that operators will only be available via (paid?) subscription. Personally, I would be only too happy to pay for this service. Comments? Intel?
>> How do you check if your site is supplemental or no? <<
It is not the site that is Supplemental. Supplemental shows up on a URL-by-URL basis, and those that are, have the word "Supplemental Result" printed in green text right there in the SERPs. There are several recent threads about "duplicate content" and "supplemental results" that are well worth reading first.
>> g1smd do you ever work? <<
Yeah, but only about 80 hours per week.
|Has anyone else heard anything about search operators being removed from public use |
Not on G, but Y has recently redirected site:xyx.tld type searches to Site Explorer (and asks you to log in to get more features).
Maybe G is following suit...
|I agree with uptil7000 that "it might be broken but there is still something going on" |
So do I, very strongly in fact. There is some history of seeing what first appear to be aberrations and oddities prior to G changing the fundamental way it does things, or the way it displays some kinds of information.
Looks to me that there could be a fundamental shift afoot in the way supplemental designations are applied. Just a guess, but it seems that this might especially apply to pages in supplemental hell because of duplicate content issues. Very interesting. Makes me realize that this could make confirmation of dup issues in some cases a whole lot harder, and would place a higher value on experience in dealing with such matters. SEO consultants: Grab your crystal balls. They're going to come in more handy than ever soon, or so it seems from where I sit.
What happens if a page is supplemental but it starts receiving incoming links from trusted sites? Does it leave supplemental?
First, a "page" does not go supplemental -- its the "url plus cache date" that results in a Supplemental tag. Whether a direct link to a given url will remove the Supplemental tag or not depends very much on why the assessment was made in the first place. So the answer is "it might work, and it might not." A more detailed discussion is available here:
Supplemental Results - what exactly are they? [webmasterworld.com]
If it is a URL that delivers a page of content with HTTP "200 OK" status, and it is an "exact duplicate" of some other URL that delivers that content, then whichever one has the most PR is likely to win. That is, the other URL will slip into Supplemental instead. Avoid all that by using the 301 redirect so that only one URL is ever indexed for that content.
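For anyone wondering what that 301 looks like in practice, here's a sketch (Apache mod_rewrite in .htaccess; "example.com" is a placeholder for your own domain, and this assumes mod_rewrite is enabled on your host):

```apache
# Sketch: 301 every non-www request to the www hostname,
# so only one URL per page ever gets indexed.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The [R=301,L] flags are what make it a permanent redirect rather than Apache's default 302.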
If that URL is now returning either a 301 or a 404 status then it will never appear in the normal index again. It will remain tagged as Supplemental for a year, and then drop out of view.
PageRank plays a part in the Supplemental Problem for some sites, but mostly the Supplemental URLs are either simple duplicates, or they are cached copies of URLs that used to directly deliver content, but now no longer do so.
So, my site has been supplemental for a while now. It's not really possible to have duplicate content as it's a price comparison site in South Africa and there are only 2 such sites, having completely different designs and content.
Strangely, 6 random pages are not supplemental. The home page and 5 other search results pages.
This is all so confusing, and the recent update from Google is not very encouraging.
Would it be better for me to remove these supplemental pages and start again? I know Google spiders all my pages quite regularly.
"First, a "page" does not go supplemental"
If an up-to-date version of the page is listed in the index, then suddenly a very out-of-date version (with the supplemental tag) replaces it, I'd call that going supplemental.
What Tedster was trying to explain is that a Page of your site, what you would call a "page" of content, might actually have more than one URL that can be used to access it.
Google does not index pages; it indexes URLs. If two or more URLs return the exact same content, then most, or all, of them get tagged as Supplemental.
Check back what I was saying about "exact duplicates" and "pseudo duplicates" posted in about a dozen different threads in the last couple of weeks, for more information.
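If you want to test the "exact duplicates" case for yourself, it's easy to check mechanically: fetch each URL variant and compare hashes of what comes back. A rough sketch (Python; the URLs and page bodies below are made up, and in real use you'd fill `pages` from your own crawl):

```python
import hashlib
from collections import defaultdict

def duplicate_groups(pages):
    """pages: dict of url -> page body (str).
    Returns lists of URLs whose bodies are byte-for-byte identical."""
    by_hash = defaultdict(list)
    for url, body in pages.items():
        digest = hashlib.sha1(body.encode("utf-8")).hexdigest()
        by_hash[digest].append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]

# Hypothetical example: the www and non-www URLs serve identical HTML,
# which is exactly the situation that gets one of them tagged Supplemental.
pages = {
    "http://www.example.com/page": "<html>same content</html>",
    "http://example.com/page": "<html>same content</html>",
    "http://www.example.com/other": "<html>different</html>",
}
print(duplicate_groups(pages))
```

Note this only catches *exact* duplicates; the "pseudo duplicate" cases (same content with trivial differences) need a fuzzier comparison.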
Somehow, there seems to be more to it. I'm not the most experienced SEO guy around, but checking the progress of my site through google every day has got me bothered!
My pages all have unique meta tags and titles. Content is different because every page returns a different list of products for price comparisons.
If certain niche sites get to stay indexed preferentially, surely price comparisons are in this category as they show product names, short descriptions and prices.
So, after trying and trying, somehow only 6 URLs are non-supplemental, and there is no www/non-www issue, as no-one (including spiders) has ever visited the non-www version. Also, in SiteMaps I specified to only use the www version.
There seems to be something sinister going on, but because of my inexperience I can't get a handle on it.