I have the same problem. I have over 250 pages, but 125 of them have now fallen into Supplemental Results even though the content is unique, and the number of supplemental results is increasing day by day.
The question is whether the supplemental results will affect the ranking of the homepage and other pages, or whether Google will devalue my site.
P.S. And now I think it comes down to this:
- Write a new page and do the SEO for that page at the same time (avoids going Supplemental).
- Write all the pages first and do the SEO afterwards (Supplementals keep increasing).
How are you checking the status of your pages - are you using the site: tool?
I'm really unclear how much you can rely on this essential tool, as it appears to have been broken for 2-3 months or more.
I am using the site: tool. All of my Supplemental pages do have a cache date.
|All of my supplemental pages do have a cache date. |
Out of interest, what dates are showing? I've noticed the same problem over the last couple of weeks or so, but also notice the cache date goes back to early February.
I am suffering from the same problem too; it has left me scratching my head somewhat.
I'm seeing this on more and more sites.
I find that I can double-check whether a page is Supplemental by searching for some long-tail terms relevant to that page, then comparing that against equivalent terms for a similar page that is marked as non-Supplemental.
Supplemental pages will feature far lower in the SERPS; Google will always return a page from the Regular index if it can.
If your site architecture is 'prioritising' the pages you want to be non-Supplemental then you need more links - preferably directly to the pages that are Supplemental.
My cache dates range from 3/7 to 4/8... I guess that is promising?
Never forget that a URL can be shown as a Supplemental Result for some search terms and the same URL can be shown as a normal result for other search terms - usually representing the newer version of the content on that page.
Supplemental index bloat is the new way Google has adopted to reduce the load on their main index.
We'd all better get used to it, and I doubt there is any way we can change it.
|Never forget that a URL can be shown as a Supplemental Result for some search terms and the same URL can be shown as a normal result for other search terms - usually representing the newer version of the content on that page. |
I understand how to monitor this on "some search terms".
BUT... how are we to reliably monitor it sitewide with the current site: tool broken? I'd have thought this was fundamental to the monitoring abilities of webmasters.
Don't use the site: tool... Use the inurl: tool. It's much more accurate. Also, what is the PageRank of all of your sites? MC was quoted as saying that PR and inbound links are directly related to Supplemental Results.
[edited by: MLHmptn at 7:53 am (utc) on April 12, 2007]
|Use the inurl: tool. Much more accurate. |
It certainly is different. On all the sites that I've checked inurl: doesn't show any 'Supplemental' results.
I checked one site in detail so far - inurl: shows pages in a different order - and it shows different pages, although the cache dates are all the same.
The site adds new pages weekly and has over 100 pages - 19 in the regular index and another 50 Supplementals according to site:, but 21 in the regular index according to inurl:.
The 2 'extra' pages that inurl: attributed to the site were both ranking 1st place for their terms.
This might be a silly suggestion, but when I added a bunch of content (28 pages in one hit) to a client site recently, it was Supplemental for around six weeks and then came out. Is this just a time delay/sandbox effect, i.e. Google getting more aggressive about waiting a while until it trusts the new stuff?
Thanks - that shows no Supplementals on our sites. Still, I'm intrigued why it shows 1 of 1 and then the omitted results. PR, unique content etc. are OK.
[edited by: Whitey at 1:23 am (utc) on April 13, 2007]
|Thanks - that shows no supplementals on our sites. |
inurl: will not show supplemental results; it's just a quick means of finding out how many of your URLs are actually in the regular index rather than in the "SUPPLEMENTAL" index.
Are any of your sites a TBPR 6 or greater? I know, I know, TBPR is useless, blah, blah, blah, but what other PR tool can we go off of? I'd be surprised to see a TBPR 6 site going supplemental at all unless it has duplicate content in Google's eyes. My TBPR sites that are 5 or greater seem to be gaining "REGULAR INDEX" status whereas my lower PR sites in the 3-4 range are going supplemental FAST, unique content or not.
[edited by: MLHmptn at 5:52 am (utc) on April 13, 2007]
site:yourdomain.com *** -view
will show you what is supplemental. Big G has a lot of supplementals :)
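For anyone who wants to run the three checks from this thread side by side, here's a minimal sketch that just assembles the query strings to paste into Google - plain string building, no official Google API. The function name and the batching are my own invention; the `*** -view` trick itself is only the community lore quoted above and could stop working at any time.

```python
# Sketch only: build the three Google query strings discussed in this
# thread for a given domain. These are manual search-box queries, not
# an API; the "*** -view" pattern is unofficial 2007 community lore.

def supplemental_check_queries(domain):
    """Return the three manual queries to paste into Google search."""
    return {
        "all_indexed": "site:%s" % domain,            # everything Google reports
        "regular_only": "inurl:%s" % domain,          # regular index (per the inurl: tip)
        "supplemental_only": "site:%s *** -view" % domain,  # the trick above
    }

queries = supplemental_check_queries("example.com")
for name, query in queries.items():
    print(name, "->", query)
```

Comparing the counts returned by the first two queries against the third gives a rough picture of how much of the site has gone Supplemental - rough, because as this thread shows, the counts themselves are unreliable right now.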
Found it this morning on another forum.
inurl: is not much different for me.
thanks for the tip
Using inurl: I'm getting six non-supplemental results, and about 165 supplemental results.
When I use site: I get about 25 non-supplemental results, and the rest are supplemental.
Cache dates range from February to mid-March; all pages have unique titles and meta descriptions.
For whatever it's worth, some of my supplemental pages are actually number 2 or 3 for certain queries. So even though they're marked as supplemental against the whole site, the pages in and of themselves are ranking well against other web sites, which I guess is what really matters.
|Still I'm intrigued why it shows 1 of 1 and then the omitted results. PR, unique content etc. are OK. |
I don't know why it displays that way either, but from what I can tell, those omitted results aren't really omitted - they're coming up in plenty of searches for me, and ranking quite well.
A site: command on the WebmasterWorld website returns 13 results... I don't think there is anything we can do about going supplemental. For now I don't see any effect of the supplementals on the SERPs. Let's wait and see what GG does with this.
The site tool is broken.
On one site, I get 1 to 1 of 1, then when I click "omitted" I get 1 to 190 of 190.
On another, I get 1 to 3 of about 2, then when I click "omitted" I get 1 to 45 of 22.
On another, I get 1 to 8 of about 15, then when I click "omitted" I get 1 to 93 of about 142, and never get to see the rest.
On another, I get 1 to 5 of about 6, then when I click "omitted" I get 1 to 120 of 120, but I know the site really has 186 pages.
When I get beyond about page 2 or 3 of the SERPs (whether on 10 or 100 results per page), I stand a Very Good Chance of getting a "Sorry... Your request looks like an automated query or a virus" message, and no more results.
I want to mirror g1smd's observation; I see the exact same behavior on my 3 sites. I think the site: and inurl: tools have been broken for a while, but as long as my pages are ranking for their terms, I'm willing to let it slide. It's when one outage affects another that I become concerned.
|Are any of your sites a TBPR 6 or greater? I know, I know, TBPR is useless, blah, blah, blah, but what other PR tool can we go off of? I'd be surprised to see a TBPR 6 site going supplemental at all unless it has duplicate content in Google's eyes. My TBPR sites that are 5 or greater seem to be gaining "REGULAR INDEX" status whereas my lower PR sites in the 3-4 range are going supplemental FAST, unique content or not. |
We have a range of visible PR from 6 to 1, but the reality is that there's no way of checking without the site: tool working.
I don't know what happened to those promises from Google that it would be fixed, and I don't know how any webmaster can accurately comment on the overall health of their site's pages without it.
I've found that most of our supplementals have a cache date of Feb 22, 2007 through March 16th.
Is this a good thing or bad? Are these possibly in a temporary holding area or sandbox?
|I don't know how any webmaster can accurately comment on the overall health of their site's pages without it |
I too have mourned the loss of dependable data from the site: operator -- but it was never the be-all end-all for me. It was more of a quick snapshot that was easy to check.
The most important data comes from the traffic in our server logs and our conversions from that traffic. If our logs show Google sending traffic to a particular url from a certain search phrase, then that is the truth of the situation. This is true no matter what the site: operator says, and no matter what rankings are visible to us in our specific location. If Google labels a url Supplemental somewhere or other but we still get traffic and sales - then that's healthy and who cares about the label? If they don't show a url as Supplemental but it gets no Google traffic, then that's not good.
And if Google chooses to eliminate 90% of the urls in a domain from their index, but they send healthy amounts of well-targeted traffic to the other 10%, that can be a very healthy picture.
More than ever, it is essential to stay focused on traffic, conversions and profitability, and to have our own analytics that can provide us with actionable data. And yes, we can still all hope for more informative metrics from Google in the near future.
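To make the "trust your own logs" point concrete, here is a minimal sketch of checking which URLs Google is actually sending traffic to. It assumes an Apache "combined" format access log; the regex, field layout, and function name are my own assumptions, not anything from Google, so adjust them for your server's log format.

```python
# Minimal sketch: count Google-referred visits per URL from a server
# access log, instead of relying on the site: operator. Assumes the
# Apache "combined" log format (request, status, size, referrer, agent).
import re
from collections import Counter

LOG_LINE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST) (?P<url>\S+) [^"]*" '
    r'\d+ \S+ "(?P<referrer>[^"]*)"'
)

def google_traffic_by_url(log_lines):
    """Count hits per URL whose referrer is a Google search page."""
    counts = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m and "google." in m.group("referrer"):
            counts[m.group("url")] += 1
    return counts

# Two hypothetical log lines: one Google-referred hit, one direct visit.
sample = [
    '1.2.3.4 - - [12/Apr/2007:10:00:00 +0000] "GET /widgets.html HTTP/1.1" '
    '200 5120 "http://www.google.com/search?q=blue+widgets" "Mozilla/5.0"',
    '1.2.3.5 - - [12/Apr/2007:10:01:00 +0000] "GET /about.html HTTP/1.1" '
    '200 2048 "-" "Mozilla/5.0"',
]
print(google_traffic_by_url(sample))
```

A URL that site: labels Supplemental but that shows up here with steady referrals is, per the post above, healthy regardless of the label.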
I'm seeing hundreds of pages that come up in positions 1-4 for searches with more than 2 million results - and all of those pages are supplemental. Doesn't look very efficient to me to pull first page results from another index.
Maybe (beware) they will split the indices later, and are now simply indicating where we'll wind up after the big split.
Or (please let it be this) they are just having some trouble with the site: command, which is not a top priority for them because only webmasters use this tool.
"More than ever, it is essential to stay focused on traffic, conversions and profitability, and to have our own analytics that can provide us with actionable data."
That is backwards. More than ever it is important to see symptoms of disease before they become fully cancerous. A person with a disease can function normally for some time before being debilitated by the disease. The current profitability of a page is only one thing to help focus your attention.
If suddenly 25 of your 94 site pages are supplemental, but you make the same amount of money that day or week, you are a loony to ignore that.
Waiting until you start losing money is foolish, like waiting hours until you feel lightheaded from blood loss before putting a bandage on a cut.
I would agree 100% steveb -- if Google were using the Supplemental Index the same way they did last year -- or if the site: operator made consistent sense. But we tend now to be in a situation where we can't always trust the "lab report" that Google gives us when we go for our cancer checkup. And to take actions based on mistaken information can also create a real mess.
I've noticed that many people ONLY know what Google tells them, and that's not a good idea either. So I wanted to underscore the importance of analytics that you control and can vouch for.
I certainly won't stop checking the site: operator results, nor would I suggest that anyone else do that. I do, however, suggest not reacting too quickly to what we see, especially if it isn't mirrored in our server logs.
So are we saying that pages may actually be in the index, regular or supplemental, regardless of what site: and inurl: command return?