| 6:09 am on May 8, 2007 (gmt 0)|
Get some links. Better yet, if people link to you naturally you'll see those pages get indexed. So I guess I should say get some traffic ;)
| 7:11 am on May 8, 2007 (gmt 0)|
G keeps indexing my pages as supplemental results, 5 pages at a time.
I'm so disappointed, because every page has USEFUL content for users.
Don't be evil, of course. LOL
| 8:05 am on May 10, 2007 (gmt 0)|
I wouldn't worry much about the supplementals. I have mostly supplemental pages. They show up in SERPs and they bring traffic just like the non-supp pages. A few pages that brought significant traffic while they were supplemental have become non-supplementals.
Basically the only difference I've seen between supp and non-supp is that the SERP text snippets shown for a supplemental page tend to be sort of an incoherently mangled selection of the text from the page, whereas non-supp page text snippets usually state concisely what the page is about. It's as though supp pages have a lower priority for a "high level" of text processing.
Like I said, though, they both bring traffic.
About "useful" content, my opinion is that supplemental status says nothing about the usefulness. It does say something about the market for that information. In other words, you might have the best site in the world for a particular niche, but if it's a niche that only 100 people in the world care about, your pages will be supplemental, not because they're bad but because Google has no market of people seeking that information, so the pages get lower priority for processing.
On the other hand, when those 100 people do seek your information, your pages will still turn up in SERPs, whether supplemental or not.
Something strange happened recently that made me think Google might also apply the "how big is the market for this?" measure with newly indexed pages, too. Most of my new pages take a couple or few weeks to get indexed. They have AdSense on them, so Mediapartners crawls them right away, but Googlebot might not get around to them for quite a while. Recently I posted an article that I didn't realize was about a hot topic. (It also was barely related to what my site is about; oh well...) Googlebot snatched it up almost immediately, it went to the top of the search results, and is my most popular page. Seemed to me like Googlebot looked at the page and "thought", "Oh, yeah, there's lots of people looking for this information."
[edited by: SteveWh at 8:17 am (utc) on May 10, 2007]
| 12:55 pm on May 10, 2007 (gmt 0)|
No, supplemental results have to do with PageRank and backlinks, not with the popularity of a search query.
| 5:30 pm on May 10, 2007 (gmt 0)|
Yes, you need to get more high-PR backlinks; in many cases supplemental status is about low PR.
| 5:40 pm on May 10, 2007 (gmt 0)|
I don't believe supplementals have anything more to do with PageRank (well, only coincidentally) than they do with search query volume. And it's certainly not about duplicate content, unique content, or any kind of content (well, not for most sites). You tell Google which pages are supplemental on your own site by the way you link to them.
| 5:44 pm on May 10, 2007 (gmt 0)|
And those links are "votes" that send PageRank.
| 6:38 pm on May 10, 2007 (gmt 0)|
But I think the PageRank itself is a red herring.
| 7:29 pm on May 10, 2007 (gmt 0)|
|But I think the PageRank itself is a red herring. |
PageRank is a metric Google uses to assess page value. Enough quality links to a page tell Google it's important enough to get indexed. If no one links to a page, the value of the page is up in the air.
When people link to both the non-www and www versions, link juice is split, whereas if people linked only to the www version, PageRank would be consolidated into one URL. That's why duplicate content caused by canonical issues can lead to supplemental results.
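To make the split concrete, here is a toy illustration with made-up numbers (my assumptions, not Google's actual math): the same inbound links, counted with and without canonical consolidation.

```python
# Hypothetical illustration: inbound links pointing at two hostnames
# that are really the same page.
links_to_www = 7
links_to_non_www = 3

# Without a canonical redirect, Google can treat the two hostnames as
# separate URLs, so neither one accumulates the full link count.
split = (links_to_www, links_to_non_www)

# With a site-wide 301 from non-www to www, every inbound link counts
# toward a single URL.
consolidated = links_to_www + links_to_non_www

print(split)         # (7, 3)
print(consolidated)  # 10
```

The numbers are arbitrary; the point is that one URL with all ten "votes" is in a better position than two URLs splitting them.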
It used to be that a spammer could inject thousands of pages into Google, point a few low quality links at them, and use internal PageRank to rank for thousands of queries. Google countered that tactic by requiring a minimum PageRank threshold for pages to be allowed in the main index.
Think of IBLs as a pizza pie. When you split that up into thousands of slices, you get a bunch of teeny weeny slices. Say you're distributing that among thousands of spam pages. Because many pages don't have enough juice (minimal PageRank + sum of IBL PageRank/100,000 = not a lot o' juice), they drop out of the main index. Because those pages are no longer in the index, their internal links don't pass juice either. This leads to a kind of domino effect where, at the end of the day, only a handful out of thousands of spam pages remain in the main index. And without the internal link support, those pages rank for very few queries. The supplemental pages will still pull traffic, but they will not rank for competitive queries. This way, Google tries to restrict spam to long tails, where the top 10 results are generally low-TrustRank, low-PageRank pages.
Google's set up a system where, given a set of IBLs with total PageRank X, there's a limit on how many pages stay in the main index. So a TBPR 2 site with, say, 2 links from a TBPR 5 site pointing at the home page will have an uphill battle trying to get Google to index 100,000 pages.
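The pizza analogy can be sketched as a toy model. This is purely illustrative (the even split, the threshold value, and the function are all my assumptions, not Google's real algorithm): a fixed pool of inbound PageRank is spread evenly over N internal pages, and pages below a threshold fall out of the main index.

```python
# Toy model of the "pizza" analogy: a fixed pool of inbound PageRank
# spread evenly across internal pages, with a minimum per-page threshold
# for staying in the main index.
def pages_kept_in_main_index(total_inbound_pr, num_pages, threshold):
    per_page = total_inbound_pr / num_pages
    if per_page >= threshold:
        return num_pages  # every slice is big enough
    # Otherwise only as many pages as the pool can "afford" stay in.
    return int(total_inbound_pr / threshold)

# Same link pool, cut into more and more slices:
print(pages_kept_in_main_index(100.0, 150, 0.5))      # 150: all pages kept
print(pages_kept_in_main_index(100.0, 100_000, 0.5))  # 200: the other 99,800 go supplemental
```

Whatever the real mechanism is, the shape matches what's described above: the same set of IBLs supports a bounded number of main-index pages, no matter how many URLs you publish.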
A few Matt Cutts quotes:
|PageRank is the primary factor determining whether a url is in the main web index vs. the supplemental results. |
|typically the depth of the directory doesn’t make any difference for us; PageRank is a much larger factor. So without knowing your site, I’d look at trying to make sure that your site is using your PageRank well. |
| 7:55 pm on May 10, 2007 (gmt 0)|
Listen to Tedster, he is a genius and extremely helpful.
| 8:43 pm on May 10, 2007 (gmt 0)|
I stand corrected. If Matt himself said it, it must be true?
I had figured it had more to do with Google cutting off swathes of pages like branches from a tree, because those branches were not well connected enough to the main trunk: they hung by a thread and therefore couldn't be the crux of the site / the most important pages.
| 9:35 pm on May 10, 2007 (gmt 0)|
Aside from the "supplemental" :) remarks and speculations I offered about supplemental result causes, I stand by my suggestion not to get too bent out of shape about supplemental results. I believe Matt Cutts has said that, as well.
I checked again today (I don't check often), and another dozen or so pages that used to be supplemental now are not. There is no good reason that some of these pages should have made the transition. They are of historical interest only, have absolutely no backlinks, are not likely ever to have any, get nearly zero traffic, and realistically have little value to most people in the world. The only thing that would explain the transition is that they have existed on the web for 18 months.
Other pages that do have backlinks and do get traffic are still supplemental.
Furthermore, a site: search done 1 hour later turned up 100 fewer pages than the previous site: search!
Google does not want you to be able to manipulate your own PageRank, SERP rankings, or supplemental status. They take active steps to prevent you from figuring out the system and manipulating it. Every time someone posts a message saying, "I've tried and tried and just can't figure this out!", someone at Google high-fives their neighbor and shouts, "It works!"
| 10:14 pm on May 10, 2007 (gmt 0)|
"I stand by my suggestion not to get too bent out of shape about supplemental results."
Might as well not get bent out of shape if Google doesn't crawl your website.
Back to reality: having only a supplemental result for a URL is poison. Pages that rank #1 when not supplemental seldom rank in the top 500 when supplemental. When you find supplementals you should deal with them immediately (and "dealing with" them could include deciding not to care because the page is trivial). Get more links, use the URL removal tool to get rid of the page and start over, whatever; if the page is important at all, then doing nothing is suicide.
Supplementals are like a car without any gas. You can put more gas in it by getting more links, use a different car by making a new page, or push the out-of-gas car around town by doing nothing. The latter is the worst solution for any page that matters.
| 10:20 pm on May 10, 2007 (gmt 0)|
I agree -- if a "page that matters" acquires supplemental status, then it's time to understand and fix it. But there's not much chance that a site of any significant size is completely free of supplemental results on today's Google. Just make sure you agree with Google's assessment that those really are backwater pages, and that they only matter for the rare searches where they do get returned.
| 12:22 am on May 11, 2007 (gmt 0)|
I don't post often... I usually just lurk...but I have to comment on this one.
I WOULD BE concerned about having webpages in the supplemental index... do any reasonably competitive / popular search using Google and tell me if you see any TOP (position 1-10) search results that are labeled as "supplemental"... you don't see it!
G pulls from the supplemental index only when G has scoured their main index and determines there are not enough search results in their main index to draw from... so G then serves up results from their supplemental index... this usually happens for less competitive terms or long tail terms.
I own and operate 14 websites, ranging from a few at PR6 all the way down to one website that used to be a PR1. My PR1 website had 90% of its pages in the supplemental index. It wasn't until I put some link-building effort into this PR1 website and raised it to a PR6 that 95% of the pages dropped out of the supplemental index and moved into the main G index. I have observed this time and again with my various websites, and I have also observed it with many of my friends' websites that I help SEO.
SteveWh, with all due respect, you are missing the boat on this one. The website listed in your profile is a PR1, and 90%+ of its webpages (about 200) are in G's supplemental index.
To all you newbies: listen to Tedster and Halfdeck; they offer sound advice and are right on target. Matt Cutts (G employee) has explained and offered hints about G's supplemental index. If you want to rank at the top in the natural SERPs for semi-competitive / popular to really competitive terms, you'd better care about PR, which correlates strongly with how deeply and frequently your website gets crawled... and the PR of a webpage has a direct correlation to whether that webpage is in G's supplemental index or G's main index.
| 12:56 am on May 11, 2007 (gmt 0)|
So at the end of the day, you need more good-quality, on-topic links to your site to boost your PR; this flows to your deep pages and brings them out of supplemental, correct? The larger my site grows, the more supps I have and the less traffic I get from Google..
| 2:09 am on May 11, 2007 (gmt 0)|
On-topic doesn't matter, which of course is partly why Google sucks. It crawls pages with lots of off-topic, irrelevant spam/blog/guestbook links.
It's not just PR. PR3 pages go supplemental all the time. You need VOLUME of links. 1000 PR1 links will normally be better than 1 PR4 link. PR1 and PR2 pages with no higher-PR links pointing at them can be non-supplemental, while PR3 pages with fewer than five links will be supplemental.
You need volume of links, multiple crawl paths, and PR too... and then you must not have duplicate issues.
| 2:47 am on May 11, 2007 (gmt 0)|
Thanks Steve, I will keep that part in mind. I think a volume of links is not so hard to come by. In fact it is dead simple if they can come from anywhere...
| 11:29 pm on May 11, 2007 (gmt 0)|
I believe if a PR3 page goes supplemental the reason is not PR.
| 11:44 pm on May 11, 2007 (gmt 0)|
But of course it could happen that they update their real-time PR and discount the links. Result: the page gets PR0 instead of PR3 and becomes supplemental. In this case the toolbar still shows PR3, but in reality it's PR0.
[edited by: SEOPTI at 11:44 pm (utc) on May 11, 2007]
| 7:18 pm on May 19, 2007 (gmt 0)|
Thanks all for sharing valuable information.
Can anyone please tell me which kinds of backlinks work well for getting out of the supplemental index, i.e. links from related sites, forum links, articles, directory submissions, etc.?
Please name some good backlink methods.
| 9:38 pm on May 19, 2007 (gmt 0)|
One of my sites (fully indexed and OK for 5+ yrs) has only 175 pages on it. 12 pages were fitting guides. I had this "brainstorm" that rather than spending the 20 hrs or so updating them every 2 mos., I would set up 301 redirects for them to new versions, which had the on-page text in place as before, but instead of the huge amount of "fittings" they only had a single outbound link to the manufacturer's fitting guides.
I thought this would save me tons of hours of work, and the OBLs to the corporate fitting guides might even be good, since they were easier on my customers and always up to date, plus they pointed to the authority site for my products.
Last PR update, Google put half of my pages in supplemental (unique titles, tags, on-page text; no spam). They also reduced my site's PR to PR3 on the index page, PR3 on the second layer (sections), and PR2 on the third layer (item pages).
It's a very flat site. Used to be PR4 1st layer, PR4 2nd layer, PR3 3rd layer.
What the heck did I do wrong? I have since created new fitting guide pages on my site (as before) and took down the outbounds to the factory fitting guides. Maybe this will help?
Maybe tedster already explained what happened, since my redirects (12 of them) were all 301'd to the home page:
"I would not suggest using a 301 to your home page, as over time this results in many different urls all having the home page's content. Usually those urls just end up as Supplemental Results, but if the "duplicate pages" number gets very high, I have seen it start to impact the rank of other pages from the domain in the SERPs. It may have something to do with poisoning any links on the dupe content page -- but I'm not sure on that, nor where the threshold of safety may be."
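A safer pattern than pointing every retired URL at the home page is a one-to-one redirect map, so each old URL passes its links and relevance to the page that actually replaced it. A minimal sketch (the URL paths and helper function are hypothetical, not from this thread):

```python
# Hypothetical mapping: each retired fitting-guide URL gets its own
# specific 301 target instead of everything collapsing onto "/".
REDIRECT_MAP = {
    "/guides/fitting-guide-old.html": "/guides/fitting-guide.html",
    "/guides/widget-fitting-old.html": "/guides/widget-fitting.html",
}

def redirect_target(path, fallback="/"):
    """Return the 301 target for an old URL.

    A one-to-one mapping avoids the problem described above, where many
    URLs all resolve to duplicate home-page content; the home-page
    fallback is reserved for pages with no real successor.
    """
    return REDIRECT_MAP.get(path, fallback)

print(redirect_target("/guides/fitting-guide-old.html"))  # /guides/fitting-guide.html
print(redirect_target("/guides/gone-forever.html"))       # /
```

Whether served from a config file or application code, the design point is the same: only genuinely discontinued pages should fall through to the home page.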