Googlebot started to index them very slowly, and so far only 15 pages are in the index. All of them are supplemental results :-(
All the pages and content are similar (due to the geographical organization), but there is REAL original content and the SEO is strictly white hat.
Do you think this situation will change in the future? What can I do to "move" things along?
I know that uploading 1000+ pages must look "strange" to Google, but my bet is that in this case it IS perfectly legitimate.
Any help much appreciated.
Basically the only difference I've seen between supp and non-supp is that the SERP text snippets shown for a supplemental page tend to be sort of an incoherently mangled selection of the text from the page, whereas non-supp page text snippets usually state concisely what the page is about. It's as though supp pages have a lower priority for a "high level" of text processing.
Like I said, though, they both bring traffic.
About "useful" content, my opinion is that supplemental status says nothing about the usefulness. It does say something about the market for that information. In other words, you might have the best site in the world for a particular niche, but if it's a niche that only 100 people in the world care about, your pages will be supplemental, not because they're bad but because Google has no market of people seeking that information, so the pages get lower priority for processing.
On the other hand, when those 100 people do seek your information, your pages will still turn up in SERPs, whether supplemental or not.
Something strange happened recently that made me think Google might also apply the "how big is the market for this?" measure with newly indexed pages, too. Most of my new pages take a couple or few weeks to get indexed. They have AdSense on them, so Mediapartners crawls them right away, but Googlebot might not get around to them for quite a while. Recently I posted an article that I didn't realize was about a hot topic. (It also was barely related to what my site is about; oh well...) Googlebot snatched it up almost immediately, it went to the top of the search results, and is my most popular page. Seemed to me like Googlebot looked at the page and "thought", "Oh, yeah, there's lots of people looking for this information."
[edited by: SteveWh at 8:17 am (utc) on May 10, 2007]
But I think the page rank itself is a red herring.
PageRank is a metric Google uses to assess page value. Enough quality links to a page tell Google it's important enough to get indexed. If no one links to a page, the value of the page is up in the air.
When people link to both the non-www and www versions, link juice is split, whereas if people linked only to the www version, PageRank would be consolidated into one URL. That's why duplicate content caused by canonical issues can lead to supplemental results.
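For anyone fixing this on their own site, here is a minimal sketch of a canonical 301 in Apache .htaccess (assuming mod_rewrite is enabled; example.com is a stand-in for your domain):

```apache
RewriteEngine On
# Permanently (301) redirect the non-www hostname to the www hostname,
# so links pointing at either version consolidate PageRank on one URL
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A 302 would not consolidate anything; it has to be a 301 for the engines to treat the www version as the canonical URL.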
It used to be that a spammer could inject thousands of pages into Google, point a few low quality links at them, and use internal PageRank to rank for thousands of queries. Google countered that tactic by requiring a minimum PageRank threshold for pages to be allowed in the main index.
Think of IBL as a pizza pie. When you split that up into thousands of slices, you get a bunch of teeny weeny slices. Say you're distributing that among thousands of spam pages. Because many pages don't have enough juice (minimal PageRank + sum of IBL PageRank / 100,000 = not a lot o' juice), they drop out of the main index. Because those pages are no longer in the index, their internal links don't pass juice either. This leads to kind of a domino effect, where at the end of the day, only a handful out of thousands of spam pages remain in the main index. And without the internal link support, those pages rank for very few queries. The supplemental pages will still pull traffic, but they will not rank for competitive queries. This way, Google tries to restrict spam to long tails where the top 10 results are generally low TrustRank, low PageRank pages.
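The pizza-slice arithmetic above can be sketched in a few lines of Python. Every number here (the baseline PR, the index threshold, the page counts) is made up purely for illustration; nobody outside Google knows the real values:

```python
# Illustrative sketch of inbound PageRank being diluted across many pages.
# The baseline and threshold values below are assumptions, not Google's.
MINIMAL_PR = 0.15          # baseline PageRank every page starts with
INDEX_THRESHOLD = 1.0      # assumed minimum PR to stay in the main index

def page_juice(total_ibl_pr: float, num_pages: int) -> float:
    """PageRank per page when total inbound link PR is spread evenly."""
    return MINIMAL_PR + total_ibl_pr / num_pages

# A site with 50 units of inbound PR spread over 100,000 pages:
per_page = page_juice(50.0, 100_000)
print(per_page)                                  # 0.1505 -- barely above baseline
print(per_page >= INDEX_THRESHOLD)               # False: pages go supplemental

# The same inbound PR concentrated on just 20 pages:
print(page_juice(50.0, 20) >= INDEX_THRESHOLD)   # True: pages stay in the main index
```

The point of the sketch is the shape of the formula, not the numbers: the per-page share shrinks with the page count, so a thin trickle of inbound PR can't float 100,000 pages over any fixed threshold.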
Google has set up a system where, given a set of IBLs with total PageRank X, there's a limit on how many pages stay in the main index. So a TBPR 2 site with, say, 2 links from a TBPR 5 site pointing at the home page will have an uphill battle trying to get Google to index 100,000 pages.
A few Matt Cutts quotes:
PageRank is the primary factor determining whether a url is in the main web index vs. the supplemental results.
typically the depth of the directory doesn’t make any difference for us; PageRank is a much larger factor. So without knowing your site, I’d look at trying to make sure that your site is using your PageRank well.
I had figured it was more to do with Google cutting off swathes of pages like branches from a tree, because those branches were not well connected enough to the main trunk - they hung by a thread and therefore couldn't be the crux of the site / the most important pages.
I checked again today (I don't check often), and another dozen or so pages that used to be supplemental are now not. There is no good reason that some of these pages should have made the transition. They are of historical interest only, have absolutely no backlinks, are not likely ever to have any, get nearly zero traffic, and they realistically have little value to most of the people in the world, so to speak. The only thing that would explain the transition is that they have existed on the web for 18 months.
Other pages that do have backlinks and do get traffic are still supplemental.
Furthermore, a site: search done 1 hour later turned up 100 fewer pages than the previous site: search!
Google does not want you to be able to manipulate your own PageRank, SERP rankings, or supplemental status. They take active steps to prevent you from figuring out the system and manipulating it. Every time someone posts a message saying, "I've tried and tried and just can't figure this out!", someone at Google high-fives their neighbor and shouts, "It works!"
Might as well not get bent out of shape if Google doesn't crawl your website.
Back to reality, having only a supplemental result for a URL is poison. Pages that rank #1 when not supplemental seldom rank in the top 500 when supplemental. When you find supplementals you should deal with them immediately (and "dealing with" them could include deciding not to care because the page is trivial). Get more links, use the URL removal tool to get rid of the page and start over, whatever; if the page is important at all, then doing nothing is suicide.
Supplementals are like a car without any gas. You can put more gas in it by getting more links, use a different car by making a new page, or push the out-of-gas car around town by doing nothing. The latter is the worst solution for any page that matters.
I WOULD BE concerned about having web pages in the supplemental index... do any reasonably competitive / popular search using Google and tell me if you see any TOP (position 1-10) search results that are labeled as "supplemental"... you don't see it!
G pulls from the supplemental index only when G has scoured their main index and determines there are not enough search results in their main index to draw from... so G then serves up results from their supplemental index... this usually happens for less competitive terms or long tail terms.
I own and operate 14 websites, ranging from a few PR6s all the way down to one website that used to be a PR1. My PR1 website had 90% of its pages in the supplemental index. It wasn't until I put some link building effort into this PR1 website and raised it to a PR6 that 95% of the pages dropped out of the supplemental index and moved into the main G index. I have observed this time and again with my various websites... and I have observed it also for many of my friends' websites that I help SEO.
SteveWh, with all due respect, you are missing the boat on this one. The website listed in your profile is a PR1, and 90%+ of its pages (about 200) are in G's supplemental index.
To all you newbies... listen to Tedster and Halfdeck; they offer sound advice and are right on target. Matt Cutts (G employee) has explained and offered hints about G's supplemental index: if you want to rank at the top in the natural SERPs for semi-competitive / popular to really competitive terms... you'd better care about PR, which has a high correlation with how deeply and frequently your website gets crawled... and the PR of a webpage has a direct correlation with whether that webpage is in G's supplemental index or G's main index.
It's not just PR. PR3 pages go supplemental all the time. You need VOLUME of links. 1000 PR1 links will normally be better than 1 PR4 link. PR1 and PR2 pages with only low-PR links pointing at them can be non-supplemental, while PR3 pages with fewer than five links will be.
You need volume of links, multiple crawl paths, and PR also... and then not have duplicate issues.
[edited by: SEOPTI at 11:44 pm (utc) on May 11, 2007]
Thanks all for sharing valuable information.
Can anyone please tell me which kinds of backlinks work well for getting out of the supplemental index, i.e. from related sites, forum links, articles, directory submissions, etc.?
Please name some good backlink methods.
I thought this would save me tons of hours of work, and the OBLs to the corporate fitting guides might even be good, since they were easier on my customers and always up to date, plus they pointed to the authority site for my products.
At the last PR update Google put half of my pages in supplemental (unique titles, tags, on-page text - no spam). They also reduced my site's PR to PR3 on the index page, PR3 on the second layer (sections), and PR2 on the third layer (item pages).
It's a very flat site. Used to be PR4 1st layer, PR4 2nd layer, PR3 3rd layer.
What the heck did I do wrong? I have since created new fitting guide pages on my site (as before) and took down the outbounds to the factory fitting guides. Maybe this will help?
Maybe tedster already explained what happened, since my re-directs (12 of them) were all 301'd to the home page:
"I would not suggest using a 301 to your home page, as over time this results in many different urls all having the home page's content. Usually those urls just end up as Supplemental Results, but if the "duplicate pages" number gets very high, I have seen it start to impact the rank of other pages from the domain in the SERPs. It may have something to do with poisoning any links on the dupe content page -- but I'm not sure on that, nor where the threshold of safety may be."
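If each of those 12 retired URLs has a natural equivalent on the current site, one way to avoid the many-URLs-with-home-page-content problem tedster describes is to 301 each old URL to its closest matching new URL instead of blanket-redirecting everything to the root. An Apache .htaccess sketch (the paths are made-up examples, and this assumes mod_alias is available):

```apache
# Map each retired URL to its closest equivalent page, not the home page,
# so no single URL accumulates a pile of duplicate-content aliases
Redirect 301 /old-fitting-guide.html /fitting-guides/
Redirect 301 /old-item-list.html /products/
```

Only fall back to redirecting to the home page for URLs that genuinely have no replacement.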