2. Do the same search on aol.com - that gives you only the regular index URLs.
Or, you can do a search for site:example.com/* on google.com. That syntax currently gives you only the regular index URLs as well.
I'm dissatisfied with this of course and working hard to improve this.
[edited by: Asia_Expat at 10:55 am (utc) on Jan. 14, 2008]
site:example.com -site:example.com/* shows me a lot of pages that are termed supplemental. What steps do I take to minimize that a little bit (just for some satisfaction)?
[edited by: AjiNIMC at 7:14 am (utc) on Jan. 15, 2008]
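The two queries being discussed can be combined into a rough supplemental estimate: site:example.com approximates the total indexed pages, while site:example.com/* approximates the main-index pages only, so the difference is roughly the supplemental count. A minimal sketch of that arithmetic - the counts below are hypothetical numbers you would read off the two result pages yourself, not an API call:

```python
def supplemental_estimate(total_indexed: int, main_index: int) -> tuple[int, float]:
    """Estimate supplemental pages from the two site: query counts.

    total_indexed: result count for  site:example.com
    main_index:    result count for  site:example.com/*
    Returns (supplemental_count, supplemental_ratio).
    """
    supplemental = max(total_indexed - main_index, 0)
    ratio = supplemental / total_indexed if total_indexed else 0.0
    return supplemental, ratio

# Hypothetical counts read off the two search result pages:
count, ratio = supplemental_estimate(total_indexed=30_000, main_index=3_300)
print(f"{count} supplemental pages ({ratio:.0%})")  # 26700 supplemental pages (89%)
```

Keep in mind these result counts are themselves estimates, so treat the ratio as a trend indicator rather than an exact figure.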
do a search for site:example.com/* on google.com
I tried that and the results are frightening for my 8-year-old site. It's showing under 2% of my pages as not supplemental, assuming that is what the results really mean. The only good news I've seen lately is that the site's backlinks have increased by a factor of 5. Overall, I'm about to simply give up on G ever giving my site decent results or traffic. It's just so disappointing and frustrating after all these years of working at building a completely white hat, clean, pure-content site. I'm beside myself and don't know what to do other than to continue doing my thing as best I see fit.
The supplemental index still stinks; there is not a single website with more than 20 URLs that has 0% supplementals.
It looks like it is (MOSTLY) related to how many incoming links/PR a website has: the more incoming links, the more value your site has, and the more value it has, the more pages get indexed and included in the main index.
if your site is PR3, you can have the most amazing original content on each of your 5,000 pages and it still won't matter, LOL
on my ecommerce site with 30,000 pages and 89% supp, I have slowed down adding new products - unless I can get to PR7 on my home page, they'll just end up in the supps anyway
the sad irony is that google started off being so successful by giving the user such relevant results - now when you search google, you get so many wiki/amazon/shopzilla results that the really good stuff just isn't coming up. who wants to search google to get a wiki listing? just go there yourself. i blame the supps for this decline in google's relevance and value to the searcher.
The exact ratio doesn't matter. Compare:
1. A TBPR 5 site with 10,000 pages and 900 pages in the main index, or 9%
2. A TBPR 3 site with 10 pages, 9 pages in the main index, or 90%
A 90% main-index ratio, therefore, is not necessarily better than 9%. What I look at is how many pages you have in the main index, not the percentage.
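The comparison above can be made concrete with a quick sketch (the figures are taken directly from the two examples in the post):

```python
sites = [
    # (label, total_pages, pages_in_main_index)
    ("TBPR 5 site", 10_000, 900),
    ("TBPR 3 site", 10, 9),
]

for label, total, main in sites:
    ratio = main / total
    print(f"{label}: {main} of {total} pages in the main index ({ratio:.0%})")

# The TBPR 3 site wins on ratio (90% vs 9%), but the TBPR 5 site
# has 100x more pages actually sitting in the main index (900 vs 9).
```

Which is exactly the point: the ratio alone tells you little without the absolute page count behind it.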
There are several factors you need to look at:
- How many indexable pages you have. If you have 100,000 pages, you split PageRank into too many little bits and a ton of pages go supplemental.
- Internal link structure. Sitewide links can create a situation where 10-20 pages hold most of the PageRank and the rest of the site goes supplemental.
- Backlinks. The supplemental issue is, at the end of the day, mostly about lack of PageRank. But if you gain backlinks the wrong way (e.g. excessive reciprocal linking), that can devalue inbound juice to your domain and create supplemental problems (addressed by Matt Cutts in response to a Forbes article about Google Hell).
- Page freshness and URL complexity. Two factors Dave Crow from Google's crawl team mentioned that can also turn pages supplemental.
[edited by: Halfdeck at 1:09 pm (utc) on Jan. 26, 2008]
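The PageRank-dilution factor in the list above can be illustrated with a toy model. Assume (simplistically) that a site's inbound link equity is a fixed budget split evenly across its indexable pages, and that a page stays in the main index only if its share clears some minimum - the equity units and the threshold here are invented purely for illustration:

```python
# Toy model of PageRank dilution across a site's pages.
# EQUITY and THRESHOLD are hypothetical values, not real Google numbers.
EQUITY = 100.0      # total inbound link equity (arbitrary units)
THRESHOLD = 0.01    # per-page minimum share to stay in the main index

for num_pages in (1_000, 10_000, 100_000):
    per_page = EQUITY / num_pages
    status = "main index" if per_page >= THRESHOLD else "supplemental"
    print(f"{num_pages:>7} pages -> {per_page:.4f} equity each ({status})")
```

This ignores internal link structure entirely - in practice sitewide navigation concentrates equity on a handful of pages, which is the second factor in the list - but it captures why splitting a fixed budget across 100,000 URLs pushes most of them below any per-page cutoff.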