The supplementals seem to always start with my Link Directory pages. I'm not sure why the 2,000 or so other regular pages come up after the 30 or 40 Link Directory pages, but 200 pages are indexed according to this site:www.#*$!xx.com/* search, and the 201st is the Link Directory.
Do you feel that adding nofollow to the links pointing at these pages could help, even though they will lose PageRank?
Has anyone else noticed this?
[edited by: tedster at 8:14 pm (utc) on Aug. 23, 2007]
[edit reason] switch to example.com [/edit]
I'm currently working with a site that shows 10 urls that are PR5 to PR7 - what can I make of that? My advice for now is not to try anything based on the current reporting tools, whether the official ones or the * hack for the site: operator that you mentioned. Google's reporting features and operators are way too volatile for my taste at this moment. I don't like to take actions on suspect data.
If you are seeing new ranking problems appear, and not just perceived "supplemental index" problems, then you may want to experiment with nofollow to unimportant pages. However, I'm not a fan of that approach either, although I do recognize that some people feel it has helped them. As I see it, many webmasters became terrified of the Supplemental Index because its implementation was buggy in the beginning, around Big Daddy. And now that traumatic memory keeps people upset and leads them to spend time on needless attempted manipulations - sometimes hurting their traffic.
I would rather let Google do whatever Google is doing with supplementals. We are too much in the dark to try to "help them out". There's nothing wrong with feeding some link love to pages that aren't ranking well, and removing juice from urls that don't really matter -- but it's ranking that counts, not whether the url is apparently in the supplemental index.
That said, I agree that seeing so many urls leave the main index is troubling. It may well be a sign that something's wrong. But again, what does your actual search traffic look like? That is the real key, and not the way Google chooses to partition their data.
Are you sure that you've eliminated multiple urls pointing to the same content? Have you used unique titles and meta descriptions throughout all your important content? Have you addressed any thin content pages? Made sure that extensive boilerplate copy does not appear on many pages? I would definitely address all of that before going down the nofollow route.
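The unique-titles-and-descriptions check is easy to script if you have a dump of your pages. Here's a rough sketch (not something from this thread) of one way to do it with Python's standard library - the `pages` dict of url-to-HTML below is just a hypothetical stand-in; a real audit would feed in your own crawled pages:

```python
# Sketch: find groups of URLs that share the same <title> and meta
# description. Assumes you can supply each page's HTML as a string.
from collections import defaultdict
from html.parser import HTMLParser

class HeadParser(HTMLParser):
    """Pulls the <title> text and meta description out of an HTML page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "").strip()

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data.strip()

def find_duplicates(pages):
    """pages: dict of url -> html string. Returns a dict mapping each
    duplicated (title, description) pair to the URLs that share it."""
    seen = defaultdict(list)
    for url, html in pages.items():
        parser = HeadParser()
        parser.feed(html)
        seen[(parser.title, parser.description)].append(url)
    return {key: urls for key, urls in seen.items() if len(urls) > 1}

# Hypothetical example data - two of these three pages share head data.
pages = {
    "/widgets/red": "<html><head><title>Red Widgets</title>"
                    '<meta name="description" content="Red widgets for sale."></head></html>',
    "/widgets/blue": "<html><head><title>Widgets</title>"
                     '<meta name="description" content="Widgets for sale."></head></html>',
    "/widgets/green": "<html><head><title>Widgets</title>"
                      '<meta name="description" content="Widgets for sale."></head></html>',
}

for (title, desc), urls in find_duplicates(pages).items():
    print(f"Duplicate head data {title!r} / {desc!r} on: {', '.join(sorted(urls))}")
```

Any group it prints is a set of pages Google may be folding together, and a good place to start rewriting.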
If you have many links pointing to those link directory pages, from many different urls, maybe tone that number way down.
Some low/medium TBPR domains that are on the crawl fringe might see pages go in and out of the main index, especially with an internal link structure that places heavy emphasis on home page/second-tier pages.
If a site has excessive manipulative links (cross links to your own network, link swaps, obvious paid links, manipulative outbounds, etc), it can see a drop in inbound PageRank, and that in turn can cause URLs to "go" supplemental.
And if your deep pages are largely static, that staleness can contribute to your problem.
Like tedster said, the site: search is an approximation and isn't something I'd take at face value.
Don't worry too much about supplemental results. If you focus most of your efforts on marketing, that should improve your traffic and pull some of your pages out of the supplemental index via an increase in organic backlinks.
I'd rather not go the nofollow route either, but there are some much weaker sites/pages ranking above me where I used to dominate, so it is definitely tough to figure out the problem - and yes, traffic is way down for those pages, considering they went from page 1 to page 8. I wish there was a tool to let you know where a problem may exist, but that's the game I guess.
The nofollow attribute on links is all about saying "Do not trust the URL this link points to." I feel that doing that to your own pages is not a good idea.
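For anyone unfamiliar with the syntax, the attribute goes on each individual anchor tag, not on the page - example.com here is just a placeholder:

```html
<!-- Normal internal link: passes PageRank to the target page -->
<a href="http://www.example.com/links.html">Link Directory</a>

<!-- Nofollowed link: tells Google you don't vouch for the target page -->
<a href="http://www.example.com/links.html" rel="nofollow">Link Directory</a>
```

So when people talk about "nofollowing" their link directory, they mean adding `rel="nofollow"` to every internal link that points at it - which is exactly why it reads as "don't trust my own page."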