|A Possible Answer to Non-Indexed Pages?|
| 8:13 pm on Aug 23, 2007 (gmt 0)|
I have lost almost 2300 out of 2500 pages to the supplemental index over 4 months. When I keep comparing, the supplementals seem to always start with my Link Directory pages.
I'm not sure why the 2000 or so other regular pages come up after the 30 or 40 Link Directory pages, yet according to a site:www.example.com/* search, 200 pages are indexed and the 201st is the Link Directory.
Do you feel that adding nofollow to links to these pages could help, even though they will lose PageRank?
Has anyone else noticed this?
[edited by: tedster at 8:14 pm (utc) on Aug. 23, 2007]
[edit reason] switch to example.com [/edit]
| 8:40 pm on Aug 23, 2007 (gmt 0)|
Yes, I've heard similar reports. But the problem, as you may know, is that we have no certainty about the Supplemental Index anymore. To compound that, Google spokespeople have said they were continuing to make changes and upgrades to it. We're rapidly reaching the point of having no dependable information.
I'm currently working with a site that shows 10 urls that are PR5 to PR7 - what can I make of that? My advice for now is not to try anything based on the current reporting tools, whether the official ones or the * hack for the site: operator that you mentioned. Google's reporting features and operators are way too volatile for my taste at this moment. I don't like to take actions on suspect data.
If you are seeing new ranking problems appear, and not just perceived "supplemental index" problems, then you may want to experiment with nofollow on links to unimportant pages. However, I'm not a fan of that approach either, although I do recognize that some people feel it has helped them. As I see it, many webmasters became terrified of the Supplemental Index because its implementation was buggy in the beginning, around Big Daddy. That traumatic memory continues to upset people and push them into needless attempted manipulations, sometimes hurting their own traffic.
I would rather let Google do whatever Google is doing with supplementals. We are too much in the dark to try to "help them out". There's nothing wrong with feeding some link love to pages that aren't ranking well, and removing juice from urls that don't really matter -- but it's ranking that counts, not whether the url is apparently in the supplemental index.
That said, I agree that seeing so many urls leave the main index is troubling. It may well be a sign that something's wrong. But again, what does your actual search traffic look like? That is the real key, and not the way Google chooses to partition their data.
Are you sure that you've eliminated multiple urls pointing to the same content? Have you used unique titles and meta descriptions throughout all your important content? Have you addressed any thin content pages? Made sure that extensive boilerplate copy does not appear on many pages? I would definitely address all of that before going down the nofollow route.
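As an illustration of the unique titles and descriptions point (the page names and site name here are made up, not from the poster's site), each page's head should describe that page specifically rather than repeating sitewide boilerplate:

```html
<!-- Page A: title and description specific to this page's content -->
<head>
  <title>Blue Widgets - Sizes and Pricing | Example Widget Shop</title>
  <meta name="description" content="Compare sizes and current prices for the blue widget line.">
</head>

<!-- Page B: different title, different description - not a shared template string -->
<head>
  <title>How to Install a Blue Widget | Example Widget Shop</title>
  <meta name="description" content="Step-by-step installation guide for blue widgets, with photos.">
</head>
```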
If you have many links pointing to those link index pages, and from many different urls, maybe tone that number way down.
| 3:16 am on Aug 24, 2007 (gmt 0)|
I believe one of the primary reasons there was a huge influx of sites "going supplemental" during Big Daddy is a combination of two things: Google devaluing fake links more aggressively, so that manipulative links no longer pass as much PageRank as they used to, and Google's crawling becoming more dependent on PageRank.
Some low/medium TBPR domains that are on the crawl fringe might see pages go in and out of the main index, especially with an internal link structure that places heavy emphasis on home page/second-tier pages.
If a site has excessive manipulative links (cross links within your own network, link swaps, obvious paid links, manipulative outbounds, etc.), it can see a drop in inbound PageRank, and that in turn can cause URLs to "go" supplemental.
And if your deep pages are largely static, that staleness can contribute to your problem.
Like tedster said, the site: search is an approximation and isn't something I'd take at face value.
Don't worry too much about supplemental results. If you focus most of your efforts on marketing, that should improve your traffic and pull some of your pages out of the supplemental index via an increase in organic backlinks.
| 9:29 pm on Aug 24, 2007 (gmt 0)|
Thanks. The site has dynamic pages at this point, and I have made sure all metas are different; perhaps they are not different enough. I'm waiting for some internal pages to get cached again to see if the new content helps.
I'd rather not go the nofollow route either, but there are some much weaker sites/pages ranking above me where I used to dominate, so it is definitely tough to figure out the problem. And yes, traffic is way down for those pages, considering they went from page 1 to page 8. I wish there was a tool to let you know where a problem may exist, but that's the game, I guess.
| 4:50 pm on Aug 27, 2007 (gmt 0)|
Isn't <meta name="robots" content="none"> on the target page preferred to nofollow?
| 5:46 pm on Aug 27, 2007 (gmt 0)|
" Isn't <meta name="robots" content="none"> on the target page preferred to nofollow? "
I'm not sure, but I'd like to know as well. Any thoughts?
| 7:44 pm on Aug 27, 2007 (gmt 0)|
Yes, the meta noindex tag is much better. It stops anything about that page being indexed. It cannot even appear as a URL-only entry.
The nofollow attribute on links is all about saying "Do not trust the URL this link points to". I feel that to do that to your own pages is not a good idea.
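For reference, the two mechanisms being compared look like this (the example.com URL is a placeholder). The meta tag goes on the target page itself and keeps it out of the index; the nofollow attribute goes on an individual link and only disclaims that one link:

```html
<!-- On the target page's <head>: keep the page out of the index entirely.
     content="none" is shorthand for "noindex, nofollow". -->
<meta name="robots" content="noindex">

<!-- On a link elsewhere on the site: tells Google not to vouch for,
     or pass PageRank through, this particular link. -->
<a href="http://www.example.com/link-directory/" rel="nofollow">Link Directory</a>
```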