Google's representative Matt Cutts explained this situation on his blog earlier this week (search for 'Matt Cutts Blog'). It seems to be a result of the new Big Daddy infrastructure, where they deliberately index more than they think they'll use. Why is a little confusing, but he tries to explain it as necessary.
More important is what this means for your site. Following his ideas, it probably means that your site is right on the fringe of getting spidered. It's got just enough good links to see Googlebot once in a while, but not enough to earn a spot in the index. If so, the solution is to get more good incoming links.
I can't guarantee this is your issue, but it sounds likely.
While waiting for your reply, I kept searching for an answer to my question.
And I found some information... (from the end of 2005.)
Well, it's not officially declared, but Google also considers an aging factor; set the timing aside for now. Google crawls every page on your website that a bot can see and stores them in a repository. How often Google re-indexes a page then depends on PR, age, and an extrapolation algorithm (used to estimate how often the page will change in the future). But Google has said, "we fully traverse our repository once in a pre-defined time" — which is about 60 days!
By the way, Google is now maintaining a supplemental index, which is probably bigger than the active index.
Is that still true now?
If so, the solution is to get more good incoming links.
What kind of poo is this?
I've been away from WebmasterWorld and MC's blog for a while, because I saw my pages coming back and figured I was OK after all. Now I'm right back where I started.
SORRY... I was busy building quality content: articles, listings, reviews, OBL to resources, etc. While I was at it, I was improving the design so folks could find their way around more easily.
I come back to find that I'm back down to 300+ pages indexed out of 1,800+.