gethan - 10:16 am on Jun 2, 2010 (gmt 0)
Gbot is still re-indexing larger sites from scratch, as I noted weeks ago. Hence many sites are still showing under-ranking because of that: not enough indexed pages, and the pages linking to them are also being re-indexed, which means a lower ranking factor until everything is re-indexed. What I called the Total-Recall is basically a recount, re-index, re-rank.
Absolutely agree, dusky - I operate a large site that has seen a huge drop-off in indexed pages, from ~5M to ~500K (at least according to site:).
I noticed today that WebmasterWorld went from ~700K to ~3.5M - I've only started using it as a control site during this period, but from memory that seems to be a historically average number for it.
MC hints that the algo is now more sensitive to long-tail terms... but I'd also think that for large sites, the pages optimised for long-tail terms are further down in the tree, so long-tail traffic would be lower for two reasons.
Maybe some of the long-tail traffic that we seem to have lost will come back as what appears to be a growing inclusion count continues.