I often try to get my head round what actually happens to those pages that disappear for days to weeks at a time and then come back stable. I'm not talking about the constant appear/reappear observation. My case relates to totally stable pages that suddenly disappear for days to weeks, then come back totally stable. Often it follows just minor changes, but sometimes no changes at all.

Where the flux of some pages can easily be nailed down to them sitting on a moving threshold, the case I'm describing is quite different. It's almost as though there is a totally separate index for dubious pages. Pages seem to move from the main index to what I call the scrubbing index, where they seem to undergo forensic inspection.

If true, why would a separate index be needed? In my musings I hypothesise that in this separate index Google does some sort of relationship analysis across the whole dataset. In some way I wonder if they need to separate good and bad pages to create a clinical environment in which the deep scrubbing/cleansing can take place.

Just my thoughts. Without Google I'd need to get a real life!