>>Think it was a glitch WG?
It looks to me like a glitch. It's just too widespread, and it has hit far too many sites for me to think it's some kind of intentional spam filter.
If it is a spam filter, what would be the point of removing only the pages' PR while keeping the content in the database? Every site I've been watching that has taken the zero-pagerank hit ends up back in the database the following update.
I just can't see the logic in a spam filter that only kicks out offending pages every other month. (Unless of course, the real purpose of the filter is to simply rotate results).
I think that server errors may be playing a role. I spent some time digging through the error logs of a site that has been having this problem. I found that just about every other month, Googlebot has been showing up and making numerous requests for a file that hasn't existed on this particular site in almost three years. The file was an old duplicate index page that had been built by a previous SEO firm.
On each request for the page, Googlebot was given a standard 404. Like clockwork, the update following the crawl containing all the 404's for the old page resulted in zero pagerank for the entire site.
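For anyone who wants to check their own logs for the same pattern, here's a rough sketch of pulling Googlebot's 404s out of a standard Apache combined-format log. The sample log lines and file paths below are made up for illustration; adapt the matching to your own log format:

```python
import re

# Hypothetical sample lines in Apache combined-log format (IPs and paths invented).
LOG_LINES = [
    '66.249.66.1 - - [12/Oct/2003:06:25:24 +0000] "GET /old-index.html HTTP/1.0" 404 209 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"',
    '192.0.2.7 - - [12/Oct/2003:06:26:01 +0000] "GET /index.html HTTP/1.0" 200 5120 "-" "Mozilla/4.0"',
    '66.249.66.1 - - [13/Oct/2003:02:10:09 +0000] "GET /old-index.html HTTP/1.0" 404 209 "-" "Googlebot/2.1 (+http://www.googlebot.com/bot.html)"',
]

# Pulls the requested path and the status code out of a combined-format line.
LINE_RE = re.compile(r'"(?:GET|HEAD) (\S+) [^"]*" (\d{3})')

def googlebot_404s(lines):
    """Count, per path, how many Googlebot requests came back 404."""
    hits = {}
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LINE_RE.search(line)
        if m and m.group(2) == "404":
            path = m.group(1)
            hits[path] = hits.get(path, 0) + 1
    return hits

print(googlebot_404s(LOG_LINES))  # prints {'/old-index.html': 2}
```

If the same dead URL keeps showing up month after month, that's the pattern described above.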
Apparently, somewhere on the web, there are a couple of pages that still link to the old index page. Because these pages are so old, Googlebot doesn't come across the links in every crawl, but when it does, it adds the URL to the queue and makes several attempts throughout the month to retrieve it.
When it finally gives up, the PR for all the pages is dropped, but the content from the previous crawl remains in the database, and Googlebot shows up again the following month.
Now, temporarily removing a site that returns a high number of 404's in a given crawl may make sense, but I think that some of the new things they've been toying with may have caused Googlebot to collect a much higher than normal number of server errors.
Since most hosting companies don't provide access to server error logs, many webmasters have no way to pinpoint the problem, which makes the whole thing that much harder to diagnose.