PageRank is calculated for all pages indexed by Google, and the calculation is iterative.
How many iterations does it take? Only 5 to 7. Google is interested in ranking pages, not in computing the exact PageRank value of every page. It has been shown that after 5 to 7 iterations the relative rankings of pages no longer interchange. That is, the calculated value of the #1 result will not drop below the calculated value of the #2 result. Sites do not rank as #1.2704 or #2.4370; they rank at integer positions: 1, 2, 3, 4 and so on. So as long as the PR values are consistent with the ranking order, the calculation is complete.
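To see what "the order stops changing long before the values do" looks like, here is a minimal sketch of the power iteration on a tiny made-up four-page link graph (the pages A-D and their links are purely hypothetical, and this is the simplified PageRank formula, not Google's actual implementation):

```python
# Power-iteration sketch of PageRank on a hypothetical 4-page link graph.
# The point: the computed values keep drifting for many iterations, but
# the *ranking order* settles after only a handful -- which is all a
# search engine needs to order results.

DAMPING = 0.85  # the standard damping factor from the original paper

# links[page] = pages that `page` links out to (made-up toy graph)
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],  # nobody links to D, so it stays at the minimum PR
}

def pagerank_step(pr):
    """One iteration of the simplified PageRank update."""
    n = len(links)
    # every page gets the (1 - d)/n baseline
    new_pr = {page: (1 - DAMPING) / n for page in links}
    for page, outlinks in links.items():
        share = pr[page] / len(outlinks)  # each outlink gets an equal share
        for target in outlinks:
            new_pr[target] += DAMPING * share
    return new_pr

pr = {page: 1.0 / len(links) for page in links}  # uniform starting values
orders = []  # ranking order (best first) recorded after each iteration
for _ in range(50):
    pr = pagerank_step(pr)
    orders.append(sorted(pr, key=pr.get, reverse=True))

# Find the first iteration after which the order never changes again.
first_stable = next(
    i for i, order in enumerate(orders)
    if all(order == later for later in orders[i:])
)
print("order stable from iteration", first_stable + 1, ":", orders[-1])
# prints: order stable from iteration 6 : ['C', 'A', 'B', 'D']
```

Even on this toy graph the ordering locks in within the 5-7 iteration window described above, while the underlying values are still visibly converging for dozens more iterations.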
You can find the original explanation of PageRank in The Anatomy of a Large-Scale Hypertextual Web Search Engine [www-db.stanford.edu]. See Section 2 for the explanation of PR.
The fact that Google Toolbar PR is updated only once every 3 months or so should not be confused with the internal rate of PR calculation, which is now virtually continuous. The sandbox is not associated with toolbar PR.
tedster, thank you so much for that brief, refreshing history of google's churn these past few years.
You left out a couple of things that should be considered. For instance, when the sandbox began in late February 2004, the number of pages indexed on the Google home page froze at 2^32, or about 4.3 billion pages, for about 9 months, until sites began to be released from the sandbox en masse and the number of pages indexed doubled.
It was about the same time that the Supplemental Index came into being, something else that happened in the aftermath of the Florida update in November 2003.
We had evidence two years ago that there were multiple indexes. I believe that Google ran into scaling problems with its index and had to shunt some pages (it actually worked by domain) to a second level of indexes. The "sandbox" is an emergent property of Google dealing with that index limitation problem: the creation of multiple indexes, and the algorithms used to delegate pages (domains) to those new indexes.
Around February or March of last year (2005), I was expecting an announcement from Google that they had created and released a new index, and the end of the sandbox as all indexes were re-integrated into the master index.
Are there not new datacenters that are being called "Big Daddy"? What is that about?
I've noticed in the past couple of days that a LOT of sites are coming out of the sandbox. Though I do not yet see a huge "I'm out of the sandbox" thread, I would not be surprised to see one.
IMHO Google is right now in the process of rolling out a new index, Big Daddy, and the multiple indexes created over the past 2.25 years are all being re-integrated. As a result, the sandbox phenomenon will disappear, since it was an emergent property of the algorithms that tiered sites among the various indexes.
There is a related thread that interested members may want to read: Major Change in Supplemental Result Handling [webmasterworld.com]
There's also major commotion going on in Pages Dropping Out of Big Daddy Index [webmasterworld.com]
If Google were in the process of integrating these cross-calculated indexes into a single index, then I would expect symptoms similar to what is going on now. As a result of all this, I would also predict the demise of the sandbox, and indeed I am seeing a lot of sites come out of it.