The PageRank calculation is iterative - performed over the entire web graph again and again until the numbers stop changing in any significant way (the "damping factor" is what guarantees this convergence). This alone makes it very difficult to replicate on any smaller, more local scale. Then throw in the fact that the exact math has been changed, but we don't know how.
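To make the iteration concrete, here is a minimal sketch of the classic published power-iteration method on a toy three-page graph. This is the textbook formulation only - the damping value of 0.85 and the stopping tolerance are conventional assumptions, not whatever Google actually runs.

```python
DAMPING = 0.85     # conventional damping factor from the original paper
TOLERANCE = 1e-6   # assumed "no significant change" threshold

def pagerank(graph, damping=DAMPING, tol=TOLERANCE, max_iters=100):
    """graph: dict mapping each page to the list of pages it links to."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}   # start with uniform scores
    for _ in range(max_iters):
        new_rank = {}
        for p in pages:
            # Sum contributions from every page that links to p,
            # each split evenly across that page's outbound links.
            incoming = sum(rank[q] / len(graph[q])
                           for q in pages if p in graph[q])
            new_rank[p] = (1 - damping) / n + damping * incoming
        # Stop once the numbers stop changing in any significant way.
        if max(abs(new_rank[p] - rank[p]) for p in pages) < tol:
            return new_rank
        rank = new_rank
    return rank

toy_graph = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
ranks = pagerank(toy_graph)
```

Even on three pages this needs repeated full passes; scaled to the whole web graph, each pass is an enormous computation, which is the point being made above.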
I think that makes it difficult to test - very difficult. Anchor text is not a factor in PageRank, so you can throw out that variable. But there are many other factors that are quite uncertain - for example, the artificial bump supposedly given to "Mom & Pop" sites.
There was also some talk out of Google (around the Big Daddy change, I think) that a new mathematical formula for PR was introduced to allow faster, continual calculations. From what I recall, this new approach didn't require actual iteration over the entire web graph, but instead gave a very good approximation without such resource-intensive number crunching. Some kind of phase space vector math, I assumed at the time.
More speculation about that - the new "approximation" method could also require occasional full-blown calculations, in order to correct for the numerical drift that the approximation introduces.
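Purely as an illustration of that speculation, the shape of such a scheme might look like this: cheap local updates for pages whose inlinks changed, with an occasional full pass to wash out accumulated drift. Every function name and the "cheap update" rule here are hypothetical - nothing below is known to resemble Google's actual method.

```python
def full_pagerank(graph, damping=0.85, iters=50):
    # The expensive step: full iterative passes over the whole graph.
    n = len(graph)
    rank = {p: 1.0 / n for p in graph}
    for _ in range(iters):
        rank = {p: (1 - damping) / n + damping * sum(
                    rank[q] / len(graph[q]) for q in graph if p in graph[q])
                for p in graph}
    return rank

def cheap_update(rank, graph, changed, damping=0.85):
    # Hypothetical shortcut: refresh only pages whose inlinks changed,
    # reusing stale scores for everything else - fast, but error
    # ("numerical drift") accumulates over many such updates.
    n = len(graph)
    for p in changed:
        rank[p] = (1 - damping) / n + damping * sum(
            rank[q] / len(graph[q]) for q in graph if p in graph[q])
    return rank

graph = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
rank = full_pagerank(graph)
graph["c"] = ["a", "b"]                     # the web changes...
rank = cheap_update(rank, graph, {"b"})     # ...so patch the affected page
rank = full_pagerank(graph)                 # occasional full pass corrects drift
```

The trade-off is the one described above: continual cheap updates keep scores fresh between the resource-intensive full recalculations.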