When Google does these updates, I was under the impression that they were adding or adjusting filters rather than "rerunning" the data. Are we certain they are rerunning the data?
We, the public, don't know. But with each Panda update, new sites get caught, so either G targeted different sites or added new signals to the algo. Then in the next update, some of the newly hit sites recover, either because they fixed their issues or because G removed that filter. We simply don't know which.
But it would be interesting and revealing to know why a site escaped Panda 1.0 and was then caught in 2.1.