For Panda, my guess would be that they are building a preliminary list of false positives. But that will not be followed immediately by an exception list. Instead, they will modify the algorithm so that kind of site doesn't get caught by it. Matt Cutts did say recently that site quality will be Google's focus for this entire year.
I'm remembering an incident from 2009 where a site owner reported a ranking problem on the Google Webmaster Help forums. In the end, JohnMu found that the part of the algorithm that simulates the visual page was flagging his site as having "too much whitespace" - a trait that spam sites apparently shared.
John said he "set a flag" so that any time the algorithm picked out the site for that issue in the future, it would let it pass but trigger a human review instead. I think that's an example of how Google will whitelist a site - only for some specific algorithmic property, rather than giving it a free pass to do whatever it wants.