My only site that's been impacted by Panda is also a numerical data site -- basically, I take a bunch of government data from different agencies and mash it up to tell a story about a particular social problem. This isn't automated stuff; I put a lot of thought into each individual page.
When Panda hit, all of my pages looked like this:
2-3 paragraphs of unique introductory text, followed by a series of charts.
Many pages do have some data overlap. For instance, I might show how one social issue affects every state, so there's a table listing every state with a numerical value for each. This *could* make my pages look very similar to one another (even though they're very different from a user's point of view).
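Here's a rough way to sanity-check that theory. This is only a sketch -- I obviously don't know what Google actually computes, and the sample strings below are invented -- but shingle overlap is a standard duplicate-detection technique, and it shows how a shared table can swamp a short unique intro:

```python
# Back-of-the-envelope duplicate check: compare two pages' text using
# overlapping word n-grams ("shingles"). The page text is made up to
# mimic two of my pages that share the same state table.

def shingles(text, n=3):
    """Split text into overlapping n-word chunks."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(a, b):
    """Jaccard overlap of shingle sets: 1.0 = identical, 0.0 = disjoint."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa or sb else 0.0

# Fake table text shared by both pages (a state table flattens to this):
table = "Alabama 4.9 Alaska 0.7 Arizona 6.6 Arkansas 3.0 California 39.5"

page_a = "A short unique intro about issue A. " + table
page_b = "A different short intro about issue B. " + table

print(round(similarity(page_a, page_b), 2))
# prints 0.43 -- substantial overlap, almost all of it from the shared table
```

On a real page, where the table is 50 rows and the intro is only 2-3 paragraphs, that score would climb a lot higher.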
Do you think that's a realistic cause? Or am I reaching here?
So far, my recovery efforts have gone into expanding the 2-3 paragraphs of introductory text into something substantially longer. But I might try removing the noscript tags if they could be hurting me.
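Before ripping the noscript blocks out, though, I'd want to measure how much of each page's text actually lives inside them. A quick stdlib-only sketch (the sample markup is made up to mimic one of my chart pages):

```python
# Estimate what share of a page's visible text sits inside <noscript>
# fallbacks. Uses only the standard library.
from html.parser import HTMLParser

class NoscriptShare(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0    # nesting level of <noscript> tags
        self.inside = 0   # characters of text inside noscript
        self.total = 0    # characters of text overall

    def handle_starttag(self, tag, attrs):
        if tag == "noscript":
            self.depth += 1

    def handle_endtag(self, tag):
        if tag == "noscript" and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        n = len(data.strip())
        self.total += n
        if self.depth:
            self.inside += n

# Invented markup standing in for one of my pages:
sample = """
<p>Unique intro paragraph about the issue.</p>
<noscript><table><tr><td>Alabama</td><td>4.9</td></tr>
<tr><td>Alaska</td><td>0.7</td></tr></table></noscript>
"""

p = NoscriptShare()
p.feed(sample)
print(f"{p.inside / p.total:.0%} of text is noscript fallback")
# prints "33%" for this toy sample
```

If the number came back high across the site, that would tell me the noscript fallbacks are a big chunk of what Panda is actually reading.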