I think that, in effect (not necessarily as a goal), there were two aspects to Panda:
1) Negative signals (thin content and the other sins that have been written about by every SEO guru).
2) Positive signals (massive numbers of inbound links, a minimum PR threshold, or whatever) that insulate megasites, including the larger UGC sites, from Panda's effects.
Under this scenario, small and medium-sized sites with negative signals would get hurt by Panda, while sites with positive signals--i.e., the megasites--would float higher in the rankings, if only because other sites were dropping.
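Just to make the hypothesis concrete, here's a toy sketch of that two-signal idea. Everything in it is made up for illustration--the signal names, thresholds, and scoring are my guesses, not anything Google has published:

```python
# Purely illustrative toy model of the two-signal hypothesis above.
# Signal names and thresholds are invented, not real Panda factors.

def panda_adjustment(thin_content_ratio: float,
                     inbound_links: int,
                     pagerank: int) -> float:
    """Return a hypothetical ranking adjustment (negative = demotion)."""
    has_negative_signals = thin_content_ratio > 0.5          # thin content etc.
    is_insulated = inbound_links > 100_000 or pagerank >= 7  # megasite-style positives
    if has_negative_signals and not is_insulated:
        return -1.0   # small/medium site with negative signals gets hurt
    return 0.0        # insulated megasites are untouched and float up as others drop

# Example: a mid-sized site with lots of thin pages and modest links gets demoted
print(panda_adjustment(thin_content_ratio=0.7, inbound_links=2_000, pagerank=4))  # -1.0
```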
For the search terms that I watch, some of the megasites were slipping back to more reasonable levels even before this latest Panda update. (In one search that I do fairly often, Wikipedia and one of the big UGC sites have been sitting at positions six and seven respectively for a while, with the #1 result being an About.com page and two of the other top-five results being extremely thin pages on EMD sites. So far, the only effect of the latest Panda update has been to push the "official" page for the query down to position #10.)
My gut feeling:
Google may have been trying to reduce the "megasite effect" in its SERPs (at least for informational queries) even before the current update, but it's having trouble plucking the good stuff from what's left over after Wikipedia, TripAdvisor, etc. have sunk lower in the results. Author reputation and authority ("AuthorRank") could help with this, but it could be a long time before Google has AuthorRank figured out. (Until then, Google might want to dial down the weight that it gives EMDs.)