2. Disagree - lots of small sites have done well under Panda
3. From memory, G did a separate update that dealt with how backlinks were treated (Feb 16th? Feb 9th? Something like that, see the archives). It came before Panda anyway, and I think people are conflating the two updates when they shouldn't be. (Also, they may have done further backlink updates since, which we've missed because everything is now ascribed to Panda automatically!)
4. Didn't need to add new pages/content; just needed to fix existing content to get out of Panda
5. Not sure about this one either
My feeling about Panda was that they were looking at ratios: the ratio of ad space to content, the ratio of internal links supporting a page to external links, the percentage of good pages on a site, plus a bunch of other metrics. Once you improved past a certain threshold, you came back.
Also, Panda seems to me to be keen on "basics". Sloppy coding where you accidentally put the same meta description on all pages was tolerated before, but not after. Pages with a lot of spam in the comments were tolerated before, but not after. Tag pages that duplicated category pages were tolerated before, but not after. Other people have mentioned slow loading hurting their sites. I don't know why this is; perhaps when their original raters seeded the algo, the good sites they found were not sloppy or careless, and that got built in as a benchmark.
Also, the long click. One thing I did was go back through my analytics to see what people were searching for before Panda, then add that information in, so that if the same search was performed again, visitors wouldn't hit the back button because what they were looking for wasn't there. I ended up adding a lot of info to existing pages; they changed substantially.
I don't know whether it was this that helped or sorting out the sloppy stuff, but I was hurt by Panda 2.1, came back with Panda 2.2, and have been stable since.