AlyssaS - 12:01 pm on Apr 22, 2011 (gmt 0)
I think it's clear that Panda applies a two-part filter:
(1) All your #*$!ty pages get hammered.
(2) Your main pages, usually the not #*$!ty ones, get hammered too.
I think the transmission mechanism from (1) to (2) is via internal links.
In the past, if a page got filtered, it lost its rankings but not its link juice. Now it appears that the link juice it passes diminishes too, so if you have too many of these weak pages, they start to topple everything else as the links supporting the other pages weaken.
To me this is the reason Hubpages got hurt while Squidoo didn't. Squidoo has never had much internal linking, so a bad page was isolated; other pages never relied on links from it to rank, and were therefore unaffected.
Of course all pages on a site link back to the homepage, so if you have a lot of bad pages with weakened link juice, the homepage should start to topple too, as the stuff that previously supported it gets taken out.
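To picture the idea, here's a back-of-the-envelope toy, not anything Google has published: a PageRank-style iteration where filtered pages pass only a fraction of their outbound weight. The graph, the `penalty` factor, and the `rank` function are all made up for illustration.

```python
def rank(links, filtered, penalty=0.2, damping=0.85, iters=50):
    """Toy PageRank. links: {page: [pages it links to]}.
    Pages in `filtered` pass only `penalty` of their normal
    outbound weight -- the hypothetical 'diminished link juice'."""
    pages = list(links)
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if not outs:
                continue
            share = score[p] / len(outs)
            if p in filtered:
                share *= penalty  # hammered page passes less juice
            for q in outs:
                new[q] += damping * share
        score = new
    return score

# Hubpages-style shape: thin pages a, b, c all link back to the homepage.
links = {"home": ["a", "b", "c"],
         "a": ["home"], "b": ["home"], "c": ["home"]}

clean = rank(links, filtered=set())          # before Panda
hit = rank(links, filtered={"a", "b", "c"})  # thin pages filtered
print(clean["home"], hit["home"])
```

Run it and the homepage score comes out lower once its supporting pages are filtered, even though the homepage itself was never touched - which is the toppling effect described above.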
Edited to add: according to Quantcast [quantcast.com], Squidoo's traffic is now higher than it was during Jan and Feb - even though they attract the same sort of spam that Hubpages attracts. The difference must be that their site structure isolates the bad pages so they can't contaminate the good ones.