I have a site of about 4500 pages ... for six years it was indexed at a steady 4500 pages. Then it got hit by Panda, and shortly afterward the page count climbed to a crazy 30K+ pages. I knew Google was making many changes, so I waited for the page count to get more reasonable. It didn't. I added canonical headers to all my pages -- they were ignored. Suddenly, Google became completely incapable of handling parameters intelligently on my site. I blocked all the new "bad" pages with robots.txt -- they stayed in the index. I went to Webmaster Tools and removed all the ones I could find using the removal tool. The removal requests were approved. The pages stayed in the index.
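For anyone who wants specifics, here is the general shape of what I added. The domain, path, and parameter names below are placeholders, not my real URLs:

    Canonical, either as an HTML link element in the head or as an HTTP response header:

        <!-- points every parameterized variant back at the one real page -->
        <link rel="canonical" href="http://www.example.com/widgets/blue" />
        Link: <http://www.example.com/widgets/blue>; rel="canonical"

    robots.txt (Googlebot honors the * wildcard in Disallow patterns):

        User-agent: *
        # block the parameterized duplicates of real pages
        Disallow: /*?sessionid=
        Disallow: /*?sort=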
Every so often, when I'm being very energetic about getting invalid pages removed, the page count will sink for a while, then balloon right back to 30K+. Before Panda I never had any trouble with "dumb interpretation" of my pages, and I never had any trouble getting robots.txt honored, much less the removal tool. It is as if all these tools simply no longer exist as far as my site is concerned. I would like to know if anyone else has experienced this after being hit by Panda; it seems to be far from universal. And I would especially like to know if anyone has found the SOLUTION to these inflated, duplicate indexed pages after running into this strange problem. Thanks for reading.