AG4Life - 5:29 pm on Jul 6, 2011 (gmt 0)
Have you blocked the dupe pages using robots.txt as well? If so, that could be a problem, since Googlebot won't actually be able to reach these pages and read the "noindex" tag, and so these pages could stay in the index for a very long time.
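You can actually see this crawl-blocking effect with Python's built-in `urllib.robotparser`. A quick sketch (the `/dupes/` path and example.com URL are just placeholders, not anything from this thread):

```python
from urllib import robotparser

# Simulate a robots.txt that blocks the dupe pages (hypothetical /dupes/ path)
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /dupes/",
])

# Googlebot isn't allowed to fetch the page, so it can never
# reach the "noindex" tag inside it:
print(rp.can_fetch("Googlebot", "https://example.com/dupes/page.html"))  # False

# With no Disallow rule, the page is crawlable and the tag can be seen:
rp2 = robotparser.RobotFileParser()
rp2.parse(["User-agent: *"])
print(rp2.can_fetch("Googlebot", "https://example.com/dupes/page.html"))  # True
```

That `False` is the whole problem: a blocked page's noindex tag is invisible to the crawler.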
It's best to just leave the noindex in place, *not* block in robots.txt, and let Googlebot do the job, however long it takes. Basically, only use robots.txt for new pages that you never want indexed (I'd put the noindex tag on those pages too, just in case), but don't use it for existing indexed pages that you want removed.
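For the pages you want gone, the removal signal is just the standard robots meta tag in the page head (and no matching Disallow line in robots.txt, so Googlebot can actually fetch the page and see it):

```html
<!-- In the <head> of each page you want de-indexed -->
<meta name="robots" content="noindex">
```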
Unfortunately, I've found that Google has been quite slow lately in removing noindexed and newly 404'd pages. I've had some 404s in place since late April, and some are still in the index, as are a few nofollows that were added in mid-May. For some reason, de-indexing seems to have stopped or slowed around the end of May/beginning of June, for my site anyway (despite two crawl spikes showing in WMT since then).