Did the drop in rankings equate to a drop in ROI?
Can you describe what you are considering duplicate content?
The only people who know about Supplemental pages are us. The average consumer has no idea what the Supplemental index is, nor should they.
The simplest and most effective solution is to remove those pages and serve a 410 Gone.
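If the site runs on Apache, a minimal sketch of serving the 410 might look like this (the paths are hypothetical, substitute your own duplicate URLs):

```apache
# Hypothetical example: return 410 Gone for removed duplicate pages.
# mod_alias handles both forms; adjust the paths to match your site.

# A single removed page:
Redirect gone /old-duplicate-page.html

# A whole directory of duplicates, matched by pattern:
RedirectMatch gone ^/printer-friendly/.*$
```

A 410 tells Google the content is gone on purpose, so those URLs tend to drop out of the index faster than they would with a plain 404.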
You could also do what I do in most instances like this and drop a robots directive in the <head></head> of those pages...
<meta name="robots" content="none">
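For what it's worth, "none" is just shorthand for the two directives combined, so this is equivalent if you prefer to be explicit:

```html
<!-- Equivalent to content="none": keep the page out of the index
     and tell the bot not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```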
Just cover your butt with regard to that content getting indexed again. Google probably indexed everything the first time around, which could have been months ago, and now it is purging duplicates found through its natural process.
I'm not too certain the "blank page method" is in your best interest, for a couple of reasons: for one, it creates a technical nightmare, and two, the maintenance just isn't worth it.
P.S. Don't block that stuff in the robots.txt file. Google already knows about those URLs, so now you have to deal with them at the page level. If you block via robots.txt and also use the meta robots element, it won't work: Google can no longer crawl the pages, so it never sees the meta robots directive. Worse, Google will still list those robots.txt-blocked entries as URI-only results in site: searches. The average consumer won't see those unless they are doing site: searches themselves, which few do.