I was given a portal that was doing poorly in Google, despite its decent link popularity. Alexa rank in the 10k range. Tens of thousands of pages. (No, the Alexa rank wasn't being gamed :-))
It was a Herculean task getting my head around the overall site structure and navigation. Digging deeper, I ascertained that only about 10% of the pages are useful. The remaining 90% are duplicate pages, search result pages, various forms, print-version pages, test folders, and so on. The next task was getting the non-useful pages out of the index: blocking them via robots.txt, 404ing duplicate pages, setting URL parameters in WMT (there are about 40 different types of URL parameters!). Overall, cleaning up the mess. I told the client it would take time for Google to notice and readjust its index.
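For illustration, the robots.txt part of that cleanup might look something like the sketch below. The directory names here are hypothetical, not the portal's actual paths:

```
# Hypothetical paths, for illustration only
User-agent: *
Disallow: /search/        # internal search result pages
Disallow: /print/         # print-version duplicates
Disallow: /test/          # leftover test folders
Disallow: /*?sort=        # one of the many URL-parameter variants
```

Note that robots.txt only blocks crawling; pages already indexed may linger until Google recrawls and drops them, which is part of why the readjustment takes time.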
The client asked: how about deindexing the whole site, making sure it is completely out of Google's index, and then letting Google reindex it, so that only the useful 10% of pages end up indexed? With the URL removal tool that can be done in a matter of days. I searched for an answer and replied that it might be too drastic, but I'm not convinced by my own answer.
Has anyone ever done that? Can it be considered an option to speed up the process?