tedster - 8:17 pm on Aug 8, 2011 (gmt 0)
Let's start with the difference between crawling and indexing. I've never seen a site of any size that was 100% indexed - but Googlebot will often check out (crawl) all or almost all of the URLs. Then Google takes an algorithmic approach to deciding what will and won't be kept in the live index.
I'd say, in addition to getting those 301 redirects implemented ASAP, the best step you can take is submitting an XML Sitemap. And watch your Webmaster Tools reports like a hawk for signs of any technical errors, fixing them as fast as you can.
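If it helps, here's a minimal sketch of generating an XML Sitemap with Python's standard library - the URL list is just a made-up example, and for a large site you'd pull the URLs from your CMS or database instead:

```python
# Minimal XML Sitemap builder (sketch). The sitemaps.org namespace is
# required for Google to accept the file; the URLs below are placeholders.
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return an XML Sitemap string for the given list of absolute URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = loc
    # Sitemaps must be UTF-8; declaration added by the XML header below.
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(
        urlset, encoding="unicode"
    )

# Hypothetical URLs for illustration only.
sitemap_xml = build_sitemap(
    ["http://www.example.com/", "http://www.example.com/page-1"]
)
```

You'd save the output as sitemap.xml in your site root and submit it through Webmaster Tools. Note that a Sitemap is a hint, not a guarantee - it helps Google discover URLs, but it doesn't force anything into the index.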