g1smd - 7:13 pm on Aug 15, 2012 (gmt 0)
200 million pages
Ah, if you had said that up front, the thinking might have been a little different.
You have two real choices for the IP indexing:
One is to use robots.txt to disallow crawling for requests that arrive at the bare IP address. This needs very careful scripting to ensure the blocking robots.txt is never served for www or non-www hostname requests. I would rewrite requests for robots.txt to /robots.php, detect the requested hostname in the PHP script, and serve the right content from there (see the sketch below).
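A minimal sketch of that setup, assuming Apache with mod_rewrite; the filenames are illustrative, and it only handles IPv4 literals (bracketed IPv6 hosts would need extra handling).

In .htaccess:

    RewriteEngine On
    RewriteRule ^robots\.txt$ /robots.php [L]

In robots.php:

    <?php
    // Serve a blocking robots.txt only when the request
    // arrived at the bare IP, not a real hostname.
    header('Content-Type: text/plain');

    $host = isset($_SERVER['HTTP_HOST']) ? $_SERVER['HTTP_HOST'] : '';
    $host = preg_replace('/:\d+$/', '', $host); // strip any port

    if (filter_var($host, FILTER_VALIDATE_IP)) {
        // Request came in on the IP address: block all crawling.
        echo "User-agent: *\nDisallow: /\n";
    } else {
        // Normal hostname: serve the site's real rules.
        // ("robots-live.txt" is an illustrative filename.)
        readfile('robots-live.txt');
    }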
The other is to carry on serving the 301 redirect from the IP to the canonical hostname and let Google continue fixing their data that way.
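For completeness, a sketch of that redirect in .htaccess, again assuming Apache; the IP and hostname are placeholders for your own values:

    RewriteEngine On
    # Redirect bare-IP requests to the canonical hostname.
    RewriteCond %{HTTP_HOST} ^192\.0\.2\.10$
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]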