You have to be careful about how you do this. In Google's case, using a Disallow: in your robots.txt file is not the best method of keeping those pages out of the index.
My understanding is that you should allow the bot to access those pages but drop a Robots META Tag with a noindex or noindex,nofollow directive in the head of those pages, something like the snippet below. jdMorgan can offer more advice on that if he is reading this one. Or, you can search WebmasterWorld for previous topics on this issue.
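As a rough sketch (the exact pages and whether you want nofollow as well are up to you), the tag would sit in the head of each page you want kept out of the index:

<meta name="robots" content="noindex,nofollow">

And the key point is that those URLs should not also be blocked by a Disallow: line in robots.txt, because the bot has to be able to fetch the page in order to see the tag at all.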
Google will still index URI-only listings for Disallowed pages, and those will show up in certain advanced search queries. Typically it's not something seen by the browsing public. But, I'm sure all of those indexed URI-only listings account for a large portion of those 4 billion pages just added. ;)