bumpski - 9:10 pm on Apr 28, 2011 (gmt 0)
Using both is double protection, a technique I highly recommend.
I use both robots.txt and NOINDEX as a fail-safe, and it recently saved my bacon when I made a small mistake updating robots.txt. Without the redundancy, thousands of pages would have been indexed within days, and it takes forever to clean up that kind of mess. The redundant NOINDEX stopped Googlebot from making a mess in the first place.
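A minimal sketch of that double-protection setup, assuming a hypothetical /private/ section you want kept out of the index:

    # robots.txt -- keeps compliant crawlers away from the section
    User-agent: *
    Disallow: /private/

and, on every page under /private/, the redundant fail-safe:

    <meta name="robots" content="noindex">

Even if a robots.txt edit goes wrong, the NOINDEX still keeps those pages out of the index once they're crawled.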
Historically, at least, if a page is already indexed and you both block it in robots.txt and "noindex" it, the page may remain in the index for a long time. The robots.txt block stops Googlebot from recrawling the page, so it never sees the "noindex"; the only way Googlebot will see the "noindex" is if the page is NOT blocked by robots.txt.
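So to deindex a page that is already in the index, the robots.txt block has to come off while the tag does its work. Roughly, assuming the same hypothetical /private/ path:

    # robots.txt -- Disallow removed so Googlebot can recrawl the page
    User-agent: *
    Disallow:

with the noindex meta tag left in place on the page itself. Once Googlebot recrawls the page and sees the tag, the page drops from the index.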
So you're right, using both is powerful, as long as the page was not indexed in the first place.