I recently tried to use
<META NAME="robots" CONTENT="noindex">
on some of my pages and noticed that Googlebot still requests those pages. I'm guessing that in this case Googlebot fetches the page, sees the noindex, and says, "Ok look...no index. This one goes in the trash."
Would placing a directive inside the robots.txt file prevent googlebot from even trying to get the page?
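For context, here's the kind of robots.txt rule I mean (the /private/ path is just a made-up example for the pages I'd want excluded):

```
User-agent: Googlebot
Disallow: /private/
```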
And does telling Googlebot what not to fetch increase the chances of getting more of my other pages into the index?