The simple answer is "No." There is only a "Disallow" directive in the robots exclusion standard.
The complex answer is that you can move your pages into separate directory branches according to whether you want them spidered/indexed, and then set up robots.txt to disallow every branch except the one you want crawled.
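For example, with a hypothetical layout where the crawlable pages live under /public/ and everything else under /private/ and /internal/ (the names are only placeholders), robots.txt would disallow every branch but the public one:

```
User-agent: *
Disallow: /private/
Disallow: /internal/
```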
You could also use mod_rewrite or a similar mechanism to make the directory structure appear reorganized as above, while leaving the pages where they actually are.
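A minimal sketch of that idea for Apache, reusing the hypothetical /public/ prefix above; the URLs crawlers see live under /public/, while the files actually stay in a /content/ directory (also a placeholder):

```
# .htaccess sketch: serve /public/... URLs from the real /content/ tree,
# so the on-disk layout does not have to change.
RewriteEngine On
RewriteRule ^public/(.*)$ /content/$1 [L]
```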
Or, you could use a script to generate robots.txt, emitting a Disallow line for everything except the desired pages and saving you the work of maintaining the file by hand.
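As a sketch of the script approach, here is a minimal Python generator; the path list and allow set are hypothetical stand-ins for however your site enumerates its pages:

```python
#!/usr/bin/env python3
"""Sketch: generate robots.txt by disallowing every known page
that is not on the allow list (all paths are hypothetical)."""

# Hypothetical inventory of site paths and the ones to keep crawlable.
ALL_PATHS = ["/index.html", "/about.html", "/drafts/a.html", "/drafts/b.html"]
ALLOWED = {"/index.html", "/about.html"}

def generate_robots_txt(all_paths, allowed):
    lines = ["User-agent: *"]
    # Only Disallow exists in the standard, so list everything we do
    # NOT want crawled and leave the allowed pages unmentioned.
    lines += [f"Disallow: {p}" for p in all_paths if p not in allowed]
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    with open("robots.txt", "w") as f:
        f.write(generate_robots_txt(ALL_PATHS, ALLOWED))
```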
Depending on your site layout, these approaches might vary from easy to horribly complex or inefficient.