The easiest way would be to put all of your blocked pages into one directory and your good pages into another. Then a single line in your robots.txt file can block the entire bad directory.
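For example, if everything you want blocked lives under one directory (the /private/ name here is just a placeholder), one Disallow rule covers it all:

User-agent: *
Disallow: /private/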
If you do want to list individual pages in your robots.txt file, be careful that the file doesn't get too big. I once had a client whose robots.txt file ran to several hundred KB, and the spiders had a hard time reading it. Avoid extreme sizes and you'll be OK.
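The page-by-page approach means one Disallow line per page (these paths are made-up examples), which is exactly how the file balloons:

User-agent: *
Disallow: /old-page.html
Disallow: /temp-page.html
Disallow: /draft-page.html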