You either have a robots.txt file or you don't.
If you don't have one, your log files will fill with 404 errors as search engine spiders request it. So even if you have no wish to keep spiders out of any pages, it's a good idea to have a robots.txt file, even a blank one.
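As an illustration, a minimal robots.txt that allows all spiders to crawl everything (it must live at the root of the site, e.g. /robots.txt) could look like this:

```text
# robots.txt -- allow every robot to crawl the whole site
User-agent: *
Disallow:
```

An empty Disallow line means nothing is off-limits. To keep spiders out of a particular directory instead, you would list it, e.g. `Disallow: /private/` (a hypothetical path, shown only as an example).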
You can also control robots' behaviour by including a robots meta tag in each of your pages, but this is a separate mechanism, independent of robots.txt.
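The meta tag method works per page rather than per site. A sketch of what it looks like, placed in the page's head section:

```html
<!-- ask robots not to index this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```

Other standard values include `index` and `follow`; omitting the tag entirely is equivalent to allowing both.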
For more information on robots.txt and the robots meta tag, see the robots exclusion standard [robotstxt.org], and follow the links there.