Although a robots.txt file is not required, it is a good idea to have one. My site's robots.txt file is:
# robots.txt file, created 7/21/06
# For domain: [catanich.com...]
User-agent: *
# All robots may spider everything else on the domain
Disallow: */_vti_cnf/       # directories created by the dev tool (the * wildcard is a search-engine extension, not part of the original standard)
Disallow: /_common/         # common source code
Disallow: /_holdit/         # a junk folder
Disallow: /_private/
Disallow: /_ScriptLibrary/  # common script folder
Disallow: /rLinks/          # reciprocal link folder
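You can sanity-check rules like these with Python's standard-library robots.txt parser. Note that the stdlib parser implements the original exclusion standard and ignores wildcard patterns such as `*/_vti_cnf/` (wildcards are a search-engine extension), so this sketch checks only the literal-path rules:

```python
import urllib.robotparser

# A trimmed copy of the rules above (literal paths only; the stdlib
# parser does not understand the */_vti_cnf/ wildcard extension).
rules = """\
User-agent: *
Disallow: /_common/
Disallow: /_private/
Disallow: /rLinks/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Disallowed paths are reported as not fetchable for any robot.
print(rp.can_fetch("*", "/rLinks/partners.html"))  # False
print(rp.can_fetch("*", "/index.html"))            # True
```

The same `can_fetch` check works against a live site if you point the parser at your robots.txt URL with `set_url()` and call `read()`.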
But when I FTP my site up to the server, the "_vti_cnf/" directories are sent too. This means the search engines will index those directories as well (thank you, Python Site Map) and generate a great many errors in Google.
In addition, I don't want my source code directories to be indexed.
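A complementary fix is to keep the "_vti_cnf/" directories from reaching the server at all, by filtering them out before the FTP transfer. A minimal sketch, assuming the site lives in a local folder and you build the file list yourself before uploading (`list_uploadable` and the folder layout are illustrative, not part of any real tool):

```python
import os

# Folders that should never leave the local machine; adjust to taste.
EXCLUDED_DIRS = {"_vti_cnf"}

def list_uploadable(root):
    """Walk the local site tree, skipping excluded directories entirely."""
    files = []
    for dirpath, dirnames, filenames in os.walk(root):
        # Pruning dirnames in place stops os.walk from descending further.
        dirnames[:] = [d for d in dirnames if d not in EXCLUDED_DIRS]
        for name in filenames:
            files.append(os.path.relpath(os.path.join(dirpath, name), root))
    return files
```

The resulting list can then be fed to `ftplib` or any FTP client, so the dev-tool directories are never uploaded in the first place and the robots.txt wildcard rule becomes a backstop rather than the only defense.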
And just for information purposes: by adding "Disallow: /rLinks/" to the robots.txt, all my reciprocal links are blocked from being indexed, so there is no PageRank (PR) bleed.
So as you can see, there are real reasons to use a robots.txt file.