Try this [searchengineworld.com].
If you want to permit any robot to spider your entire site, you don't need a robots.txt file at all. Robots.txt is exclusionary: if no robots.txt is presented to spiders, they assume unfettered access. However, each spider that requests the missing robots.txt will generate a 404 in your logs.
To create an appropriate robots.txt, use a plain text editor (NOT a word processor!) and create a file that contains these two lines:

User-agent: *
Disallow:
The wildcard in the first line indicates that the subsequent lines apply to ANY spider. The second line "disallows nothing," so your entire site is spidered.
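If you want to sanity-check the behavior described above, Python's standard library ships a robots.txt parser. This is a minimal sketch (the example.com URL is just a placeholder) showing that the allow-all file permits any spider to fetch any path:

```python
from urllib.robotparser import RobotFileParser

# The two-line allow-all robots.txt described above
rules = "User-agent: *\nDisallow:"

parser = RobotFileParser()
parser.parse(rules.splitlines())

# An empty Disallow value excludes nothing, so every
# path is permitted to every user agent
print(parser.can_fetch("*", "http://example.com/any/page.html"))
print(parser.can_fetch("Googlebot", "http://example.com/"))
```

Both calls print True, confirming that the file grants unrestricted access rather than blocking anything.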
Upload the file to your site's root directory using your FTP program's ASCII transfer mode.
Not much to it really... :)