The User-agent: * line means the rules apply to every spider, so you are not blocking or banning any of them from accessing your site.
The Disallow lines list any directories which you do not want to be spidered.
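As an illustration, a robots.txt that applies to all spiders but keeps them out of a couple of directories (the directory names here are just made-up examples) would look like this:

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
```

Each Disallow line names one path prefix; anything not listed stays open to spiders.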
If you do not have a robots.txt file it will make no difference to your site, other than that every time a spider/robot visits and requests the file, the request fails and puts a 404 error in the server error log.
I did some research, and GoogleGuy said here once that it would be better to have an empty robots.txt, even if you don't want to disallow any robot. Don't ask me why - he just said it here, so I'll do it.
But what is an empty robots.txt? Does it mean nothing in the file, or is it this one here:
The "empty file" method is most useful for those who have difficulty uploading a robots.txt file for some reason. Sometimes, it's easier to just create a blank file on the server, and name it robots.txt.
The robots.txt code you posted to allow all robots is better.
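For reference, the standard "allow all" robots.txt (which is presumably what was posted) is just these two lines, with the Disallow value left empty:

```
User-agent: *
Disallow:
```

An empty Disallow means nothing is disallowed, so every spider may crawl everything.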
The purpose of the blank or "allow all" robots.txt is simply to prevent a large number of 404 errors in your logs, caused by robots requesting robots.txt and not finding it.