robots.txt goes in the document root of your domain, not the root of the server. It goes wherever the file www.yourdomain.com/index.html would go.
If there is no robots.txt, all spiders will feel welcome to index all of your files. The only bad effect of not having a robots.txt file is that your error log will have a lot of 404-Not Found errors cluttering it up as a result of spiders requesting robots.txt.
To avoid this, place a simple robots.txt file in your web root:
User-agent: *
Disallow:
With no text following "Disallow:" on the second line, this will welcome all robots to all files on your site.
You can then use this robots.txt checker [searchengineworld.com] on the WebmasterWorld sister site Search Engine World to help make sure that it is correct.
JdMorgan, I was trying to say the same thing, but you say it so much better -- and much more quickly. Anyway...
No, the robots.txt file does not have to be there. But you might save yourself some future questions ("Why are there so many file-not-found errors for robots.txt?") if you simply put a blank robots.txt file on the site.
Just open a file in Notepad or similar text editor and save it as robots.txt.
Where does it go? The exact location depends on your server, but in all cases it should be at the same level as your home page (index or default).
As you develop your site you might find instances where you would not want a robot indexing your pages. That's when you'll put the robots.txt to use.
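For example (the directory names here are only placeholders -- use whatever paths you actually want kept out of the indexes), a robots.txt that tells all robots to stay out of two directories would look like this:

User-agent: *
Disallow: /private/
Disallow: /cgi-bin/

Each Disallow line names one path prefix, and the "User-agent: *" line applies the rules to every robot that honors the standard.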
When you need to, do a "site search" above to learn more than you really want to know about robots.txt.