Forum Moderators: mack
Is this a good choice when I don't have any pages that I want hidden...
The purpose of robots.txt is to tell web crawlers (spiders) what content they are and are not allowed to crawl.
Your robots.txt should be a simple text file with the correct syntax used to allow or disallow spiders from your pages. If you do not have any content that you want disallowed from search engines, then I would say leave your robots.txt out; simply don't use one.
For information on writing your file, you might want to do a site search for robots.txt. There are quite a lot of good examples.
Mack.
Use robots.txt to disallow certain files or folders, like intermediate steps in a shopping cart, include files, and so on.
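For example, a minimal robots.txt along those lines might look like this (the folder names here are just placeholders; use whatever paths your site actually has):

```
User-agent: *
Disallow: /cart/checkout/
Disallow: /includes/
```

The file must sit at the root of the site (e.g. example.com/robots.txt) or crawlers won't find it.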
If there are only a few files to disallow, use the meta robots noindex tag on just those few files.
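That tag goes in the head section of each page you want kept out of the index, like so:

```html
<meta name="robots" content="noindex">
```

Note the difference: robots.txt stops compliant spiders from crawling a page at all, while the noindex tag lets them crawl it but asks them not to list it in results.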