i have searched for an answer to this, but no luck — perhaps because the discussions of robots.txt are always too technical for me.
in my logs "/robots.txt" comes up under "documents not found." my site is indexed, so i'm wondering what this means and if i should worry about it. thanks, and any comments must be at the "info for idiots" level.
In robots.txt you can give directives telling well-behaved bots which areas of your site they should not visit. You can also suggest that certain robots should not read and index your pages at all. Bots from respected owners, e.g. Googlebot, will read this file and follow your restrictions; others will not.
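To make that concrete, here is a sketch of what those directives look like. The folder name `/private/` and the bot name `BadBot` are just placeholders for illustration, not anything specific to your site:

```
# Applies to all robots: stay out of one folder
User-agent: *
Disallow: /private/

# Applies to one specific (hypothetical) robot: stay out entirely
User-agent: BadBot
Disallow: /
```

Each `User-agent` line names which robot the following `Disallow` lines apply to, and `*` means "all robots."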
You do not need to have such a file if you have no guidelines for the robots visiting your site. Not having this file does not exclude you from search engines.
No. You must create a plain text file called "robots.txt" on your server, in the same directory as your index page. It can be blank (just to stop the 404s you see), or it can contain directives such as the ones you see in the page at wilderness' link.
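If all you want is to stop the 404 entries while leaving everything open to robots, the conventional "allow everything" file looks like this (an empty `Disallow` value means nothing is blocked):

```
User-agent: *
Disallow:
```

A completely empty file works too; this version just makes the intent explicit.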
tyrojds, the link I provided previously to the Webmaster World robots.txt: you can open it in your browser, select SAVE AS, and save it in your website's root folder.
Then open it with any text editor and remove the top portion, which belongs to Webmaster World.
BTW, you might also want to look at changing the closing lines for folder exclusions, since your site structure likely doesn't match Webmaster World's. Add in any of your own folders you want excluded from robot traffic. Save the file and then upload it into your website's root folder.
You should also read the link Jim provided to understand the procedures involved in creating your own robots.txt for future use.