I'm new to robots and would like a simple (if possible) way to prevent robots from indexing all of my site content apart from the contents of the root folder. I have subdirectories that I do not want indexed, but I would rather not list the folder names in a robots.txt, so I was wondering if I can use a wildcard, e.g.:

```
User-agent: *
Disallow: /*
```

Would this allow my root folder to be indexed, but prevent the subfolders from being indexed?

Also, how can I hide the robots.txt from everyone apart from robots? Script examples would be much appreciated, as I'm pretty new to web building.

Thanks all.
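To make my second question more concrete, this is roughly the sort of thing I imagine for hiding robots.txt (assuming an Apache server with mod_rewrite; this is purely my guess at an approach, and the crawler names in the condition are just examples, not a complete list):

```apache
# Purely a guess: return 403 Forbidden for robots.txt unless the
# user agent looks like one of a few well-known crawlers.
# (Googlebot, Bingbot and Slurp are real crawler names; everything
# else here is my assumption about how mod_rewrite might be used.)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} !(Googlebot|Bingbot|Slurp) [NC]
RewriteRule ^robots\.txt$ - [F]
```

Though I realise anyone could fake a user agent string, so maybe something like this is pointless anyway?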