Forum Moderators: goodroi
If this is a possible scenario, what should I write in robots.txt to prevent that from happening?
Yes, spiders can definitely get into "invisible" subdirectories. I accidentally had a subdirectory called "new" (a redesign of a large site) completely spidered by Google because the Google Toolbar "told on me".
User-agent: *
Disallow: /
That should stop any spider that pays attention to the robots.txt file.
I would not recommend the above. Once Googlebot sees a Disallow: / on an entire site, I think it may be a while before you get a regular crawl again. I would instead put the redesign in a new subdirectory and Disallow only that directory from the spiders, as in the example below. That way it has no effect on the existing site, or on spidering of the new site once it goes live.
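For example, assuming the redesign lives in a subdirectory called /newsite/ (the name here is just a placeholder, use whatever you actually call it), the robots.txt would be something like:

User-agent: *
Disallow: /newsite/

Everything outside /newsite/ stays crawlable, and once the redesign goes live you simply remove that Disallow line.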
P.S. mcavill's answer was correct based on your question, which was how to prevent all spiders from indexing your site. In this case, though, we only want to prevent them from indexing the new site.