User-agent: *
Disallow: /

I currently have a new site under development, and I brought it live last week for testing and development.
The best course is not to publish the test folder or test domain at all, or perhaps to use an IP address instead of a domain name.
If you need to test handshaking between your server and another (e.g. a payment processor), you can combine password protection with an IP-based exception:
Require valid-user
Allow from nnn.nnn.nnn.nnn
Satisfy Any
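The snippet above omits the authentication and access-control directives it depends on. A fuller sketch in Apache 2.2-style syntax might look like the following; the AuthName, the .htpasswd path, and the nnn.nnn.nnn.nnn address are placeholders for your own setup:

# Ask everyone for a username/password...
AuthType Basic
AuthName "Development site"
AuthUserFile /path/to/.htpasswd
Require valid-user
# ...but let the remote service's IP through without credentials
Order Deny,Allow
Deny from all
Allow from nnn.nnn.nnn.nnn
Satisfy Any

With Satisfy Any, a request is accepted if it either supplies valid credentials or comes from the allowed IP address, so the payment processor can reach the test site without a login while everyone else, including crawlers, gets a 401.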
If the response is a 200 OK for any requested URL, then that URL is effectively "published".
I've seen plenty of unwanted duplicate content in the index under IP addresses.
"security through obscurity" is not the solution here.
It's almost impossible to keep a web server secret by not publishing links to it.
If you need to keep confidential content on your server, save it in a password-protected directory. Googlebot and other spiders won't be able to access it. This is the simplest and most effective way to prevent them from crawling and indexing content on your site.
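For reference, a minimal password-protected directory on Apache is just an .htaccess plus a password file; the paths below are placeholders, and the .htpasswd file should live outside the web root:

# .htaccess in the directory you want kept out of the index
AuthType Basic
AuthName "Restricted"
AuthUserFile /home/example/.htpasswd
Require valid-user

Any crawler requesting a URL in that directory gets a 401 instead of a 200, so nothing in it can end up in the index.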