WebmasterWorld
Block test version of site from indexing but keep it available for testing?
jaimes msg:3672294 5:21 pm on Jun 11, 2008 (gmt 0)
Hi, we have a second version of our site, #*$!xTest.com , whose URLs seem to be getting indexed, and I am worried it will appear as duplicate content and duplicate pages for the entire site. What do I have to do to block the bots from indexing #*$!xTest.com and the sub Test URLs? If I 404 or redirect it, then we can't really use it for testing.
Thank you in advance. J
tedster msg:3672399 7:06 pm on Jun 11, 2008 (gmt 0)
Use a robots.txt file in the root folder of your test domain - with one rule:
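The rule tedster refers to (dropped from the archived post) is the standard disallow-all directive; a minimal robots.txt for the test domain would be:

```
User-agent: *
Disallow: /
```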
That will keep ALL well-behaved spiders out, so you won't get indexed. After a short period, any URLs that were already indexed should also be dropped.
Ferro9 msg:3672403 7:17 pm on Jun 11, 2008 (gmt 0)
Use .htaccess to set a password.
tedster msg:3672411 7:23 pm on Jun 11, 2008 (gmt 0)
Yes, Ferro9 has a good idea - do that, too. It will also keep out the bad bots and scrapers.
g1smd msg:3672574 11:57 pm on Jun 11, 2008 (gmt 0)
The robots.txt file stops spidering by well-behaved robots, but it does not prevent the URL itself from appearing in the SERPs.
It also does not guard against a scraper stealing all your content before your real site goes live.
I usually password-protect the site with .htaccess/.htpasswd. That stops everyone who does not know the password.
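As a sketch of the .htaccess/.htpasswd approach g1smd describes, using Apache HTTP Basic authentication (the file path and realm name below are placeholders, not from the thread):

```
# In the test site's .htaccess:
AuthType Basic
AuthName "Restricted Test Site"
# Keep the password file outside the web root; this path is a placeholder.
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The password file is created once with Apache's htpasswd utility, e.g. `htpasswd -c /home/example/.htpasswd testuser`. Every request without valid credentials then gets a 401 response, which keeps out search-engine spiders and scrapers alike.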
jaimes msg:3672596 12:33 am on Jun 12, 2008 (gmt 0)