
Block test version of site from indexing but keep it available for testing?

   
5:21 pm on Jun 11, 2008 (gmt 0)

5+ Year Member



Hi, we have a second version of our site, #*$!xTest.com, whose URLs seem to be getting indexed, and I am worried it will appear as duplicate content and duplicate pages for the entire site. What do I have to do to block the bots from indexing #*$!xTest.com and the Test sub-URLs? If I 404 or redirect it, then we can't really use it for testing.

Thank you in advance. J

7:06 pm on Jun 11, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Use a robots.txt file in the root folder of your test domain - with one rule:

User-agent: *
Disallow: /

That will keep ALL well-behaved spiders out, so the site won't get indexed. After a short period, any URLs that were already indexed should also be dropped.

7:17 pm on Jun 11, 2008 (gmt 0)

10+ Year Member



Use .htaccess to set a password.
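For example, a minimal sketch, assuming Apache with the password file created outside the web root using the htpasswd utility (e.g. htpasswd -c /path/to/.htpasswd testuser - the path and user name here are just placeholders):

AuthType Basic
AuthName "Test Site"
AuthUserFile /path/to/.htpasswd
Require valid-user
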
7:23 pm on Jun 11, 2008 (gmt 0)

WebmasterWorld Senior Member tedster is a WebmasterWorld Top Contributor of All Time 10+ Year Member



Yes, Ferro9 has a good idea - do that, too. It will also keep out the bad bots and scrapers.
11:57 pm on Jun 11, 2008 (gmt 0)

WebmasterWorld Senior Member g1smd is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



The robots.txt file stops spidering by well-behaved robots, but it does not prevent the URL of the resource from appearing in the SERPs.

It also does not guard against a scraper stealing all your content before your real site goes live.

I usually password-protect with .htaccess/.htpasswd. That stops everyone who does not know the password.
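
If your testers connect from a fixed address, a common Apache 2.2 variation (a sketch only - 192.0.2.10 is a placeholder IP, and the .htpasswd path is an assumption) lets that address in without a password while everyone else, bots included, must authenticate:

Order Deny,Allow
Deny from all
Allow from 192.0.2.10
AuthType Basic
AuthName "Test Site"
AuthUserFile /path/to/.htpasswd
Require valid-user
Satisfy Any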

12:33 am on Jun 12, 2008 (gmt 0)

5+ Year Member



Thanks
 
