Forum Moderators: Robert Charlton & goodroi


Best ways to block duplicate website for test purposes?


born2run

6:22 am on Jun 25, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi, we want to run beta.example.com in addition to www.example.com. The beta is a copy of the www site.

We don't want the beta site to be indexed, be visible in SERPs, or disturb the www.example.com index in Google etc. The beta site is for development and testing before we roll out the changes to the www site.

My question is: what's the best practice for the beta site, keeping Google SEO in mind? Should I password-protect it using htaccess and also add a Disallow directive to the robots.txt file of beta.example.com?

Any advice would be greatly appreciated as always! Thanks!

topr8

8:14 am on Jun 25, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Depends on who needs to view the beta, and from where.

I just limit access to certain IP addresses at the Apache level.
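A minimal sketch of that approach, assuming Apache 2.4+ and a dedicated vhost for the beta subdomain; the document root and the 203.0.113.0/24 range are placeholders, so substitute your own paths and office/VPN addresses:

```apache
# Apache 2.4 vhost for the beta subdomain: only the listed IPs get in,
# everyone else receives 403 Forbidden.
<VirtualHost *:80>
    ServerName beta.example.com
    DocumentRoot /var/www/beta

    <Directory /var/www/beta>
        # Placeholder range - replace with your own trusted IPs/CIDRs.
        Require ip 203.0.113.0/24
    </Directory>
</VirtualHost>
```

Note that Apache 2.2 used the older Order/Allow/Deny syntax instead of Require; the directives above are 2.4-only.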

Johan007

9:25 am on Jun 25, 2015 (gmt 0)

10+ Year Member Top Contributors Of The Month



"Should I password-protect it using htaccess and also add a Disallow directive to the robots.txt file of beta.example.com?"

Perfect.

netmeg

12:29 pm on Jun 25, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I've done different things - my top choice is to limit it to specific IP addresses, as described above. If that's not an option for whatever reason, I will put a password on the whole site (for WordPress I use a free plugin called Private Only that works great for this). My final option is to have the entire site slapped with a NOINDEX on every page. But since that still leaves it open to the world (including competitors), I consider that one a last resort.
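The password option can be sketched as a Basic Auth .htaccess file, assuming Apache with AllowOverride AuthConfig enabled for the beta document root; the AuthUserFile path is a placeholder:

```apache
# .htaccess in the beta site's document root.
# Create the credentials file first, e.g.:
#   htpasswd -c /etc/apache2/.htpasswd-beta someuser
AuthType Basic
AuthName "Beta - staff only"
AuthUserFile /etc/apache2/.htpasswd-beta
Require valid-user
```

Keep the .htpasswd file outside the document root so it can never be served to visitors.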

Blocking it with robots.txt would not be a good option. It could potentially still be indexed.
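For the site-wide NOINDEX fallback mentioned above, a response header avoids editing every template and also covers non-HTML files. A sketch for the beta vhost, assuming mod_headers is enabled:

```apache
# Send a noindex robots directive on every response from this vhost,
# covering PDFs, images, etc. as well as HTML pages.
# Requires mod_headers (a2enmod headers).
Header set X-Robots-Tag "noindex, nofollow"
```

Unlike a robots.txt Disallow, this lets crawlers fetch the pages and then tells them not to index the content, which is why it works where robots.txt blocking can still leave URLs indexed.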

engine

12:53 pm on Jun 25, 2015 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I agree with what's been suggested. If it's a subdomain, it'll be treated the same as putting it on any domain.

Limit the IPs and password-protect (belt and braces).

Don't use robots.txt, as that'll just give pointers to what's there.

Spiders and scrapers that ignore all protocols will look for common entries, so don't use anything obvious.
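The belt-and-braces combination can be sketched in Apache 2.4 with RequireAny, so a request passes if it comes from a trusted IP or presents valid credentials; the paths and IP range below are placeholders:

```apache
# Apache 2.4: admit a request from a trusted IP OR with a valid login.
<Directory /var/www/beta>
    AuthType Basic
    AuthName "Beta"
    AuthUserFile /etc/apache2/.htpasswd-beta

    <RequireAny>
        # Placeholder range - office/VPN addresses skip the password prompt.
        Require ip 203.0.113.0/24
        # Everyone else must authenticate.
        Require valid-user
    </RequireAny>
</Directory>
```

Switching RequireAny to RequireAll would instead demand both a trusted IP and a password, which is stricter still.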

lucy24

4:41 pm on Jun 25, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Question of fact, assuming this is happening in ARIN/US: Names of newly created domains are publicly available, so robots will show up within days. Are names of subdomains also visible, so you can't rely on "nobody will visit because nobody knows we exist"?

RedBar

5:18 pm on Jun 25, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How long do you intend to leave the beta subdomain exposed?

I have a gobbledygookdomainname.tld that I use, and as soon as I've finished I delete everything so it simply goes back to being "Test Site - Nothing Here But This Text". However, if it's going to be up for more than a few hours, then definitely password-protect it etc.

born2run

10:57 pm on Jun 25, 2015 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Thanks guys