Forum Moderators: Robert Charlton & goodroi
A couple of months ago we put up a testing server, and I have just noticed that it has somehow made it into the Google index. Since this site basically mirrors our production site, I am assuming we are getting duplicate content penalties because of it.
I thought of a few different ways to handle the issue (robots.txt on test system, redirects on test system, password protection, etc.). We also have the option of moving the test server to a new IP. What would be the best/quickest way to resolve this?
Thanks,
John
At the same time, move the test site to a new subdomain and put the whole lot behind password protection (except for robots.txt, which should return Disallow: / for all user agents).
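For reference, a blanket-disallow robots.txt as described above is just two lines:

```
User-agent: *
Disallow: /
```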
Simply putting password protection on the test site will do nothing to deindex the URLs that are already listed; they will otherwise stay as Supplemental results for a very long time.
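One way to combine the two, assuming the test site runs on Apache (the directory and file paths below are hypothetical), is to require a login for everything while still letting robots.txt through unauthenticated so crawlers can fetch the Disallow rule:

```apache
# Hypothetical Apache 2.4 config for the test site's vhost:
# require credentials for the whole document root...
<Directory "/var/www/test">
    AuthType Basic
    AuthName "Test Site"
    AuthUserFile /etc/apache2/.htpasswd-test
    Require valid-user
</Directory>

# ...but let robots.txt through without credentials.
# <Files> sections merge after <Directory>, so this wins for that one file.
<Files "robots.txt">
    Require all granted
</Files>
```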
I have put the 301 redirects in place by changing DNS to point the testing domain at the real site, then adding a 301 redirect there for requests arriving under the testing domain name. I have not seen Googlebot on the test site for 4 or 5 hours, so I am assuming it has picked up the DNS change. I was going to hold off on the robots.txt change until that happened.
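For anyone setting up the same thing on Apache, a host-based rule like this sketch (the hostnames are placeholders, not the poster's actual domains) issues the 301 for any request that still arrives with the testing hostname:

```apache
# Hypothetical mod_rewrite rule on the production server (.htaccess context):
# requests carrying the testing hostname get a permanent redirect to the
# same path on the production hostname.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^test\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```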
Any idea how long it will take Google to sort this out? Is there anything else that can be done to expedite the process?
Thanks again,
John