I have a development server that is occasionally open to the Internet so clients can have a look at it.
Google paid a visit to my development server while I was working on a website for someone, and now that person's production website, which used to rank #3 in the SERPs for its single keyword, has been dropped from Google. I'm assuming it was due to duplicate content, of which there was a lot at that point: the project was more about changing the design than the content, since the content and several dozen high-quality backlinks seemed to be responsible for the #3 ranking.
Will disallowing Google in robots.txt help prevent the above problem from recurring? Would a firewall rule that keeps Google out be the best solution? What else might make a difference so this doesn't happen to other clients of mine?
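For reference, the robots.txt rule I have in mind is just a blanket disallow at the dev site's root (this is my understanding of the standard syntax; I don't know whether Google would still list URLs it discovers through links even without crawling them):

    User-agent: *
    Disallow: /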
Thanks in advance.
I'm not sure how best to do it, but IMHO I'd try to serve Google a 404 on its next visit to resolve the potential duplicate content issue.
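Something along these lines should do it if the dev box runs Apache with mod_rewrite enabled (a rough, untested sketch; adjust the user-agent match as needed):

    # Return a 404 to Googlebot only; normal visitors are unaffected.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
    RewriteRule .* - [R=404]

You could use the [G] flag instead of [R=404] to send a 410 Gone, which some say Google treats as a stronger removal signal.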