Forum Moderators: goodroi
What I wanted to do was have stats and logs set up (we do not host the site ourselves) so I can keep an eye on things, JUST IN CASE. I don't want to get hit with a duplicate content penalty.
Our webhost says it's not necessary because if we don't submit the site, it can't be found.
The site new.widget.com was used to test other things. How can I guarantee there isn't a live link floating around somewhere that a bot will follow, promptly ignoring my robots.txt file? Correct me if I'm wrong, but this is possible, right?
I realize I'm probably being overly cautious, but we cannot afford to get banned or penalized.
So, should I let it go? Or should I insist on seeing stats and logs so I can monitor things, just to be sure?
Best is to simply password-protect the entire sub-domain. It doesn't need to be super-secure, just give everyone the same simple login/password. The simple fact that a password is required will stop the bots in their tracks.
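If the host is running Apache, a minimal sketch of that setup is just two files in the subdomain's document root. The file paths and realm name below are placeholders, and this assumes the host allows .htaccess overrides with basic auth enabled:

```apache
# Hypothetical .htaccess for the new.widget.com document root.
# Assumes Apache with mod_auth_basic; adjust the AuthUserFile path
# to a location outside the web root on your host.
AuthType Basic
AuthName "Staging site - authorized users only"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The matching password file can be created with `htpasswd -c /home/example/.htpasswd someuser` (the `-c` flag creates the file; drop it when adding more users). Any crawler that hits the subdomain gets a 401 and goes no further, regardless of whether it honors robots.txt.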