Forum Moderators: DixonJones
We've got a website that we want to keep out of search engines (i.e., only people who are given specific links can find it). At the moment, everything is fine - no crawling, no ranking, nothing.
But I want to add some kind of visitor tracking. What would you recommend?
I've had experience with Google Analytics, but the worry is that as soon as we hook the site up to Google, it'll start appearing in search results, and that wouldn't be great.
Has anyone got any suggestions?
Cheers!
B
A robots.txt file would only work if you closed off every single point of entry. With a simple site that isn't very hard to do; however, if you're using a CMS or a forum, then you'd have hundreds of points of entry to cover.
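For what it's worth, a site-wide robots.txt block is only two lines - though keep in mind it's a polite request to crawlers, not actual access control:

```
User-agent: *
Disallow: /
```

Well-behaved bots (Googlebot included) will stop crawling, but nothing stops a human or a badly behaved bot from fetching the pages anyway.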
.htaccess blocks are your best bet, as you can seal off the entire site from both machine and human entry.
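A minimal sketch of what that looks like with Apache basic authentication - the realm name and file path here are illustrative, so adjust them for your server:

```apacheconf
# Password-protect the entire site; crawlers get a 401 and index nothing.
# The .htpasswd file is created with: htpasswd -c /home/example/.htpasswd someuser
AuthType Basic
AuthName "Private site"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

Because every request (bot or human) hits the 401 wall, nothing gets crawled or indexed, and you're free to run whatever analytics you like behind it.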
I can't discuss individual packages on WebmasterWorld, but they aren't hard to find.
Personally, I wouldn't trust robots.txt as 100% protection against search engine listings. Google is pretty good at obeying it, but I believe Google still "indexes for discovery" in this situation - it indexes the URLs, but just chooses not to display them in the SERPs.
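If that "index but don't display" behaviour is the concern, a noindex directive is the explicit way to tell engines not to list the URLs at all. A sketch using an Apache response header (requires mod_headers; the directive also works as a per-page meta tag) - note that noindex only works if crawlers can actually fetch the page, so it shouldn't be combined with a robots.txt block:

```apacheconf
# Tell all engines not to index any page served by this site,
# and not to follow links from it.
Header set X-Robots-Tag "noindex, nofollow"
```

That said, this still relies on search engines cooperating; only the .htaccess password approach actually keeps people out.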