Forum Moderators: DixonJones


Tracking a website and keeping it hidden

How to keep a website out of search engines, whilst still tracking visitors


bhonda

10:28 am on Apr 17, 2009 (gmt 0)

10+ Year Member



Hey guys,

We've got a website that we want to keep out of search engines (ie, only the people who are given specific links can find it). At the moment, everything is fine - no crawling, no ranking, nothing.

But, I'm wanting to add some kind of visitor tracking. Now, what would you recommend?

I've had experience with Google Analytics, but there are worries that as soon as we hook it up to Google, it'll start appearing in search engines, and that won't be great.

Has anyone got any suggestions?

Cheers!

B

g1smd

2:23 pm on Apr 17, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Put the site behind a .htaccess password.

Bots cannot get through that.
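For anyone who hasn't set one up before, a minimal Basic Auth setup looks roughly like this (the realm name and the `AuthUserFile` path below are placeholders; adjust them for your server):

```apache
# .htaccess - require a login for the whole directory
# (AuthUserFile path and "Private Site" realm name are placeholders)
AuthType Basic
AuthName "Private Site"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

Create the password file with `htpasswd -c /path/to/.htpasswd username`. Crawlers don't submit credentials, so they get a 401 and never see the content.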

cgrantski

3:03 pm on Apr 17, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you believe that Google follows robots.txt instructions, just use that. I have a site like yours with GA and it works. But a lot of people believe that Google ignores noindex/nofollow directives, so I can't guarantee it'll work for you.
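For completeness, the blanket robots.txt rule being discussed here is just:

```
User-agent: *
Disallow: /
```

The per-page alternative is a `<meta name="robots" content="noindex, nofollow">` tag in each page's `<head>`. Bear in mind neither is enforcement: robots.txt only asks well-behaved crawlers to stay away.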

bhonda

11:13 am on Apr 21, 2009 (gmt 0)

10+ Year Member



Hmm...ok. Maybe I'll hold off on the Google route for a while!

SwitchFX

5:17 am on May 5, 2009 (gmt 0)

10+ Year Member



.htaccess block. I use it for sites I don't need to be crawled. Also good to couple with a robots.txt if you want to be super sure nothing gets crawled. :)

AnkitMaheshwari

4:41 am on May 6, 2009 (gmt 0)

10+ Year Member



Robots.txt would generally work, but yes, an .htaccess block is almost foolproof.

SwitchFX

6:17 am on May 31, 2009 (gmt 0)

10+ Year Member



@ AnkitMaheshwari

A robots.txt file would only work if you closed off every single point of entry. With a simple site that isn't very hard to do; however, if you're using a CMS or forum then you'd have hundreds of points of entry to cover.

.htaccess blocks are your best bet, as you can seal off the entire site from both machine and human entry.
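If you also want to limit human visitors to a known network, a deny-all rule with exceptions does the job (Apache 2.2 syntax; the IP address below is a placeholder):

```apache
# .htaccess - block everyone except one address (Apache 2.2 syntax)
Order deny,allow
Deny from all
Allow from 203.0.113.10
```

You can combine this with Basic Auth so visitors need both the right address and a password.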

Receptional

7:04 am on Jun 1, 2009 (gmt 0)



There are hundreds of non-google tracking systems out there. Some are better!

I can't discuss individual packages on Webmasterworld, but they aren't hard to find.

Personally, I wouldn't trust robots.txt as 100% security against search engine listings. Google is pretty good at obeying the directive, but I believe they still "index for discovery" in this instance: they index the URL, they just choose not to display it in the SERPs.

StoutFiles

8:15 am on Jun 1, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Illegal content, eh?