Determining a true count of visitors

When tracking programs show different results, which do you believe?

         

jeffb

12:45 am on Jul 25, 2003 (gmt 0)

10+ Year Member



I've found widely different results from checking my traffic with two different programs. I check my site on an ongoing basis with Urchin and then run the weekly logs through ClickTracks at the end of the week to make use of its path tracking ability.

I find, however, that Urchin shows nearly twice as many visitors for the week as ClickTracks does.

Have any others noticed such wide differences when running the same data through two different programs and, if so, what have you done to determine reliable results?

ken_b

12:51 am on Jul 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The number of visitors is nearly useless.

About all it's good for is a general measurement of traffic, to see if there is a trend for your site. As a measurement against other sites, the comparison would be highly questionable.

Using 2 or more programs is probably just an exercise in frustration.

Why? Because different programs use differing criteria, so their results can't be effectively compared.

pardo

10:28 am on Jul 25, 2003 (gmt 0)

10+ Year Member



Indeed it is frustrating that there is no tool that gives the 'right' user sessions, pageviews, etc.

Ideally there should be one industry standard so that site traffic figures worldwide could be compared. In my opinion print magazines have this sort of standard, and it would give advertisers real insight into the business.

But I know there are lots of issues that affect accurate traffic measurement.

One day there will be...

Until that day comes I use a few tools myself. For quick daily analysis I use the free version of FunnelWeb; for deep analysis of referrer and keyword tracking I use NetTracker, and I'm impressed with the information I can get out of that tool. But indeed, the visitor and pageview figures differ...

MrSpeed

2:33 pm on Jul 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The programs use different criteria for what a visitor is. Many times a visitor is defined as hits from the same IP address within a 20-minute window; some programs use a different length of time.
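
A minimal sketch of that IP-plus-timeout rule, assuming the hits have already been parsed out of the log into (ip, timestamp) pairs; the 20-minute cutoff and the names are illustrative, not what any particular package actually does:

    from datetime import timedelta

    SESSION_TIMEOUT = timedelta(minutes=20)  # illustrative cutoff; packages differ

    def count_visits(hits):
        """Count 'visits' as runs of hits from one IP with gaps under the timeout.

        hits: iterable of (ip, datetime) pairs, assumed parsed from a log file.
        """
        last_seen = {}
        visits = 0
        for ip, ts in sorted(hits, key=lambda h: h[1]):
            prev = last_seen.get(ip)
            if prev is None or ts - prev > SESSION_TIMEOUT:
                visits += 1  # first hit from this IP, or the gap was too long
            last_seen[ip] = ts
        return visits

Change the timeout to 30 minutes and the same log produces a different visit count, which is exactly the kind of disparity between two packages that started this thread.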

Of course you run into problems with proxies etc....

We use webtrends, and they have an optional method to track visitors by installing an ISAPI filter that sets a session variable for each user. We use it and it's very accurate (supposedly).

graywolf

3:41 pm on Jul 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I use the webtrends ISAPI filter as well, and have been very happy with it. In fact I even use the webtrends cookie to track users for almost all of my debugging.

claus

3:49 pm on Jul 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Cookies are more accurate than log-files, especially when called from a "web bug" (1x1 px img). Log files track "server-level", web bugs track "browser level" and the browser is simply closer to the end user.
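
For anyone who hasn't seen the term, a "web bug" is just a tiny image that every page embeds and a counting script serves, so the request is made by the browser itself rather than inferred from the raw server log. A minimal sketch of the mechanics (hostnames and paths are made up, and this is neither the Scandinavian standard nor any vendor's product):

    # Every page embeds:
    #   <img src="http://counter.example.com/bug.gif" width="1" height="1">
    # Browsers fetch the image; most robots and caches do not, so the hit is
    # recorded one step closer to the end user than an ordinary log line.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # a 43-byte transparent 1x1 GIF
    PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff!"
             b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
             b"\x00\x02\x02D\x01\x00;")

    class BugHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path.startswith("/bug.gif"):
                # record the hit: IP, user agent, and the page that embedded the bug
                print(self.client_address[0], self.headers.get("User-Agent"),
                      self.headers.get("Referer"))
                self.send_response(200)
                self.send_header("Content-Type", "image/gif")
                self.send_header("Content-Length", str(len(PIXEL)))
                self.end_headers()
                self.wfile.write(PIXEL)
            else:
                self.send_error(404)

    if __name__ == "__main__":
        HTTPServer(("", 8080), BugHandler).serve_forever()

Setting a cookie in the same response is what turns this from a raw page counter into the more accurate visitor count mentioned above.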

A true count is not possible outside the realms of iris scanning or fingerprinting. We have the kind of industry standard requested above here in Scandinavia (Denmark, Sweden, Norway): it's browser based, using web bugs. I have quite some experience with this; sticky me for info if interested.

/claus


cfx211

5:30 pm on Jul 25, 2003 (gmt 0)

10+ Year Member



Uniques are a subjective count and you have to determine the method you want to use to count them. First off, to be truly accurate you should exclude the IP block of your office so you are not counting yourselves. Secondly, if you use any monitoring service such as Keynote, that needs to be excluded as well. Finally, you need a user-agent-based filter that removes spiders and robots from your counts.
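
As a rough sketch of those exclusions, assuming the hits have already been parsed into IP and user-agent fields; the office network and the bot substrings below are placeholders, and a real robot list would be much longer:

    import ipaddress

    # Placeholders: substitute your own office block and whatever strings your
    # monitoring service and the major crawlers actually put in the user agent.
    OFFICE_NET = ipaddress.ip_network("192.0.2.0/24")
    BOT_SUBSTRINGS = ("googlebot", "slurp", "msnbot", "spider", "crawler", "keynote")

    def keep_hit(ip, user_agent):
        """Return True if a log hit should count toward visitor totals."""
        if ipaddress.ip_address(ip) in OFFICE_NET:
            return False  # your own staff browsing the site
        ua = (user_agent or "").lower()
        if any(bot in ua for bot in BOT_SUBSTRINGS):
            return False  # spiders, robots, monitoring agents
        return True

Run every hit through a filter like this before it reaches whatever does the unique counting, and two tools fed the same filtered data at least start from the same population.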

In my experience cookie based counting is the only way to even get close on uniques and visits. IP address/user agent combos are too inaccurate thanks to AOL and their ilk.

The problem with cookie counting is that robots and spiders usually refuse cookies, and if your site is set up to issue a cookie for every request that arrives without one, then a Google crawl of your site could in theory generate a new cookie for every page it crawls. That is why you need to do a user-agent-based exclude.

Another problem with cookie counting is that you have to make sure you issue cookies on every page if a user does not have one. Don't wait until a user gets to the product page to do this, because the majority will never make it there.
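
One way to read that advice in code: a small WSGI-style middleware (the cookie name and structure are made up for illustration) that stamps a visitor ID on any response whose request arrived without one, so even the first page a visitor ever sees gets counted:

    import uuid
    from http import cookies

    class VisitorCookieMiddleware:
        """Set a visitor-ID cookie on every response if the request has none."""

        COOKIE_NAME = "visitor_id"  # illustrative name

        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            jar = cookies.SimpleCookie(environ.get("HTTP_COOKIE", ""))
            has_cookie = self.COOKIE_NAME in jar

            def _start_response(status, headers, exc_info=None):
                if not has_cookie:
                    vid = uuid.uuid4().hex
                    headers = list(headers) + [
                        ("Set-Cookie", f"{self.COOKIE_NAME}={vid}; Path=/")]
                return start_response(status, headers, exc_info)

            return self.app(environ, _start_response)

Because it wraps the whole application, no page can be served without the cookie being offered, which is the point being made above.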

Anyway, my point is that unique counts are very subjective, and if two different tools take different measuring approaches to anything listed above, you could see the disparity you are talking about.

Try to really get to know your reporting tools and make sure they are as similar as possible in their methodology. We have done that here and get their counts to within +/- 5% of each other, which is an acceptable difference for us.

cornwall

5:36 pm on Jul 25, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There is quite a good thread in the Forum library

[webmasterworld.com...]

It's from last November, but still relevant. It goes into some detail about what the figures from a log file analysis may and may not mean!

Bottom line really is: use one bit of software, and only look at trends.