
Website Analytics - Tracking and Logging Forum

    
Why do people refer to unique visitor and not visitor?
Where did the phrase unique visitor originate?
alexdino1
10:51 pm on Mar 30, 2006 (gmt 0)

I work at a large organization and I keep hearing Marketing people refer to the phrase "unique visitor". How did this get started? Why do we not refer to users as visitors? Is there some difference that I am not aware of? Why do we not refer to unique visit? Did WebTrends start all of this confusion?

 

souffle
11:32 pm on Mar 30, 2006 (gmt 0)

Unique as in first-time visitor, and then you have the regular visitor. At least that's what I thought they meant; I could be wrong.

Edge
3:50 am on Mar 31, 2006 (gmt 0)

A visitor is defined as a series of hits from the same IP address, with no idle gap of more than a set number of minutes between any two hits. Explanation: when a web surfer arrives at a website, he/she requests the files, such as GIFs and JPEGs, that make up that particular page. Each request is a hit, and they are delivered in quick succession, with no more than a few seconds between them from the server's perspective. When your stats program detects a gap of more than a predetermined number of minutes (normally about 60) between any two hits from the same IP address, it assumes a new visitor has arrived. This is usually reasonable, since most large ISPs, such as EarthLink, recycle idle IP addresses.
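
To make that concrete, here is a rough Python sketch of the idea (not any particular vendor's algorithm; the 60-minute timeout and the (ip, timestamp) log format are my own assumptions):

from collections import defaultdict

IDLE_TIMEOUT = 60 * 60  # the "predetermined number of minutes", assumed 60 here, in seconds

def count_visits(hits):
    """hits: iterable of (ip_address, unix_timestamp) pairs parsed from a log."""
    last_hit = {}               # ip -> timestamp of that ip's most recent hit
    visits = defaultdict(int)   # ip -> number of visits started
    for ip, ts in sorted(hits, key=lambda h: h[1]):
        # A gap longer than the idle timeout means this hit starts a new visit.
        if ip not in last_hit or ts - last_hit[ip] > IDLE_TIMEOUT:
            visits[ip] += 1
        last_hit[ip] = ts
    return sum(visits.values())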

Unique visitors are the actual distinct users visiting a particular website within a one-month period. They are counted by IP address and/or cookies. Unique visitor counts are normally much lower than "hits" and visitors. A unique visitor may visit a website many times during the month.

The unique visitor count is considered a more important measure of what is going on within a particular website. It is possible to have thousands of visits from one person during a month, but that would still be only one unique visitor.
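
Counting uniques is then just counting distinct identifiers per month rather than counting visits. A rough sketch along the same lines, assuming each hit carries a visitor ID (a cookie value where one exists, falling back to the IP address):

import datetime

def monthly_uniques(hits):
    """hits: iterable of (visitor_id, unix_timestamp) pairs."""
    seen = {}  # (year, month) -> set of visitor ids seen that month
    for vid, ts in hits:
        d = datetime.datetime.utcfromtimestamp(ts)
        seen.setdefault((d.year, d.month), set()).add(vid)
    return {month: len(ids) for month, ids in seen.items()}

A visitor who comes back a thousand times in the month still adds exactly one to that month's set.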

cgrantski
2:33 pm on Mar 31, 2006 (gmt 0)

What Edge says about "visitor" is called "visit" in a lot of current analytics software, including WebTrends and all the rest. (It's funny how WebTrends, alone among all the analytics vendors, tends to get the heat for causing problems.)

The term "unique visitor" was inflicted on WebTrends and other packages by customer pressure a few years ago. Marketing people at that time were associating "visitor" with "a visit" in the way Edge does, and kept asking about the term "unique visitor", so WebTrends and others started incorporating it. If you go back far enough in the software you'll find the point where it changed. I don't know whether WebTrends was the first to change - I don't think so; I think WebSideStory might've been earlier and in fact may have started the term that the marketers picked up.

The term "unique visitor" is really closer to "unique cookie". "Visitor" doesn't have much to do with an actual person or what you're calling a "user". Many people work on computers at work and have computers at home, and visit the same site from both places. It's incredibly difficult to get those two visits to look like one "visitor", and almost always they appear as two unrelated "unique visitors". It's kind of a joke, really.

There are many, many other factors that mess up the unique visitor measure: cookies being disabled, third-party vs. first-party cookie issues, deletion of cookies, whether the site gives out a cookie at all, whether the site gives out the right kind of cookie (if it's not a persistent cookie, the unique visitor number should be ignored completely), whether the site runs on more than one server and, if so, how the load balancing is set up and how cookies are handled across servers, and whether the site experience crosses domains.
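
On the persistent-cookie point: a session cookie (one with no expiry) dies when the browser closes, so the same person shows up as a brand-new "unique" on their next session. A minimal sketch of the difference using Python's standard http.cookies module (the cookie name and lifetime are just placeholders, not anyone's real implementation):

from http.cookies import SimpleCookie
import uuid

cookie = SimpleCookie()
cookie["visitor_id"] = uuid.uuid4().hex  # hypothetical first-party visitor cookie

# No Expires/Max-Age: a session cookie, gone when the browser closes,
# so unique-visitor counts based on it will overcount people.
print(cookie.output(header="Set-Cookie:"))

# With Max-Age: a persistent cookie that survives across sessions,
# which is what a month-long unique-visitor count actually needs.
cookie["visitor_id"]["max-age"] = 60 * 60 * 24 * 365
print(cookie.output(header="Set-Cookie:"))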

Then there's the question of whether the analytics software is reporting properly on the data, even on those rare occasions when the data is actually of decent quality. I won't even go into the issue of daily uniques, weekly uniques, etc.; it has been discussed elsewhere. (One really interesting thing is that not all the big analytics vendors get this. One of them until recently was doing it completely wrong and was taking a lot of heat from knowledgeable customers until they put together a silent fix.)

Finally there's the question of whether the person using the analytics software even knows which stat to use in a given situation: good software will display daily uniques, weekly uniques, etc., and the user needs to pick the right one and know what it means.
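
A quick illustration of why picking the right one matters: daily uniques don't add up to weekly uniques, because anyone who shows up on more than one day gets counted once per day (visitor IDs invented for the example):

days = [
    {"alice", "bob"},          # Monday's unique visitors
    {"alice", "carol"},        # Tuesday's
    {"alice", "bob", "dave"},  # Wednesday's
]
summed_dailies = sum(len(d) for d in days)   # 2 + 2 + 3 = 7
weekly_uniques = len(set().union(*days))     # alice, bob, carol, dave = 4
print(summed_dailies, weekly_uniques)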

So ... flawed original concept, flawed raw data, flawed analytics software, flawed interpretation of analytics software .... there's not much chance of having all of them done right.

Rosalind
3:03 pm on Mar 31, 2006 (gmt 0)

Another thing that can mess up the accuracy of unique visitor stats is when one person gets up and leaves a workstation, and someone else takes their place. For example, at an internet cafe.

So your stats for unique visitors are only ever a rough guess.

Edge
5:15 pm on Mar 31, 2006 (gmt 0)

"Rough Guess"? this would seem to imply that measuring unique visitors does not have value.

I use Urchin, WebTrends and Analog for my stats. They report almost identical visits (within 0.01%). With that said, I have a high level of confidence in my visits. All programs report almost identical uniques (you have to limit Analog's log sample). My guess is that my stats programs are reporting approximately 98% of all uniques. A 2% error is not a "rough guess" to me. Even if they were only 90% accurate, that would be a good reference point for estimating what is going on within a website.

gregbo
12:09 am on Apr 1, 2006 (gmt 0)

"Rough Guess"? this would seem to imply that measuring unique visitors does not have value.

It only has value within very limited contexts.

If you have not read the "How the web works" section of the Analog documentation, you should. It explains what can (and cannot) be measured. Furthermore, you should read the documentation referenced at the bottom of that page. (Unfortunately, the NLC-BNC paper is no longer available.)

I use Urchin, WebTrends and Analog for my stats. They report almost identical visits (within 0.01%). With that said, I have a high level of confidence in my visits.

This basically means the programs are all written similarly, not that they actually measure what some people believe they measure.

Edge
2:37 am on Apr 1, 2006 (gmt 0)

"It only has value within very limited contexts."
Can you be more specific, since it is so limited?

Just out of curiosity, how far off do you think my stats programs are from reality? 5%, 10%, 300%?

martinibuster
2:51 am on Apr 1, 2006 (gmt 0)

Unique Visitors is one of my favorite metrics.

I think its value lies in gauging what kind of momentum your website has in terms of growth. If you are increasing your uniques month over month, that shows you have positive momentum, often from decent search referrals, word of mouth, etc.

If the month-to-month uniques are flat, that might be your first indication that you need to diversify your visitor stream, and perhaps that you need more content, among other things. This is something you cannot determine from a simple aggregated head count of visitors.
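
As a concrete reading of "momentum", here's a small sketch that turns a series of monthly unique counts into month-over-month growth rates (the numbers are invented):

monthly_uniques = [12000, 13100, 14050, 14100]  # hypothetical monthly unique-visitor counts

growth = [
    (curr - prev) / prev * 100
    for prev, curr in zip(monthly_uniques, monthly_uniques[1:])
]
print([f"{g:+.1f}%" for g in growth])  # ['+9.2%', '+7.3%', '+0.4%'] - flattening out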

Yes, the cookie situation as well as the JS issues for some tracking methods will keep the data from being 100% accurate, but it's still very useful.

gregbo
3:44 am on Apr 1, 2006 (gmt 0)

"It only has value within very limited contexts."
Can you be more specific, since it is so limited?

If you take a small sample of traffic (say a few hours' worth), you'll limit the effects of IP addresses changing, cookie deletion, etc. You can't make them go away entirely, and you're out of luck if someone can manipulate your numbers by sending untrackable fraudulent traffic to you (e.g. by compromising some computers and having them send clickstreams to you that look like normal traffic).

Just out of curiosity, how far off do you think my stats programs are from reality? 5%, 10%, 300%?

Unfortunately, there's no way to tell for sure. My suggestion is to tell whoever you have to report these numbers to about the risks of relying on them and have them factor these risks into their business plans.

Wlauzon
5:12 am on Apr 1, 2006 (gmt 0)

Because there is a big difference between "visitor" and "unique visitor" maybe?

Edge
1:58 pm on Apr 1, 2006 (gmt 0)

"and you're out of luck if someone can manipulate your numbers by sending untrackable fraudulent traffic to you (e.g. by compromising some computers and having them send clickstreams to you that look like normal traffic)."

I can't think of a reason in the world why anybody would do this so that I would think I have more traffic than I really do. Besides, I have six years of stats-reading experience and can spot when something weird is going on. So unless they've been doing it for six years (what a conspiracy), I just don't think so.

I agree that stats programs are imperfect; I am just trying to point out that I believe they are good enough. It is important to note that everybody on the web is faced with the same untrackable imperfections as you and I. So when I report traffic as given to me by Urchin (I use the UTM script, by the way), WebTrends, and so forth, we are reporting it with essentially the same yardstick.

Now, before somebody points out that the Urchin UTM script requires JavaScript to be enabled: I know. As for browsers that do not have JavaScript enabled, are they really visitors? Anybody who surfs today's web with JavaScript turned off is either having a tough time seeing many websites or is a robot. So who cares if they are counted?

cgrantski
3:19 pm on Apr 1, 2006 (gmt 0)

That's true, very pragmatic and realistic. I think that's a key point we lose track of sometimes. Given the limitations of the tools we use and the considerable noise in the data we have, we can only try to minimize the noise, understand the limitations, and most of all stay within the boundaries. Consistency in comparing the same thing to itself over time is a lot more valid than trying to compare to anything on the outside, for example.

Finding little flaws in the data or the tools is a fun mental exercise, but it can also keep us from getting the value that they really offer.

However, Edge, despite the wisdom and balance of what you say, you've gotta stop mixing up "visit" and "visitor"! :-)

Rosalind
4:51 pm on Apr 1, 2006 (gmt 0)

I can't think of a reason in the world why anybody would do this so that I would think I have more traffic than I really do.

There are good reasons for bots to make you think you have more human traffic than you really do. For instance, they might want to scrape email addresses whilst staying under your bot radar. If you think the undesirable bot activity is coming from a number of legitimate visitors, you will be less likely to block it. If it inflates your visitor numbers, they don't care.

gregbo
2:25 am on Apr 2, 2006 (gmt 0)

I agree that stats programs are imperfect; I am just trying to point out that I believe they are good enough.

If the people you are reporting the numbers to are satisfied with them, then they're good enough.

Now, before somebody points out that the Urchin UTM script requires JavaScript to be enabled: I know. As for browsers that do not have JavaScript enabled, are they really visitors? Anybody who surfs today's web with JavaScript turned off is either having a tough time seeing many websites or is a robot. So who cares if they are counted?

If someone has told you that JavaScript-disabled access does not need to be counted, then you don't have to worry about it.

gregbo
2:33 am on Apr 2, 2006 (gmt 0)

That's true, very pragmatic and realistic. I think that's a key point we lose track of sometimes. Given the limitations of the tools we use and the considerable noise in the data we have, we can only try to minimize the noise, understand the limitations, and most of all stay within the boundaries. Consistency in comparing the same thing to itself over time is a lot more valid than trying to compare to anything on the outside, for example.

When I have to explain the reports produced by programs like Analog (especially to non-technical people), I prefer to err on the side of caution and inform them of the risks and limitations. If they understand the risks and still want to use those programs, that's perfectly fine with me.
