How do you get round the problem of users connecting through a proxy skewing your results?
If you do it yourself, presumably you put a time limit on what you'd call a single visit. How long do you allow - half an hour? More?
I see a fair few posts on the forums here where people mention the number of uniques they get, pageviews per unique, etc.
I wonder if anyone has a reliable way to count them (surely impossible, given the way HTTP works?), or at least a way to reduce the unreliability. Or do you just live with the fact that you can't trust the results - in which case, why count them at all?
At the moment I'm just counting pageviews. I'd like to count pageviews per visitor as well, but then I remember all the reasons I don't count uniques.
Any thoughts?
I never totally trust the numbers for unique visitors - I don't believe you'll ever get the true figure, even with cookies. Some users don't accept cookies, and on a shared Windows machine you end up counting user profiles, not humans.
But then again, having numbers you can't trust totally is better than not having any numbers at all.
I use PHP, so I run a custom log in MySQL that records each visitor's browsing against their cookie.
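Roughly, the logging script looks something like this (a minimal sketch - the table and column names, and the cookie name 'uid', are invented for the example):

<?php
// Issue a cookie the first time we see a browser; reuse it afterwards.
if (isset($_COOKIE['uid'])) {
    $uid = $_COOKIE['uid'];
} else {
    $uid = bin2hex(random_bytes(16));                  // random visitor ID
    setcookie('uid', $uid, time() + 365 * 86400, '/'); // one-year cookie
}

// Record the hit against that ID in MySQL.
$db = new PDO('mysql:host=localhost;dbname=stats', 'statsuser', 'secret');
$stmt = $db->prepare(
    'INSERT INTO visit_log (cookie_id, url, ts) VALUES (?, ?, NOW())');
$stmt->execute([$uid, $_SERVER['REQUEST_URI']]);
?>

Counting uniques is then just counting distinct cookie_id values over whatever time window you pick.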
I don't know if there's a way to log cookies used without a scripting language (PHP, ASP, etc.) and then obtain a report on that, is there?
"I don't know if there's a way to log cookies used without a scripting language (PHP, ASP, etc.) and then obtain a report on that, is there?"
If you are using Apache as your web server, you can use the mod_usertrack module to set cookies server-side and then modify your logging directive to record the cookie values in the access log.
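For example, a hypothetical httpd.conf excerpt (paths and the exact log format will vary with your setup):

# mod_usertrack issues the cookie; %{cookie}n logs the value it set.
LoadModule usertrack_module modules/mod_usertrack.so
CookieTracking on
CookieName uid
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{cookie}n\"" usertrack
CustomLog logs/access_log usertrack

Each access log line then carries the visitor's cookie value in its last field.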
With Netscape/iPlanet, there are several openly available NSAPI cookie setting modules. Again, you will need to modify the web server's logging configuration (via obj.conf) to ensure that you log the cookie values.
With Microsoft's IIS (with which I am a lot less familiar) I believe there is openly available ISAPI code for setting server-side cookies. With the correct configuration, these cookies can then be logged in the W3C log.
Also, some websites choose to log both the 'Set-Cookie' and 'Cookie' headers. This distinguishes between your server sending a cookie value in a response to a request that did not include one (Set-Cookie) and your server receiving a cookie value sent with a request (Cookie).
The benefit is that you can check your logs to see which cookie values sent out in the Set-Cookie field subsequently came back in the Cookie field. That tells you which cookies were accepted by browsers, so you can exclude the rejected ones from your unique-user analysis.
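In Apache terms that means logging both the outgoing and the incoming header - something like this (again just a sketch of the format line):

# %{Set-Cookie}o = cookie sent with the response;
# %{Cookie}i = cookie received with the request.
LogFormat "%h %t \"%r\" %>s \"%{Set-Cookie}o\" \"%{Cookie}i\"" cookies
CustomLog logs/cookie_log cookies

A value that appears in the Set-Cookie column but never reappears in the Cookie column on later requests belongs to a browser that rejected the cookie.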
Thanks for your responses. I know I can do the setting/getting with JSP, which my site runs on, but I don't want to have to make all my pages JSPs.
There's no need to make the site dynamic to be able to use cookies.
For what it's worth, I call any page that uses scripting (JSP/ASP/PHP/Perl, whatever) to generate anything other than straight HTML in response to a request a dynamic page.
Any suggestion that avoids making more dynamic pages sounds good to me; I'll look into both of those.
Another option: serve a small image whose URL carries a unique random parameter (presumably generated client-side, since the point is to avoid anything server-side). You'll get logs with something like:
GET /myimg.gif?JHhj43DFS
GET /myimg.gif?8hJHhj
GET /myimg.gif?fsd8832
etc...
Then count the number of unique parameter values for that request.
That's the crudest method I could think of, but it doesn't require any server-side scripting at all.
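Counting the uniques afterwards is a few lines in any scripting language. A rough PHP sketch (the log path, image name and regex are assumptions about your setup):

<?php
// Count distinct query-string values on requests for /myimg.gif.
$ids = [];
foreach (file('/var/log/apache/access_log') as $line) {
    if (preg_match('#GET /myimg\.gif\?(\S+)#', $line, $m)) {
        $ids[$m[1]] = true;   // the array key deduplicates automatically
    }
}
echo count($ids) . " unique visitors, for whatever that's worth\n";
?>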