Forum Library, Charter, Moderators: Receptional & mademetop

Website Analytics - Tracking and Logging Forum

Log stats, or Graphic stats?
poised




msg:897186
 3:01 pm on Jan 3, 2001 (gmt 0)


I have had recent problems with my stats and am looking into alternative options. Currently I use Net Analysis, but we have experienced consistent problems, though this may be down to capacity on our server.

Has anybody had any other successes? And can anyone explain the advantages/disadvantages of a pixel-based system rather than logs?

From what I have heard, logs are more accurate, but can anybody give me reasons why?

Much Appreciated

 

Brett_Tabke




msg:897187
 3:21 pm on Jan 3, 2001 (gmt 0)

Hope you don't mind that I snipped this off into its own thread. I thought it was worthy of standing on its own two feet.

Logs and stats are my 'other hobby' (golf first when in season). With real server logs, when someone pulls a page, regardless of agent or client, it is recorded. If an agent connects, it is recorded.

When a graphic-based counter is used, it is only pulled when the client requests it. That means non-graphical agents (like search engine spiders) are out. It also means you can be a greater victim of proxy caching.

If set up properly, graphic-based counters can actually give you better data in a couple of instances. One is with caches, where you can override the cache with some sort of js or dynamic url (counter.gif?day=monday). If you are using just log files, you won't see that pull, as the proxy cache will serve your static pages itself.
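To make the dynamic-url idea concrete, here is a minimal sketch of a cache-busting counter URL in js. The `counterUrl` name and the `bust` parameter are made up for illustration; the point is only that every request carries a never-repeating query string, so a proxy cache can't serve a stale copy of the pixel.

```javascript
// Build a counter URL that is unique on every call, so proxies
// can't cache it. A timestamp plus a per-page sequence number
// guarantees two calls never produce the same URL.
function counterUrl(base) {
  counterUrl.seq = (counterUrl.seq || 0) + 1;
  const bust = Date.now().toString(36) + "-" + counterUrl.seq;
  return base + "?bust=" + bust;
}

// In the page, the pixel would then be written out as something like:
// document.write('<img src="' + counterUrl('/counter.gif') + '" width=1 height=1>');
```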

The other, similar but rarer, instance is that of full-page caches like Google's, and a few "behind the walls" secure sites where you normally can't see who has pulled your files.

All in all, logs still win. I often run a combo of 4 counters (js-based, graphic-based, logs, and server side includes). The amazing thing is to compare the differences between them - fascinating.

Brett

poised




msg:897188
 10:30 am on Jan 4, 2001 (gmt 0)

thanks for the reply, duly noted :-)

daveATclickthinking




msg:897189
 1:11 pm on Jan 4, 2001 (gmt 0)

Hi guys,
Firstly happy new year to all ... hope you had a great Xmas & NY.

I agree completely with Brett, the combo method works best.

Just an added extra with the img src script trick: if you always use the same name, e.g. counter.gif?day=monday, not all caches honour the expiring content headers and the "?". What I have found works best is to use a random name generated by js, e.g. 7863576354xxxwuwyetruwye.gif?day=monday, then use a custom 404 error script to check for the xxx, post the stats to the DB, and follow up with a redirect to the real gif.
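A rough sketch of both halves of that trick, assuming the "xxx" marker convention from the post (the function names `trackerGifName` and `isTrackerHit` are invented for illustration). The client requests a gif name that doesn't exist; the custom 404 handler spots the marker, records the hit, then redirects to the real pixel.

```javascript
// Client side: build a random, never-existing gif name containing
// the "xxx" marker, so no cache anywhere can have a copy of it.
function trackerGifName(day) {
  const rnd = () => Math.random().toString(36).slice(2);
  return rnd() + "xxx" + rnd() + ".gif?day=" + day;
}

// Server side (inside the custom 404 handler): does this missing
// path look like a tracker hit? If so, log it, then redirect the
// client to the real 1x1 gif.
function isTrackerHit(requestPath) {
  const path = requestPath.split("?")[0]; // drop the query string
  return /xxx[^/]*\.gif$/.test(path);
}
```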

Hope this helps..

poised




msg:897190
 10:05 am on Jan 5, 2001 (gmt 0)

Apart from the fact that a lot of people don't like cookies, what are the relative merits of using these as a base for statistics rather than IP addresses?

daveATclickthinking




msg:897191
 12:12 pm on Jan 5, 2001 (gmt 0)

The merits are as follows:

[1] Because of proxies, transparent or not, the real IPs are shielded. An entire network (ISP) may surf off the same IP. Hence session stats go pear-shaped.

[2] Dial-up users are assigned a different IP every time they dial up to their network, so checking return visits becomes impossible with just IPs.
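A small sketch of the cookie approach those two points argue for, assuming a cookie named `visitor_id` (the name and helper functions are illustrative, not from the thread). The id set once in a cookie survives both proxy sharing and dial-up IP churn, which is exactly what the IP cannot do.

```javascript
// Pull visitor_id out of a raw Cookie request header, if present.
function getVisitorId(cookieHeader) {
  const match = /(?:^|;\s*)visitor_id=([^;]+)/.exec(cookieHeader || "");
  return match ? match[1] : null;
}

// On a miss, mint a fresh id; the server would send it back in a
// Set-Cookie header so the same value returns on every later visit.
function newVisitorId() {
  return "v" + Date.now().toString(36) + Math.random().toString(36).slice(2);
}
```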

poised




msg:897192
 12:42 pm on Jan 5, 2001 (gmt 0)

cheers dave,

clear as crystal!

skirril




msg:897193
 4:10 pm on Jan 8, 2001 (gmt 0)

Ok, here's something about logs vs. graphics. Those of faint disposition, beware, it is going to be a lengthy post.

a) the basics
-------------
Every request to your site is logged, such logs usually include
- a timestamp
- an originating IP (ie. of the requesting client)
- a referring page/site (or "-" if none)
- the size of the transfer
- the numeric status of the transfer (404/not found, eg)
- the http protocol version
- the page requested
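The fields above map straight onto the standard Apache "combined" log format; a minimal parser might look like the sketch below (the regex assumes well-formed combined-format lines, and real logs vary).

```javascript
// Parse one combined-log-format line into the fields listed above.
// Returns null if the line doesn't match the expected shape.
function parseLogLine(line) {
  const re = /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) (\S+)" (\d{3}) (\S+) "([^"]*)"/;
  const m = re.exec(line);
  if (!m) return null;
  return {
    ip: m[1],                                // originating IP of the client
    timestamp: m[2],                         // timestamp of the request
    method: m[3],                            // e.g. GET
    page: m[4],                              // the page requested
    protocol: m[5],                          // http protocol version
    status: Number(m[6]),                    // numeric status (404 = not found, etc.)
    bytes: m[7] === "-" ? 0 : Number(m[7]),  // size of the transfer
    referrer: m[8],                          // referring page/site, or "-"
  };
}
```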

Usually such lines are long and awkward to read, so programs have been created to transform them into forms more readable to humans.

b) the problem
--------------
As I pointed out in another post, relying on such statistics is dangerous, because there's no way you can draw a line of one IP = one visitor. Several of those 'visitors' are bots, or are caused by browser problems (e.g. Netscape has a race condition with a proxy). Also, sites are often cached, and only the IP of the cache shows up. There's no way to tell whether there are one, 100 or 1000 people behind a certain IP.

Cookies allow you to get around some of these problems, but not all of them, so using them is no way to ensure the condition holds.

Graphic stats have the advantage of 'looking better', and the detriment of hiding all those problems. What's it worth to have 1k requests for a document on a certain day when all it really was was a race condition in a browser (all from the same IP)? How do I know it was a race condition? I don't; I can only assume.

IMO the only figures that may tell a little of the popularity of your web site are:

- the number of different hosts served
- the ranking of your pages by popularity (but don't attach yourself to the numbers, please)
- the number of referring links from outside (i.e. search engines finding you)
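The first two of those figures fall out of the parsed log directly; a rough sketch, assuming you've already extracted the client and page of each request into plain arrays (the function names are invented for illustration):

```javascript
// Count the number of different hosts served, from the client
// field of every log line.
function uniqueHosts(clients) {
  const seen = {};
  for (const c of clients) seen[c] = true;
  return Object.keys(seen).length;
}

// Rank pages by popularity: most-requested first, top n only.
function topPages(pages, n) {
  const counts = {};
  for (const p of pages) counts[p] = (counts[p] || 0) + 1;
  return Object.keys(counts)
    .sort((a, b) => counts[b] - counts[a])
    .slice(0, n);
}
```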

There's a good article on the subject (of what statistics do and don't tell): [analog.cx...]

The above link also hosts analog, a logfile analyzer (non-graphical), which might be a solution to your problem. I don't say it is the only one, or a self-fulfilling prophecy.

Last month, the company for which I administer the site got a few good leads out of its web presence (old economy). When one good lead is enough to pay for web hosting for a year, no one talks about hits any more. They talk about projected image.

Skirril

DaveAtIFG




msg:897194
 8:01 pm on Jan 8, 2001 (gmt 0)

Thanks Skirril! Great post! A little lengthy, but a valuable read! I suspect this [analog.cx] is the document you were referring to; it's the one I found most enlightening.

Marcia




msg:897195
 12:35 am on Jan 9, 2001 (gmt 0)

>>They just started their services and they are going to work with clients some days later

Eugene, I am afraid they will have to cultivate their garden a bit more before they'll be able to yield a crop.

A product of this type needs to be 100% trustworthy, fully tested and capable of demonstrably delivering with precision on a totally consistent basis.

I'm afraid I found the software's perceived integrity somewhat compromised by several links on the site that aren't functional - particularly the highly important ones with ordering information (which should already have shown up in their own statistics).

This type of product will function for tracking potentially high dollar amounts, and the company's target market will themselves be marketers, who very well know the importance of attention to detail.

It looks promising enough potential-wise, but it still needs a bit of polishing in both presentation and development.

Marcia
