Has anybody had any other successes? And can anyone explain the advantages/disadvantages of a pixel-based system rather than logs?
From what I have heard, logs are more accurate, but can anybody give me reasons why?
Logs and stats are my 'other hobby' (golf first when in season). With real server logs, when someone pulls a page, regardless of agent or client, it is recorded. If an agent connects, it is recorded.
When a graphic-based counter is pulled, it is only pulled when the client requests it. That means non-graphical agents (like search engine spiders) are out. It also means you can be a greater victim of proxy caching.
If set up properly, graphic-based counters can actually give you better data in a couple of instances. One is with caches, where you can override the cache with some sort of js or dynamic URL (counter.gif?day=monday). If you are using just log files, you won't see that pull, as the proxy cache will cache your static pages.
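A minimal sketch of that dynamic-URL idea in js (the helper name and the `t` parameter are my own, illustrative choices, not any particular counter product):

```javascript
// Build a counter-image URL that a proxy cache won't serve from storage.
// A changing query string (here, the current timestamp) makes every
// request unique, so the pull reaches the counter's server each time.
function makeCounterUrl(base) {
  return base + '?t=' + Date.now();
}

// In a page you might then write something like:
//   document.write('<img src="' + makeCounterUrl('/counter.gif') +
//                  '" width="1" height="1">');
```

Any per-request value works in place of the timestamp; the point is only that no two requests share a URL.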
The other similar, but rarer, instance is that of full-page caches like Google's and a few "behind the walls" secure sites, where you normally can't see who has pulled your files.
All-in-all, logs still win. I often run a combo of 4 counters (js-based, graphic-based, logs, and server side includes). The amazing thing is to compare the differences between them - fascinating.
I agree completely with Brett, the combo method works best.
Just an added extra with the img src script trick: if you always use the same name, e.g. counter.gif?day=monday, not all caches honour the expiring-content headers and the "?". What I have found works best is to use a random name generated by js, e.g. 7863576354xxxwuwyetruwye.gif?day=monday, then use a custom 404 error script to check for the xxx and post the stats to the DB, followed by the redirection to the gif.
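The client-side half of that trick might look like this (a sketch based on the description above; the "xxx" marker and the day parameter are taken from the example, everything else is assumed):

```javascript
// Generate a never-repeating gif name containing the "xxx" marker
// that the server's custom 404 script watches for.
function makeTrackedGifName(day) {
  // Random digits, the marker, then random letters/digits so no
  // two requests ever share a URL (and so no cache can reuse one).
  var digits = String(Math.floor(Math.random() * 1e10));
  var letters = Math.random().toString(36).slice(2, 12) || 'x'; // guard: never empty
  return digits + 'xxx' + letters + '.gif?day=' + day;
}
// Server side (not shown): the 404 handler spots "xxx" in the requested
// name, records the hit in the DB, then redirects to the real counter gif.
```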
Hope this helps..
Because of proxies, transparent or not, the real IPs are shielded. An entire network (ISP) may surf off the same IP, hence session stats go pear-shaped.
Dial-up users are assigned a different IP every time they dial up to their network, so checking return visits becomes impossible with just IPs.
a) the basics
Every request to your site is logged; such logs usually include:
- a timestamp
- an originating IP (i.e. of the requesting client)
- a referring page/site (or "-" if none)
- the size of the transfer
- the numeric status of the transfer (e.g. 404/not found)
- the HTTP protocol version
- the page requested
Usually, such lines are long and awkward to read, so programs have been created to transform them into forms more readable to humans.
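For a concrete picture, here is a rough parser for one line of the common Apache-style log format, which carries roughly the fields listed above (the sample line and field names are illustrative, not from any specific server config):

```javascript
// Pull the usual fields out of an Apache combined-format log line.
// Returns null if the line doesn't match the expected shape.
function parseLogLine(line) {
  var m = line.match(
    /^(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) (\S+)" (\d{3}) (\S+)(?: "([^"]*)")?/
  );
  if (!m) return null;
  return {
    ip: m[1],                               // originating IP
    timestamp: m[2],                        // timestamp
    method: m[3],                           // e.g. GET
    page: m[4],                             // the page requested
    protocol: m[5],                         // HTTP protocol version
    status: Number(m[6]),                   // numeric status (e.g. 404)
    bytes: m[7] === '-' ? 0 : Number(m[7]), // size of the transfer
    referrer: m[8] || '-'                   // referring page/site, or "-"
  };
}
```

A stats program is essentially this, run over millions of lines, plus the grouping and pretty-printing.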
b) the problem
As I pointed out in another post, relying on such statistics is dangerous, because there's no way you can draw a line of one IP = one visitor. Several of those 'visitors' are bots, or are caused by browser problems (e.g. Netscape has a race condition with a proxy). Also, sites are often cached, and only the IP of the cache shows up; there's no way to tell whether there's one, 100 or 1000 people behind a certain IP.
Cookies allow you to get around some of the problems, but not all of them, so using them is no way to ensure the condition holds.
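The cookie approach, roughly: hand each new browser a persistent visitor ID so return visits can be recognised even when the IP changes. A sketch of building such a cookie header (the `visitor` name and 30-day lifetime are arbitrary assumptions):

```javascript
// Build a Set-Cookie-style value carrying a per-browser visitor ID.
// Only the browser that received it will send it back, regardless of
// what IP it dials in from next time.
function buildVisitorCookie(visitorId, days) {
  var expires = new Date(Date.now() + days * 864e5).toUTCString();
  return 'visitor=' + encodeURIComponent(visitorId) +
         '; expires=' + expires + '; path=/';
}
```

It still breaks down when cookies are disabled, expired, or when several people share one browser, which is why it closes only some of the gaps.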
Graphic stats have the advantage of 'looking better' and the detriment of hiding all those problems. What's it worth to have 1k requests for a document on a certain day when all it really was was a race condition in a browser (all from the same IP)? How do I know it was a race condition? I don't; I can only assume.
IMO the only figures that may tell a little about the popularity of your web site are:
- the number of different hosts served
- the ranking of your pages by popularity (but don't attach yourself to the numbers, please)
- the number of referring links from outside (i.e. search engines finding you)
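The first of those figures is simple to compute yourself: count distinct client IPs across the log, with all the proxy/dial-up caveats above still applying. A minimal sketch (assuming common-format lines where the IP is the first field):

```javascript
// Approximate "different hosts served" by counting distinct first
// fields (client IPs) across an array of raw log lines.
function countDistinctHosts(lines) {
  var seen = {};
  var count = 0;
  for (var i = 0; i < lines.length; i++) {
    var ip = lines[i].split(' ')[0];
    if (!seen[ip]) {
      seen[ip] = true;
      count++;
    }
  }
  return count;
}
```

Remember this undercounts (many people behind one proxy IP) and overcounts (one dial-up user across many IPs) at the same time; treat it as a trend line, not a headcount.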
There's a good article on the subject (of what statistics do tell and don't tell): [analog.cx...]
The above link also contains analog, a logfile analyzer (non-graphical) which might be a solution to your problem. I don't say it is the only one, or a self-fulfilling prophecy.
Last month, the company for which I administer the site got a few good leads out of the web presence (old economy). When one good lead is enough to pay web-hosting for a year, no one talks about hits any more. They talk about projected image.
Eugene, I am afraid they will have to cultivate their garden a bit more before they'll be able to yield a crop.
A product of this type needs to be 100% trustworthy, fully tested and capable of demonstrably delivering with precision on a totally consistent basis.
I'm afraid I found the software's potential integrity somewhat compromised by several links on the site that aren't functional - particularly the highly important ones with ordering information (which should have already shown up in their statistics).
This type of product will function for tracking potentially high dollar amounts, and the company's target market will themselves be marketers, who very well know the importance of attention to detail.
Looks promising enough potential-wise, but it still needs a bit of polishing in both presentation and development.