Forum Moderators: DixonJones
With on-page tracking, an external company does the analysis for you. You may still have logfiles, but they compile the reports for you. You only get the reports you pay for - usually monthly. They usually can't run historical reports for you. They might even claim ownership or copyright on the reports. Plus, you give them access to all your data: they know which spider visited you when, and from what referrers and keywords your visitors came.
What you pay is usually significantly less than for a full-blown log analyzer. But you hand your data over, and over a longer period of time buying a standalone analyzer might be cheaper.
I would agree that a whole lot of your decision should rest on the software you plan to use with either method and how you expect to be using it. What PMK said is important - for the on-page method, almost all of the packages (usually ASP services) just don't have a way to make a change in your filtering or reports or segments and then go back and re-analyze historical data. Or, if they do, there's an extra charge, and sometimes consulting fees. So if you're going to take the plain vanilla reports and never do anything historical like that, the on-page method would be just fine and in some ways a lot superior to logs.
There are exceptions to this. Some products use page tagging yet have powerful re-analysis abilities, even when done as a service. At least one allows you to analyze both kinds of data with the same product, or to migrate back and forth. And a couple now offer smart exports to Excel where pivot tables and such can do a virtual, but limited, re-analysis.
[edited by: cgrantski at 12:56 pm (utc) on April 15, 2004]
But I can think of a couple companies that have explicitly said (in small print) that this is part of the deal. I know that one of them got in a lot of trouble a couple years ago about not being completely upfront about this, resulting in a high-profile lawsuit or investigation or something.
The main benefit of log analysers is that no modifications to pages are required, and they are still useful if all you have are server logs. And sure, a log analyser is probably cheaper in the long run. That said, server logs are unfortunately designed to monitor file usage rather than visitors. Mainly because of caching (of many different types) and proxy servers, a wide margin for inaccuracy can creep into reports based on log files (try comparing figures from a few of the major products sometime!). If accuracy is a concern, then you need to be very wary of some of the numbers resulting from log analysis.
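To make the "file usage rather than visitors" point concrete, here's a minimal sketch of the core of a log analyser: parsing Apache combined-format lines and tallying requests per path. The sample lines are invented for illustration - note it counts file requests, including spiders and stylesheets, and anything served from a cache never shows up at all.

```python
import re

# Invented sample lines in Apache "combined" log format.
LOG_LINES = [
    '1.2.3.4 - - [15/Apr/2004:12:00:00 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/4.0"',
    '1.2.3.4 - - [15/Apr/2004:12:00:05 +0000] "GET /style.css HTTP/1.1" 200 980 "-" "Mozilla/4.0"',
    '5.6.7.8 - - [15/Apr/2004:12:01:00 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Googlebot/2.1"',
]

# ip, two ident/user fields, [timestamp], "METHOD path protocol", status
LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d+)'
)

def count_hits(lines):
    """Tally requests per path - hits, not visitors."""
    hits = {}
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            path = m.group('path')
            hits[path] = hits.get(path, 0) + 1
    return hits

print(count_hits(LOG_LINES))
# {'/index.html': 2, '/style.css': 1} - the spider and the CSS file
# are counted just like a human pageview, and cached views are absent.
```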
On-page trackers do have disadvantages, such as slightly increased initial page load and difficulties in tracking files like CSS or external JavaScript (or things like 404 errors). However, ownership of data and historical analysis are not amongst them - they are certainly a disadvantage of some of the systems out there, but not of on-page tracking itself.
In my experience, on-page trackers are far more effective at identifying unique and returning visitors, and can also have handy features like conversion tracking built into the system. Comparison tests I've run always favour on-page in terms of counting accuracy.
There's a time and a place for both methods, but given a choice I always go for on-page tracking.
By way of example, consider caching by ISPs (who do this to save bandwidth). If 2 visitors from the same ISP go to your site, the second will sometimes be served a cached version of the page, which is invisible to your log files since it is served from an entirely independent server. Good on-page tracking gets around the vast majority of caching issues since the script is still run just as if the page was actually served from your server.
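That caching scenario can be simulated in a few lines (the visitors and server names below are invented). Visitor B's page comes from the ISP cache, so the origin server never logs the request - but the on-page script still runs in B's browser and fires a beacon:

```python
# Toy simulation of the ISP-cache scenario: two visitors, one page.
requests = [
    {"visitor": "A", "served_from": "origin"},
    {"visitor": "B", "served_from": "isp_cache"},  # cache hit, never reaches origin
]

# The server log only records requests that actually hit the origin.
log_hits = sum(1 for r in requests if r["served_from"] == "origin")

# The on-page script runs wherever the HTML is rendered, cached or not.
beacon_hits = len(requests)

print(log_hits, beacon_hits)  # 1 2 - the log undercounts, the tag does not
```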
The proxy issue is slightly different. Some ISPs (most notably AOL, but this also includes many internal business networks etc.) use proxy servers to fetch pages. This can have effects ranging from many visitors sharing the same IP address (which equals the same visitor to most log analysers) to one visitor being assigned a number of different IP addresses per visit (one visitor shows up as many different ones to the log analyser). Again, by using cookies and other methods, a good on-page tracker can get around the vast majority of proxy issues.
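Here's a small sketch of both proxy effects at once (the hits are invented): three people behind one AOL-style proxy share an IP, and one of them rotates to a second proxy IP mid-visit. Counting unique IPs, as a log analyser typically does, gets the visitor count wrong; counting unique cookies, as a tag-based tracker can, gets it right:

```python
# Invented hits: three real people (cookies u1, u2, u3) behind proxies.
hits = [
    {"ip": "10.0.0.1", "cookie": "u1"},
    {"ip": "10.0.0.1", "cookie": "u2"},  # same proxy IP, different person
    {"ip": "10.0.0.1", "cookie": "u3"},  # same proxy IP, different person
    {"ip": "10.0.0.2", "cookie": "u1"},  # same person, rotated to a new proxy IP
]

by_ip = len({h["ip"] for h in hits})          # what IP-based log analysis sees
by_cookie = len({h["cookie"] for h in hits})  # what cookie-based tracking sees

print(by_ip, by_cookie)  # 2 3 - the cookie count matches the 3 real visitors
```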
Some kind of cookie/log analyser combination can help a lot; however, as I said above, logfiles weren't designed for this, whereas on-page tracking can be built with visitor tracking in mind from the ground up.