Forum Moderators: DixonJones
I wrote a paper a while ago for my current employer regarding different methods of tracking/logging.
Currently we use log files that are parsed every 24 hours. To summarise, this method is not suited to our site at all.
We have limited time to introduce a new service (the current WebTrends service provided by a supplier is going away soon), so I have been looking for something appropriate.
Tagging won't help me much, since I don't have the time or resources to tag all of our static pages, change text/code in the database, or change code in our CMS.
I found a product from RTMetrics that appears to sit on the network and listen to traffic. I am assuming that I would set up some rules and the appliance would start logging and analysing traffic.
This idea appears to be perfect - does anyone else have experience of such a device?
Tagging won't help me much, since I don't have the time or resources to tag all of our static pages, change text/code in the database, or change code in our CMS.
That shouldn't be an issue. There are small utilities you can get that will do a find and replace on a block of text across every HTML page on a site. Some systems even come with a tool to do this; I think Hitslink and Hitbox are two. Obviously, back the site up properly first!
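Just to illustrate what those bulk find-and-replace tools are doing, here is a rough Python sketch. The tag text and directory path are made up for the example, not anything vendor-specific, so treat it as a starting point only.

import pathlib

# Hypothetical tracking snippet - replace with whatever tag your vendor supplies.
TRACKING_TAG = '<script src="/stats/tracker.js"></script>'

def tag_site(root_dir: str) -> None:
    """Insert TRACKING_TAG just before </body> in every HTML file under root_dir."""
    for path in pathlib.Path(root_dir).rglob("*.html"):
        html = path.read_text(encoding="utf-8", errors="replace")
        if TRACKING_TAG in html:
            continue  # already tagged, skip it
        if "</body>" in html:
            html = html.replace("</body>", TRACKING_TAG + "\n</body>", 1)
            path.write_text(html, encoding="utf-8")

if __name__ == "__main__":
    tag_site("/var/www/site")  # example path - and again, back up before running anything like this

The same idea works with sed or the batch tools mentioned above; the point is that tagging every static page is usually a one-off mechanical job rather than a hand edit of each file.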
Network "sniffing" is what systems like Hitwise tend to use, as it is not website-centric so can aggregate many different sites, but I have no experience of whether they are especially good or not for tracking an individual site. My guess is that sniffing will produce the same issues that log file only analysis does, particularly in making the incorrect assumption that two requests from the same IP constitutes on unique person. The "sniffer" can't lay a cookie down to better differentiate unique users. I would also giess that you are going to have to use thougtful rules to count page views correctly. Do you call a PDF file a pageview? those kinds of questions.
Dixon.