Forum Moderators: DixonJones
Please share your experiences with both these tools to help me decide.
Eagerly awaiting your responses.
For me, it could be the prettiest interface in the world, but if it doesn't have the best info/data then it is just makeup on a pig.
Best data = best program
Absolutely right. Stupid terms of use, aren't they? One version of the log analyzer is supposed to be used only on one machine's logs. If all your clients use your server, it is fine; if not, then you are technically breaking the terms.
However... if you always transfer the log files onto one machine before analyzing them, then I think that gets around it. Also, the software works on multiple logfiles regardless of server details, so it is a case of the terms of business not agreeing with the software's own functionality.
Still - it is enough for us not to use it much, even though we have got it.
Dixon.
Our scenario is that as a marketing/creative service agency we often design or market multiple client sites - often they make their own choice of where the site is hosted.
On this basis, are you saying that you would FTP the logs from different servers to your PC and then do the analysis?
I was hoping to do this 'remotely' as some of the logs are huge!
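If you do end up pulling logs down to one box, something like the sketch below is all it takes. Everything here is hypothetical - the hostnames, credentials and log paths are placeholders, not anyone's real setup - but it shows the "transfer first, analyze locally" workflow:

```python
import os
from ftplib import FTP  # standard library FTP client

# Hypothetical client servers -- hostnames, credentials and log
# paths are placeholders, not real details from this thread.
SERVERS = [
    ("ftp.client-one.example", "user1", "pass1", "/logs/access_log"),
    ("ftp.client-two.example", "user2", "pass2", "/logs/access_log"),
]

def local_name(host, remote_path):
    """Build a distinct local filename, e.g. 'ftp.client-one.example_access_log'."""
    return host + "_" + os.path.basename(remote_path)

def fetch_logs(servers, dest="."):
    """Pull each server's raw log down to one machine before analysis."""
    for host, user, password, remote_path in servers:
        with FTP(host) as ftp:
            ftp.login(user, password)
            target = os.path.join(dest, local_name(host, remote_path))
            with open(target, "wb") as fh:
                ftp.retrbinary("RETR " + remote_path, fh.write)

# fetch_logs(SERVERS)  # would download both logs into the current directory
```

For the huge-logs problem, compressing on the server before transfer (most hosts will gzip to a fraction of the size) makes this a lot less painful than it sounds.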
Maybe a code-based, real-time system like HitLinks is better. It's just the thought of having to add code to every page of a large site, and whether (and how) it handles dynamic pages.
What do you use Dixon?
Stats are becoming a real pain!
I'd add Urchin to your shortlist if I were you. If you're prepared to play under the hood - altering config files and the like - it's a very powerful bit of kit and IMHO far more accurate than WebTrends, which I've always found to be very optimistic in its traffic reporting.
It does not give some very relevant and important details that WebTrends and NetTracker do:
1. Referrals for particular pages on your site (which, in my view, is very crucial).
2. It also started showing some absurd, unbelievable figures for unique visitors.
For uniques or sessions? Urchin adds spider visits into sessions, but so does every other stats package I've used apart from AWStats. These can be filtered out.
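If your package can't filter spiders itself, you can always strip them from the raw log before it ever sees them. A rough sketch - the user-agent substrings below are just examples, real filter lists are much longer:

```python
# Rough sketch: strip obvious spider hits from a combined-format log
# before feeding it to a stats package. The substrings below are
# illustrative examples only -- real bot lists run to hundreds of entries.
BOT_SIGNATURES = ("googlebot", "slurp", "msnbot", "crawler", "spider")

def is_spider(log_line):
    """True if the line's user-agent field mentions a known bot."""
    return any(sig in log_line.lower() for sig in BOT_SIGNATURES)

def filter_spiders(lines):
    return [line for line in lines if not is_spider(line)]

sample = [
    '1.2.3.4 - - [10/Oct/2004:13:55:36] "GET / HTTP/1.0" 200 2326 "-" "Mozilla/4.0"',
    '66.249.66.1 - - [10/Oct/2004:13:55:40] "GET / HTTP/1.0" 200 2326 "-" "Googlebot/2.1"',
]
print(len(filter_spiders(sample)))  # 1 -- the Googlebot hit is dropped
```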
Either way, UTM (client-side tracking) is worth switching on which'll improve accuracy.
All the information for per-page referrals is stored by Urchin, and it should be possible to add a custom report to show it. It's something that's on my to-do list - if I manage it I'll drop you a sticky.
In some cases, tracking by IP can be inaccurate - AOL users may report multiple IPs in a single visit, while multiple users on a shared corporate connection will all report the same one.
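A toy illustration of how far IP-based counting can drift - the IPs and visitor names below are invented purely to show the effect:

```python
# Toy hits: (ip_address, cookie_id). Data is invented purely to show
# how IP-based uniques drift from cookie-based uniques.
hits = [
    ("205.188.1.1", "aol-user"),    # one AOL user whose proxy
    ("205.188.1.2", "aol-user"),    # rotates IPs during the
    ("205.188.1.3", "aol-user"),    # same visit...
    ("62.30.4.9",   "office-amy"),  # ...and two colleagues sharing
    ("62.30.4.9",   "office-bob"),  # one corporate gateway IP
]

uniques_by_ip = len({ip for ip, _ in hits})        # counts the AOL user three times
uniques_by_cookie = len({c for _, c in hits})      # one per actual person
print(uniques_by_ip, uniques_by_cookie)  # 4 3
```

Three real people, yet the IP count says four - the AOL user is overcounted and the office pair would be undercounted if either stayed silent.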
Hosted tracking solutions use JavaScript and cookies to track visitors through a site - supposedly more accurate, but reliant on the client actually having JavaScript and cookies enabled. It also involves adding tracking code to every page on the site (or linking to an external JavaScript file).
Urchin's UTM combines these two methods of tracking by using a 'bug' graphic to store the client side info in the log files (to the best of my understanding). Having this information opens up new reports in Urchin such as new and repeat visitors and client settings (colour depth, resolution) as well as making existing reports more accurate.
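The pixel trick is easy to picture: the page's JavaScript requests a tiny image with the client settings tacked onto the query string, so they end up in the ordinary access log where the analyzer can read them back out. The parameter names below (`res`, `depth`, `vid`) are my own invention for illustration - they are NOT Urchin's real UTM query format:

```python
from urllib.parse import urlparse, parse_qs

# A made-up pixel request as it might appear in an access log.
# The parameter names (res, depth, vid) are illustrative only --
# they are NOT Urchin's actual UTM query format.
request = "/__tracker.gif?res=1024x768&depth=32&vid=new"

# The log analyzer side: pull the client settings back out of the URL.
params = {k: v[0] for k, v in parse_qs(urlparse(request).query).items()}
print(params["res"], params["depth"], params["vid"])  # 1024x768 32 new
```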
In theory, this should be the best of both worlds, and should offer very accurate tracking. Urchin's results certainly seem close to AWStats, which I've always found to be very good.
The major downside to Urchin's UTM is the size of the Javascript file you need to include on each page. 19K IIRC, which I'd rather not be adding to my page weight if I can help it.
Setup is fairly straightforward and is well documented in the Urchin manual.