I know that it isn't a lot of people's favorite, but currently we are using WebTrends, which is working pretty well for us.
It does everything you mentioned above, and more, and it does it well.
I can't really say that it is my favorite as I haven't used any others yet. I would like to hear of some others that have given good results though.
I use WebTrends too and find it a great tool, just a bit heavy on my flexible friend
>>>>>just a bit heavy on my flexible friend
Are you talking about your server?
If so, I have had complaints here about the same
No, I was talking about the price tag for the program
I was looking at WebTrends, as well, but was turned off by the arrogance of the sales drone I talked to who seemed to have a mighty big chip on his shoulder regarding the open-source movement *grin*.
I use the free software "Webalizer" to track our logs, though it doesn't do some of what is being asked for here (like tracking spiders, though they do show up in other ways...)
I haven't used the latest version of Webalizer, and it may do these things now. It can be found at [mrunix.com...]
For tracking our internal website I use an application called SurfStats, which wasn't free, but a good deal less than WebTrends. It has some annoying quirks, but it does get the job done. The downloaded log files are stored in a db, however.
I use two programs: Accessprobe, which is good for getting traffic 'snapshots' of 1-2 days' time, and WebLog, which tracks longer-term stats. WebLog is free (but must be run from the command line or a cron job), and Accessprobe is cheap.
Something to be aware of: you can run several log analyzers. I run three every night, and might try out a couple more. Each gives slightly different information. So between them you should be able to cover everything you want analyzed.
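For what it's worth, the nightly runs are nothing fancy - just a small driver script kicked off from cron that points each tool at the same raw log. A rough Python sketch (the tool names, flags, and paths here are placeholders from memory, so check them against whatever analyzers you actually run):

# Nightly driver: run several analyzers over the same access log.
# The commands below are only examples - substitute your own tools.
import subprocess

LOG_FILE = "/var/log/httpd/access_log"   # assumed log location

ANALYZERS = [
    ["webalizer", "-o", "/var/www/stats/webalizer", LOG_FILE],
    ["analog", "+O/var/www/stats/analog.html", LOG_FILE],
]

for cmd in ANALYZERS:
    # Each tool reads the same raw log and writes its own report.
    subprocess.run(cmd, check=False)

Schedule that from cron in the small hours and each morning you have several slightly different views of the same traffic.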
This sounds like a good idea, but there is one thing that comes to mind.
I am running about 20-30 reports on the server, so if we were running 3 or 4 reports for each of the 20-30, I think the server would have a fit. I don't think the guys in that department would even permit me to do that.
Have you had any problems like this, or am I wrong about it eating up the server?
Depends on the program. One program runs in a couple of minutes, another takes over an hour since it tries to resolve all unresolved IP addresses each night. Since I own the server, it isn't a problem for me! :) If you download the log files, it shouldn't ever be a problem.
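The slow one is slow purely because of the reverse DNS pass - it's basically one lookup per unique address, and the ones that never resolve are the ones that eat the time. A hypothetical sketch of what that pass amounts to:

import socket

def resolve(ip, cache={}):
    # One reverse DNS lookup per unique IP. Addresses with no PTR
    # record time out slowly - that's what stretches the run to an hour.
    if ip not in cache:
        try:
            cache[ip] = socket.gethostbyaddr(ip)[0]
        except OSError:
            cache[ip] = ip   # leave it unresolved
    return cache[ip]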
I swear by Traxis! Very configurable and the price is right. :)
I've had success with NetTracker Pro -- it's quick, customizable and easy to export. I tried WebTrends, but it took too long to generate reports. Maybe I'm just the crazy one, but I like to peek at my site stats each day. :)
As someone who sees both WebTrends and a proprietary report daily, I'd have to agree with Xoc and say that running multiple programs is the most important part of tracking. I've seen WebTrends give extraordinarily different reports on the same log file, depending on the way it was set up...
Hey Krissie, welcome to WmW.
I like WebTrends, but I've never tried other log analysis programs. It allows you to schedule your reports, which is nice. It also does link checking, checks for and notifies you of server/router failures, and other useful things.
Little known and free Relax [ktmatu.com] is very good at tracking search engine traffic. I don't think Traxis is still available, is it boss?
My comments: WebTrends is easy to use, and generates reports and graphs that are easy to explain to a non-technical client. However, it does not let you drill down through the data. What you see is ALL you get. So, if you see that Googlebot looked at 40 pages, and you want to know which pages they were and when they were fetched, you are out of luck. You could re-create your reports with a special filter to get this info, but that's too tedious.
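(If you do get stuck needing that Googlebot answer, you can pull it straight out of the raw log in a few lines - a quick-and-dirty sketch, assuming the standard combined log format and the usual Googlebot user-agent string:)

# Which pages did Googlebot request, and when? WebTrends won't tell
# you, but the raw access log will. Assumes combined log format.
with open("access_log") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        request = line.split('"')[1]                  # e.g. GET /page.html HTTP/1.0
        timestamp = line.split("[")[1].split("]")[0]  # request date/time
        print(timestamp, request.split()[1])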
NetTracker Pro, on the other hand, is impressive in its drill-down capability. You can start at the 50,000 foot level and drill all the way down to looking at an individual visitor's path through your site. Custom reports are fairly easy to create and save for automatic generation. Exporting data for analysis in Excel or Access is quick and easy.
FastStats is one I haven't used in a while. It is really fast, but the reporting has some limitations that made me discard it.
One other factor when you are looking at pricing - be sure to check how multiple domains are handled. Neither WebTrends nor NetTracker is an unlimited-use program. Each software vendor has its own approach to handling more than one domain, and you should be sure you price out all the options.
I like Summary for lots of details and Happy Log for quick 'n dirty.
Analog is free and does a very nice job. You can get it here
I have used many different software packages - including most versions of WT - and even been involved in the production of built-in stat tools for content management systems - all based on web server monitor logs, that is, the logs we are talking about here...
But I am soon going to turn completely to Network Packet Sniffing instead. I am working with a local company that has made a box that can sniff and produce true live stats for me like nothing I have ever seen before. It is amazing. I can get all the numbers I want in TRUE real time (no delay at all besides the time it takes to refresh the stat page).
I am working with the company to design new reports. In the first version of the box it will just be standard reports (like WT), but in the second edition users can set up their own reports. If you have ANY ideas of what kind of reports you - as SEOs - want then let me know, and I'll have them made :)
Besides the real-time stats there are a number of reasons I am turning to Network Sniffing and away from server monitor logs.
1) It is much easier to set up and maintain. I (or my clients) save costs here.
2) When using Network Sniffing I can turn off logging on the server. Testing has shown that logging takes about 20% of the resources on NT and 15% on Apache. That is A LOT of power to save. In a cluster with 10 NT servers you save two!
3) When turning off logging on the server there is a much lower risk of the servers crashing (especially true when it comes to NT! - hehe)
4) I get a lot more useful information than I do with server monitors. E.g.:
- Stop requests (each time a user hits the STOP button I can see that - you can't with server logging)
- The user's REAL connection speed to the server (I can even serve different content or pages depending on that connection speed - in real time!)
- True time taken - the complete time it takes for a request to be processed - not just the time it takes for the server (as it is with server logging).
And I could keep on ... hehehe
There is just no reason whatsoever for me to stay with server logging :)
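I can't show the box's internals, of course, but the basic idea is nothing magic - listen on the wire instead of asking the web server to write a log. A very rough sketch of the concept (using the scapy library purely as an illustration, not what the box actually runs):

# Watch port 80 on the wire and print HTTP request lines as they
# arrive - no server-side logging involved at all. Needs root.
from scapy.all import sniff, Raw, TCP

def show_request(pkt):
    if pkt.haslayer(TCP) and pkt.haslayer(Raw):
        payload = bytes(pkt[Raw].load)
        if payload.startswith((b"GET ", b"POST ", b"HEAD ")):
            print(payload.split(b"\r\n", 1)[0].decode(errors="replace"))

sniff(filter="tcp dst port 80", prn=show_request, store=False)

The real box does a great deal more (timing, connection speed, stop requests) and does it without loading the web server, but that's the principle.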
I've run into a few people that are doing the packet sniffing approach to logging too. Like you, Mikkel, they are quite impressed with it. It just makes sense to offload as much server work as possible. The problem is the software is pretty expensive from what I've seen - and you have to be in control of the router or load-sharing box. That's not really feasible for most folks.
Yes, Brett, you are right about that. It is a little pricey ;) If you just have a small website with a few megs of logs a day, then network packet sniffing will most likely be overkill. But then again, just because you don't actually _need_ a Porsche, who wouldn't love to drive one? - hehe
I'd like to know the best FREE logging/tracker software out there... I'm not willing to pay hundreds for a program... I don't need advanced features right now.
I like the free HitBox. It's a little slow on a dial-up but it works and there are many nice reports. The uptime seems OK too.