Forum Moderators: phranque
My favorite is NetTracker Pro. It does extensive reporting, lets you set up and save custom reports, and, best of all, lets you drill down through the data. If you find, say, that Googlebot is listed on your robot summary with 5 visits and 100 pages requested, you can click on the 5 for the details of those 5 visits. If you find that one of the visits was yesterday and that 34 pages were requested, you can click on the 34 and get a detailed list. Most of the reports have this kind of capability. The basic product, though, works on one domain at a time. Upgrades for multiple-domain use are available. Graphs are produced on demand, which is a bit slower than WebTrends' prepackaged graphs, but I find that most of the time I'm working with the tables anyway.
FastStats is a very quick log cruncher with rather spartan-looking results tables. It doesn't track spider visits, as I recall from testing it a year ago; perhaps they have fixed this by now. I don't like it as well as the other two, but it is fast...
There was also a perl script log tool kicking around this site or SEW, but I didn't hunt up the URL.
If you can afford it, go with NetTracker...
You run it off your PC and point it to the server log file either on a remote server or locally on your hard drive.
You can run as many sites as you want, and it has quite a few options on what to include or exclude from the report.
The report is HTML, so you can customize it however you wish after it is generated.
GREAT! GREAT! marketing tool!
I created February reports for all my clients and sent them the reports via e-mail. 15% of them called me to do more work for them within 24 hours!
This was a great investment.
The first is Microsoft SiteServer Express's analysis package. This is part of the NT 4.0 Option Pack for NT Server, although it should work on log files created by other web servers as well. Very configurable.
The second is Analog. An open source freeware download. Very configurable. Works on all log files.
The third is Weblog. A set of perl scripts for analyzing log files.
You can see samples of the possibilities of these three here: [986faq.com...] .
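None of these tools' internals are shown here, but the core job they all share — crunching a server access log into per-visitor counts, like the robot summary mentioned above — can be sketched in a few lines. This is a minimal illustration, not any of the tools' actual code; the log lines and the `spider_page_counts` helper are made-up examples assuming the common combined log format:

```python
import re
from collections import Counter

# Combined Log Format:
#   host ident user [time] "request" status bytes "referer" "agent"
LOG_RE = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"$'
)

def spider_page_counts(lines, marker="Googlebot"):
    """Count pages requested, per host, by clients whose
    user-agent string contains `marker` (a hypothetical helper)."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and marker in m.group("agent"):
            counts[m.group("host")] += 1
    return counts

# Fabricated sample log lines for illustration only:
sample = [
    '66.249.66.1 - - [01/Mar/2002:10:00:00 +0000] "GET / HTTP/1.0" 200 1024 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/Mar/2002:10:00:05 +0000] "GET /about.html HTTP/1.0" 200 512 "-" "Googlebot/2.1"',
    '10.0.0.7 - - [01/Mar/2002:10:01:00 +0000] "GET / HTTP/1.0" 200 1024 "-" "Mozilla/4.0"',
]
print(spider_page_counts(sample))  # Counter({'66.249.66.1': 2})
```

A full analyzer adds sessionizing (grouping requests into "visits"), report templates, and drill-down links, but the parsing-and-tallying loop above is the common core.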