Forum Moderators: DixonJones
just my $0.02 :-)
sandy
Must admit, I've heard some good things about it though. Personally, I use WebTrends and NetTracker. What's the best? Depends what you're after. If it's serious data mining, visitor profiling, and e-commerce metrics (conversions, ROI, etc.), NetTracker rules.
they are happy which makes me happy too :-)
I have a problem I hope y'all can help me with.
I have to get a much better stats program. My head tech guy says one of our sites generates a log file about 1 gig in size over a month's time. Programs he's tried can take an hour or more to sort through the data.
I need to find a stats program with a capable log file analyzer that can rip through this stuff QUICKLY.
Recommendations? Thank you.
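For what it's worth, most of the time in a big log run goes to per-line parsing and reverse-DNS lookups, not raw disk I/O, so a single streaming pass is usually all a fast analyzer needs. As a rough illustration (not how any of the products in this thread actually work), here's a minimal JavaScript sketch that tallies requests per URL from Common Log Format lines; the field layout is an assumption:

```javascript
// Minimal single-pass tally of requests per URL from Common Log
// Format lines. Illustrative only - the field layout is assumed,
// and real analyzers add sessions, referrers, DNS caching, etc.
function tallyRequests(lines) {
  var counts = {};
  for (var i = 0; i < lines.length; i++) {
    // CLF request field looks like: "GET /path HTTP/1.0"
    var m = lines[i].match(/"(?:GET|POST|HEAD) (\S+)/);
    if (!m) continue;
    var url = m[1];
    counts[url] = (counts[url] || 0) + 1;
  }
  return counts;
}
```

In practice you'd feed this line by line from a file stream rather than holding a 1 GB log in memory, which is why one pass can finish in minutes.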
We use Urchin... It's unbelievably fast... I don't like to post stuff that sounds promotional, but some of the stats from their site fit your example very well:
Using its new world's-fastest DNS module, Urchin will process 1 GB of logfiles and perform reverse-DNS on every visitor in about 10 minutes
I really can't understand why anyone would pay for anything related to log analysis, when analog is available.
[analog.cx...]
For super-speedy DNS lookups: "23.6Meg log file in 18 minutes"
Try this freebie:
[summary.net...]
I use it to process log files before sending them to analog.
On the downside, FastStats requires setting up a different report for every time period (yesterday, past 7 days, last month, or whatever) and it doesn't have a category for robot/crawler/spider visits.
I like Sawmill's calendar feature (just look at the calendar and highlight the day, week, or month you want), but waiting for some of the pages to display can be annoying.
FunnelWeb's free version is worth trying, too. And Wusage has one feature that I like (being able to see a list of days, weeks, etc. with traffic totals displayed alongside), but it seems very slow when working with large logfiles.
For what it's worth, I now use three tools:
1) WebTrends Live Personal Edition to see what's happening during the day (albeit with questionable accuracy);
2) FastStats for quick, easy-to-read summaries of what happened in the previous 24 hours, including top pages, top referrers, and referrers for specific pages;
3) Sawmill for robot/crawler/spider visits, for viewing traffic over time or on comparable days (e.g., how yesterday compared with the four previous Mondays), and for examining various types of data when I have time to dig around.
I'll be looking into some of the other products mentioned in this thread, but I think WebTrends is probably the industry standard for most hosting companies. Is that correct?
John316, are any versions of DNSTran compiled for Windows? They all look to be for Mac or Unix.
WebTrends (the corporate white-bread logger): overpriced, underpowered, too complicated to do simple stuff that other loggers do in a single click. Charts are perty, but I've yet to see a logger chart that was worth using for anything more than impressing clients (shh, don't tell - it's one of those agreed-upon little lies).
FastStats (the ISP logger): currently using it. Some things are great, others poor. Referrals are not its strong suit. Strengths: fast, reliable, good charts. I sure can't see spending more money on WebTrends when FastStats will do the same job faster and, in some cases, better.
Analog (joe user): nice, but very slow on some systems. It doesn't do referrals justice. Strengths: reliable, open source, and portable.
What's next?
One thing I must say is that regardless of which traffic / log analyzer you are using, you are only getting estimates.
The dirty little secret nobody tells you about is that many hits go uncounted, due to a little thing called caching. Caching comes into play at many points: it may be that the service provider is caching your pages and serving their users a copy already downloaded to their servers, or the user may typically have their browser cache on as well.
What does this mean?
You're missing a lot of data! When a user is on one of your pages and decides to go back a step, instead of downloading that previous page again, the user's computer simply reloads the version it already has! Take this one step further: if one user of a giant ISP downloads a page, that ISP's server may also cache it, hoping that someone else using their service will need that page. You will never know about that person seeing your website. You have no accurate way of tracking people moving through your site if they're using the back button (and who doesn't?).
There are solutions, the best being (in my humble opinion) a company called BellaCoola, which solves this problem by inserting a bit of JavaScript code into your pages that is uncacheable. With this they then create their own logfile for your site and record only the data you wish to record (e.g., only HTML pages rather than GIFs). You can then process this logfile with your own software (WebTrends... Analog).
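To make the idea concrete, here's a hypothetical sketch of the uncacheable-beacon trick: a random query string makes every request URL unique, so browsers and proxies can't serve it from cache, and every page view (Back-button revisits included) reaches the logging server. The logger hostname and parameter names below are made up for illustration; BellaCoola's actual snippet will differ.

```javascript
// Hypothetical beacon URL builder. "logger.example.com" and the
// parameter names are illustrative, not any vendor's real endpoint.
function beaconUrl(page, referrer) {
  // The random token makes each URL unique, defeating all caches.
  return "http://logger.example.com/hit.gif" +
         "?page=" + encodeURIComponent(page) +
         "&ref=" + encodeURIComponent(referrer) +
         "&nocache=" + Math.random();
}

// In the page itself, the snippet would request an invisible image:
//   new Image().src = beaconUrl(location.pathname, document.referrer);
```

The logging server then records each hit.gif request, producing a logfile of actual page views rather than whatever fraction of requests survived the caches.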
This is a great explanation of web caching:
[analog.cx...]
-meannate
We have a custom system which dumps every visitor event - page access, button click, etc. - ultimately into an Access database, but it's clunky. Is there a reasonably-priced commercial equivalent?
Chris
When looking for Wusage as posted here, Google AdWords shows a link to 123LogAnalyzer. I checked it against the others and it's cheaper and offers a 25-day trial version. (No, I am not in any way associated with 'em ;-) - but anyway, I just tried the trial and it's pretty damn good.
I have used Analog, but I'm one of "those" (read: lazy) people who isn't all that hot at hand-coding stuff.
Anyone else tried 123LogAnalyzer? I'm going to run a few more test reports, but it looks the goods - like Analog and a report writer combined, with some nice DNS lookup etc.
Cheers
Troppo
(edited by: troppo at 2:49 am (utc) on April 18, 2002)
I agree with bill - dunno about fast, but lots of info for a not-too-slow service. I live by them now. Always insist that my client pays for them! And life is getting better :)