Forum Moderators: DixonJones
I have searched Webmasterworld on this subject and read this thread:
[webmasterworld.com...]
I thought it would be great to update the situation. It seems a lot has changed in this market.
What would be the best choice for a webmaster who wants to analyse the logs of several customers' websites?
Would it be best to download their logs and analyse them locally? What would be the best choice(s) today?
--
Alexey Stcherbic
[edited by: engine at 7:25 am (utc) on Aug. 23, 2002]
[edited by: mark_roach at 7:43 pm (utc) on Aug. 24, 2002]
Any advice for someone wanting to track hits to pages on a retail site? We are not looking to create reports for clients... we want reports for ourselves that will help us determine which advertising brings people to our site who actually buy.
Also, to see which search engines are bringing people to our site who actually buy.
There is a possibility we could put software on our server.
Ideally we're looking for something that, whether hard or easy to set up, would work semi-automatically if not automatically once it was in place.
We would be working with 3-6 of our sites.
To summarize: main goal is to track those who actually buy.
Thank you,
Stephen Wick
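For what it's worth, the "which referrers bring buyers" question can be answered from plain Apache combined-format logs. Here's a minimal Python sketch; the purchase-confirmation path `/order/confirm` is a placeholder for whatever URL your checkout actually hits, and it counts raw purchase hits rather than full sessionized visits, which a real analyzer would do.

```python
import re
from collections import Counter

# Matches the Apache "combined" log format; captures request path and referrer.
LOG_RE = re.compile(
    r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)[^"]*" \d+ \S+ "([^"]*)"'
)
PURCHASE_PATH = "/order/confirm"  # placeholder -- use your real checkout URL

def buyers_by_referrer(lines):
    """Count purchase-page hits grouped by the referring host."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        path, referrer = m.groups()
        if path.startswith(PURCHASE_PATH):
            # "http://host/..." -> host; anything else counts as direct traffic
            host = referrer.split("/")[2] if referrer.startswith("http") else "(direct)"
            counts[host] += 1
    return counts
```

Feed it a day's log and the top entries show which engines and ads are sending actual buyers rather than just visitors.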
In recent months I've been testing Summary and have become pretty attached to it. It has a lot of customizable features and a ton of pre-configured reports that are downloadable in text or spreadsheet form. They also have one of the best online tutorials I've seen on web analytics. I haven't been able to get clearance to install it on any of my clients' hosting accounts, so I can't speak to performance on a server, but as an analysis tool for downloaded logs I really like it. Speed is pretty decent too.
Due to feedback at WebmasterWorld I'm also giving FastStats a try. Still trying to sort out their approach to filter includes/excludes and I'm still waiting to see the "fast" part. DNS lookups have been running for several hours (as I speak), but I'm assuming that's something that's cached for future reports. I did some tests on just a day's worth of logs and I think I may have found a good blend of qualities that I can use for client reporting but still have most of the 'at-a-glance' data that's important to me. For more depth and customization, Summary will fill most of my other needs.
A neat FastStats feature that I haven't seen mentioned above is Tree View. It's an interactive graphical display where, for any specific page, it shows referring pages and next pages visited, with percentages and special alerts on things like frequent exits.
DNS lookups have been running for several hours
Yes, the DNS lookups really slow FastStats down - and any other analyzer as well. FastStats uses a multi-threaded DNS lookup and it can be resolving up to 64 IP addresses at one time. But it's still time intensive - after all, it's using the net.
Most of the time I don't need DNS information. Without the added time for DNS lookup, FastStats reports run mighty fast for me - averaging around 4 minutes to crunch 1 GB of standard Apache files.
In the dept. of boneheaded blunders: I highly recommend making sure the filters and settings are working BEFORE you run with DNS lookups on. I ran afoul of the AND vs OR settings mentioned earlier...and several hours later...I had nada.
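The approach described above — many lookups in flight at once, with results cached for future reports — can be sketched in a few lines of Python. The 64-worker figure just mirrors the number mentioned for FastStats; the injectable `lookup` parameter is my own addition so the function can be exercised without touching the network.

```python
import socket
from concurrent.futures import ThreadPoolExecutor

def _reverse_lookup(ip):
    """Reverse-resolve one IP, falling back to the raw address on failure."""
    try:
        return socket.gethostbyaddr(ip)[0]
    except OSError:
        return ip

def resolve_all(ips, workers=64, cache=None, lookup=_reverse_lookup):
    """Resolve IPs concurrently, reusing a cache across report runs."""
    cache = {} if cache is None else cache
    todo = [ip for ip in set(ips) if ip not in cache]  # skip cached addresses
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map preserves input order, so results pair up with todo
        for ip, host in zip(todo, pool.map(lookup, todo)):
            cache[ip] = host
    return cache
```

Persist the cache between runs (pickle it, say) and the second report over the same log only pays for addresses it hasn't seen before — which is why that first multi-hour run shouldn't repeat.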
Perhaps I am just being stupid and haven't paid for the all-singing, all-dancing version, or just don't know how to configure it properly to get basic ROI info? :)
P.S. I am not a programmer.
Cookies are better, but still not perfect (nothing is perfect in web statistics :)). If you target school kids and teenagers, cookies are not very precise, as they move around a lot. They access the web from home, from school (maybe even from a notebook), from after-school activities, and from several friends' computers. So one person can activate multiple cookies, and one cookie can be used by many friends (on the same computer). For this audience, cookies are not very precise.
The most precise method (but still not 100%) is a login. If you count the number of password-protected logins you will have a better count, but you still cannot expect it to be 100% correct. Let's say it's a bank: I know of several families where husband and wife share the same online bank account, and there is no way to know (from log files) whether it's the husband or the wife who has logged in.
Anyone tried it? Any major limitations (apart from the lack of a built-in DNS system)?
Cookies - not forgetting those people who deny all cookies (or select which cookies to store)
Problem with logging in is that you'll be excluding search engine spiders from the site, and hence you'll lose traffic from search engines...
Remember: "Statistics are like bikinis. What they reveal is suggestive, but what they conceal is vital" - Aaron Levenstein (Nature Genetics 24:11, January 2000)
neg:
- requires PHP programming skills to configure
With some programming around phpOpentracker, we could track the referrer of a visitor entering the site at any page, compare the referrer string against a referrer group (e.g. Google, Overture, or banner xy), and add the corresponding referrer-group ID to the session ID that phpOpentracker assigns to every visitor as a standard procedure. Once the visitor completes a sale or lead, the session ID with the referrer ID is tracked from the response page. With that you can work out the conversion rate for different ads, comparing e.g. the conversion rate of visitors coming from Google to the ones coming from Overture.
The only problem is that we sometimes lose leads/sales in the tracking process, e.g. if the session ID is cut off.
At the moment we are thinking about changing to cookies instead of session IDs. So what's your opinion on the reliability of cookies?
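The referrer-group idea described above is simple enough to sketch independently of phpOpentracker itself (which is PHP; this is a language-neutral Python illustration). The group names and match patterns here are purely illustrative — you'd substitute your own campaigns and banner IDs.

```python
# Illustrative referrer groups: substring -> group name. Real campaigns
# would match on full hostnames or tagged landing-page parameters.
REFERRER_GROUPS = {
    "google": "google.",
    "overture": "overture.",
}

def referrer_group(referrer):
    """Classify a referrer URL into a named group (default: 'other')."""
    for group, needle in REFERRER_GROUPS.items():
        if needle in referrer:
            return group
    return "other"

def conversion_rates(visits):
    """visits: iterable of (referrer, converted) pairs, one per session.
    Returns conversion rate per referrer group."""
    totals, sales = {}, {}
    for referrer, converted in visits:
        g = referrer_group(referrer)
        totals[g] = totals.get(g, 0) + 1
        sales[g] = sales.get(g, 0) + (1 if converted else 0)
    return {g: sales[g] / totals[g] for g in totals}
```

In practice the group ID would be stamped into the session (or cookie) at entry and read back on the order-confirmation page, so a dropped session ID loses the attribution — exactly the cut-off problem mentioned above.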
[grubbybaby.com...] is the standard mod that writes the logs to a MySQL database
[digitalstratum.com...] is a similar mod, but writes to a PostgreSQL database instead, for those who prefer that flavour.
I'd still like to know if anyone has direct experience of these types of solutions...
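I haven't run those mods either, but the payoff is easy to demonstrate: once hits land in a database table, every report is ordinary SQL rather than a log-parsing pass. A sketch, using Python's built-in SQLite as a stand-in for the MySQL/PostgreSQL backend — the table layout is my own guess at a minimal access-log schema, so check the mod's real column names before reusing the query.

```python
import sqlite3

# Assumed minimal schema; the actual mods log more columns.
conn = sqlite3.connect(":memory:")  # stand-in for the MySQL/PostgreSQL backend
conn.execute(
    "CREATE TABLE access_log ("
    "remote_host TEXT, request_uri TEXT, status INTEGER, referer TEXT)"
)
conn.executemany(
    "INSERT INTO access_log VALUES (?, ?, ?, ?)",
    [
        ("1.2.3.4", "/index.html", 200, "http://www.google.com/"),
        ("5.6.7.8", "/index.html", 200, "-"),
        ("5.6.7.8", "/contact.html", 404, "-"),
    ],
)
# Top successfully served pages -- plain SQL, no log parser needed.
top_pages = conn.execute(
    "SELECT request_uri, COUNT(*) AS hits FROM access_log "
    "WHERE status = 200 GROUP BY request_uri ORDER BY hits DESC"
).fetchall()
```

The same pattern covers most of the reports discussed in this thread (referrers per page, hits per engine) as one-line `GROUP BY` queries, at the cost of database load on every request — which is presumably the trade-off anyone running these mods could speak to.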
We currently use Sawmill - but it crashes our server when we compile the stats at midnight each night. The answer of course might be a more powerful server, but the costs are quite high for us.
Your thoughts on some very good multiple-site/single-server tracking software would be great.
Thanks
M
It's affordable ($495)
It's flexible
It doesn't give you a bunch of stuff you don't need.
It was actually written by the guy who wrote analog?
My clients love it...mainly because when I show them the results, they understand them.
--cyn