Perhaps it's time to look for a more up-to-date piece of software?
... Webalizer uses the CLF (Common Log Format [apache.org]) among others -- so the data itself is coming right from your server logs, you can't be more accurate than that. The data itself is not out of date, so neither are the stats reported from it. If the presentation of the data is not satisfactory, that is a different story. That is why the raw logs are there. Also, Webalizer is open source, so you can modify the reports as you wish. If you aren't a programmer, Webalizer provides configuration files that allow some manipulation of the output.
the data itself is coming right from your server logs, you can't be more accurate than that.
The number of pageviews is 100% accurate, but aren't you interested in the breakdown between humans/bots/search engines?
I certainly am!
The data itself is not out of date, so neither are the stats reported from it.
1. I need to know the number of human visitors to my site.
2. I want to know whether bots are pulling significant numbers of pages from my site.
3. I need to know whether the search engines are spidering my site.
If Webalizer hasn't been updated for 4.5 years, it can't have an up-to-date list of bots, so item (2) is flawed; it won't have an accurate list of search engines, so (3) is flawed too; and that in turn makes (1) flawed, because the human count will be over-estimated.
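To make that concrete, here is a rough sketch of the three-way split items (1)-(3) ask for, done over a combined-format log with awk. The user-agent patterns and the sample log lines are my own assumptions, not anything Webalizer ships:

```shell
#!/bin/sh
# Sketch only: classify hits as human / other bot / search-engine spider
# by user-agent. The patterns below are assumed examples, not a real list.

# Three made-up combined-log lines standing in for an access log.
cat > /tmp/access.log <<'EOF'
203.0.113.9 - - [10/Oct/2005:13:55:36 +0000] "GET / HTTP/1.1" 200 2326 "-" "Mozilla/4.0 (compatible; MSIE 6.0)"
66.249.66.1 - - [10/Oct/2005:13:56:01 +0000] "GET /robots.txt HTTP/1.1" 200 68 "-" "Googlebot/2.1"
198.51.100.7 - - [10/Oct/2005:13:57:11 +0000] "GET /page HTTP/1.1" 200 512 "-" "SomeCrawler/1.0"
EOF

# In combined format the user-agent is the 6th quote-delimited field.
counts=$(awk -F'"' '
  $6 ~ /Googlebot|Slurp|msnbot/        { se++;    next }  # search-engine spiders
  $6 ~ /[Bb]ot|[Cc]rawler|[Ss]pider/   { bot++;   next }  # other bots
                                       { human++ }        # probably human
  END { printf "human=%d bot=%d se=%d", human, bot, se }
' /tmp/access.log)
echo "$counts"
```

Even this toy version shows the maintenance problem: every newly launched spider means editing the pattern lists by hand, which is exactly what an abandoned package never does for you.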
If the presentation of the data is not satisfactory, that is a different story.
IMHO this has nothing to do with presentation; it's purely about numbers. Webalizer may be good enough to calculate [human_pageviews] + [bot_pageviews] + [se_spider_pageviews] as an overall total, but that is so trivial I reckon I can do it faster with something like
cat www.example.com.log | wc -l
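For what it's worth, that one-liner counts every hit, bots included. Filtering bots out takes only one more step; the pattern list below is my own guess at a few names, and the log lines are made-up samples:

```shell
#!/bin/sh
# Hedged sketch: total hits vs. hits with bot user-agents removed.
# BOT_RE is an assumed, hand-maintained pattern list -- keeping it
# current is the maintenance problem discussed above.
BOT_RE='Googlebot|Slurp|msnbot|[Ss]pider|[Cc]rawler'

# Two made-up combined-format lines standing in for the real log.
cat > /tmp/www.example.com.log <<'EOF'
203.0.113.9 - - [10/Oct/2005:13:55:36 +0000] "GET / HTTP/1.1" 200 2326 "-" "Mozilla/4.0"
66.249.66.1 - - [10/Oct/2005:13:56:01 +0000] "GET / HTTP/1.1" 200 68 "-" "Googlebot/2.1"
EOF

all=$(wc -l < /tmp/www.example.com.log)               # every hit
humans=$(grep -Evc "$BOT_RE" /tmp/www.example.com.log) # hits not matching bot patterns
echo "all=$all humans=$humans"
```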
I have Webalizer and AWStats available to me ... I compared the results over a few months and AWStats seems to be more consistent and much more informative.
I didn't think Webalizer ever recognised the difference between people and bots
Five years ago nobody worried about bots. Well, perhaps one person did [webmasterworld.com] ;-)
In essence, this is my point. Look how fast software changes. Look at what happens in five years.
How can a five-year-old piece of software be a good fit for today's problems?