Not all bots have "bot" in their user-agent.
I take the raw logs and write them into a DB table with a script that runs hourly. Because the logs reset daily, I key each record by IP + time + other unique info so the same hit never gets inserted twice.
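The dedup idea above can be sketched roughly like this: hash the fields that together identify a hit and use that as a unique key, so re-running the hourly import over the same log is harmless. The field names and the SHA-1 choice are my own assumptions, not from the post.

```python
import hashlib

def dedup_key(ip, timestamp, request_line):
    """Build a stable unique key from IP + time + request line
    (hypothetical field names). Reloading the same log produces
    the same keys, so duplicate inserts can be skipped."""
    raw = f"{ip}|{timestamp}|{request_line}"
    return hashlib.sha1(raw.encode()).hexdigest()

# Use this as a UNIQUE column and insert with e.g. INSERT OR IGNORE,
# so the hourly run never double-counts a hit.
key = dedup_key("203.0.113.9", "2024-01-15T10:03:22", "GET /index.html")
```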
On the way in I look up each entry against a robots IP lookup table; if it's a bot, the record goes to a second DB table.
(As I find new bots I add their full 4-segment IP.)
I look up first by the full 4-segment IP in the robots table. If bots come from a range I've banned and they change the last segment, I simply write the first 3 segments to the bot table, which the lookup also checks when no 4-segment match is found.
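A minimal sketch of that two-stage lookup, assuming the bot table holds both full IPs and 3-segment prefixes as plain dotted strings:

```python
def classify_ip(ip, bot_table):
    """Look up the full 4-segment IP first; if not found, fall back
    to the first 3 segments, which catches a banned range whose bots
    rotate the last octet. bot_table is a set of strings like
    '66.249.66.1' (full IP) and '198.51.100' (3-segment prefix)."""
    if ip in bot_table:
        return "bot"
    prefix3 = ".".join(ip.split(".")[:3])
    if prefix3 in bot_table:
        return "bot"
    return "user"

bots = {"66.249.66.1", "198.51.100"}   # one full IP, one range prefix
classify_ip("66.249.66.1", bots)       # → "bot"  (exact 4-segment match)
classify_ip("198.51.100.77", bots)     # → "bot"  (3-segment range match)
classify_ip("203.0.113.5", bots)       # → "user" (no match either way)
```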
Now that traffic is separated into users (or unknown bots not yet in my bot table) and known bots, it's much easier to get counts.
The script displays a continuous hourly count, but I can also query the users table on demand or daily, and if I've since flagged any entries that were really bots, it skips them.
(Sometimes you need to look at behaviour to tell it's a bot: it issues HEAD requests, which real users don't, or it scrapes a bunch of pages quickly, etc.)
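Those behavioural tells can be checked mechanically. A rough sketch, with an invented rate threshold (the post gives no specific numbers):

```python
from collections import defaultdict

def flag_suspects(hits, max_per_minute=30):
    """Flag IPs that behave like bots: they issue HEAD requests
    (real browsers almost never do) or fetch many pages within one
    minute. hits is a list of (ip, minute_bucket, method) tuples;
    the 30/minute threshold is illustrative, not from the post."""
    per_minute = defaultdict(int)
    suspects = set()
    for ip, minute, method in hits:
        if method == "HEAD":
            suspects.add(ip)
        per_minute[(ip, minute)] += 1
        if per_minute[(ip, minute)] > max_per_minute:
            suspects.add(ip)
    return suspects
```

IPs flagged this way can then be added back into the bot table so future counts skip them.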
Then you can run whatever queries you want against the table, which tends to grow large.
I can separate out by IP to build a "visit" trail for any IP, or for a chain of IPs in the case of AOL visits and the like.
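The trail idea is just a filtered, time-ordered slice of the hits table. A sketch, assuming rows come back as (ip, timestamp, url) tuples; the AOL case is handled by passing the whole chain of proxy IPs:

```python
def visit_trail(rows, ips):
    """Return the time-ordered request trail for one IP or a chain
    of IPs (e.g. AOL proxies that rotate per request). rows are
    (ip, timestamp, url) tuples as they'd come from the users table."""
    wanted = set(ips)
    trail = [r for r in rows if r[0] in wanted]
    return sorted(trail, key=lambda r: r[1])

rows = [
    ("10.0.0.1", "10:00", "/a"),
    ("10.0.0.2", "10:01", "/b"),   # same visitor, new proxy IP
    ("9.9.9.9",  "10:02", "/x"),
    ("10.0.0.1", "10:03", "/c"),
]
visit_trail(rows, ["10.0.0.1", "10.0.0.2"])  # → /a, /b, /c in order
```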
Other interesting analyses can be done.
I wouldn't put too much stock in webstats figures; they are not as smart as you can be if you analyze the logs yourself in detail.
I keep one log file per day so I can reload the DB if something gets corrupted.