We have a few sites on dedicated servers, so we have log files with info on what bots are visiting. But it's an ocean of data, and I need a good ship to get through it.
So here's my question: how do you keep an eye on your logs? Do you retrieve the files and have some software analyze them? Do you install analysis software on your server, which you log into? Does your host provide bot- or useragent-analysis services to you?
I know this is something I *need* to do, so now I'd like to hear opinions on *how*. Apologies if this isn't the right place to ask, but it seemed like either this forum or the robots.txt one was best.
Your suggestions?
There are commercial bot-tracking software packages available, but they don't meet my specialized needs very well.
Another good forum to get this type of info is the Tracking and Logging forum [webmasterworld.com], but the topic is appropriate here, too.
Otherwise, I do as said above: I download my log every day and run it through a small PHP script that gives me the basic information.
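For anyone wondering what such a script could look like, here is a minimal sketch, assuming an Apache combined-format access log and a simple tally of hits per user agent; the file path and the exact output are my assumptions, not the actual script mentioned above:

<?php
// Tally requests per user agent from an Apache combined-format access log.
// The path is hypothetical -- point it at your downloaded log file.
$logFile = '/var/log/apache2/access.log';
$counts = [];

$handle = fopen($logFile, 'r');
if ($handle === false) {
    die("Cannot open $logFile\n");
}

while (($line = fgets($handle)) !== false) {
    // In the combined format, the user agent is the last quoted field on the line.
    if (preg_match('/"([^"]*)"$/', rtrim($line), $m)) {
        $ua = $m[1];
        $counts[$ua] = ($counts[$ua] ?? 0) + 1;
    }
}
fclose($handle);

// Print the 20 most frequent user agents, busiest first.
arsort($counts);
foreach (array_slice($counts, 0, 20, true) as $ua => $hits) {
    printf("%6d  %s\n", $hits, $ua);
}

From there you can grep the output for known crawler strings (Googlebot, Slurp, msnbot, and so on) to separate the bots from ordinary browsers.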