I also plan to write a custom script to parse the log files (with several web projects spread across different servers, I have quite a few logs to check) so I can get the results in just a few clicks.
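A minimal sketch of what the core of such a parsing script might look like, assuming Apache-style combined-format logs (the inline sample lines and the /tmp path are made up for illustration; a real script would loop over each server's actual log files):

```shell
# Hypothetical sketch: tally hits per page from a combined-format access log.
# A tiny inline sample stands in for the real per-server log files.
cat > /tmp/sample_access.log <<'EOF'
1.2.3.4 - - [10/Oct/2004:13:55:36 -0700] "GET /widgets.html HTTP/1.1" 200 2326 "http://example.com/" "Mozilla/4.0"
1.2.3.4 - - [10/Oct/2004:13:55:40 -0700] "GET /widgets.html HTTP/1.1" 200 2326 "-" "Mozilla/4.0"
5.6.7.8 - - [10/Oct/2004:13:56:02 -0700] "GET /about.html HTTP/1.1" 200 1024 "-" "Googlebot/2.1"
EOF

# Field 7 of a combined-format line is the requested path;
# count each path, then sort most-requested first.
awk '{ print $7 }' /tmp/sample_access.log | sort | uniq -c | sort -rn
```

Point the same pipeline at each server's access_log and you have a one-command "top pages" report.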
Keyword data
To find keywords I need to emphasize more or less on the page.
To find keywords I need to bid on or add as negatives. To understand how people view your site. I'm also finding that the more links I have from a certain industry, the more visitors I get to related pages, which gives me an idea of whether I need to beef up links from different but related industries.
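One way to pull that keyword data straight out of the raw logs is to mine the referrer field for search-engine query strings. A rough sketch, assuming Google-style `q=` parameters (the sample lines below are made up; other engines use different parameter names):

```shell
# Hypothetical sketch: extract search phrases from referrer URLs in a log.
cat > /tmp/kw_access.log <<'EOF'
1.2.3.4 - - [10/Oct/2004:14:01:00 -0700] "GET /widgets.html HTTP/1.1" 200 2326 "http://www.google.com/search?q=blue+widgets" "Mozilla/4.0"
5.6.7.8 - - [10/Oct/2004:14:02:10 -0700] "GET /widgets.html HTTP/1.1" 200 2326 "http://www.google.com/search?q=blue+widgets&start=10" "Mozilla/4.0"
9.8.7.6 - - [10/Oct/2004:14:03:30 -0700] "GET /gears.html HTTP/1.1" 200 512 "http://www.google.com/search?q=cheap+gears" "Mozilla/4.0"
EOF

# Grab the q= parameter, drop the prefix, turn '+' back into spaces,
# then count how often each phrase sent a visitor.
grep -o 'q=[^&" ]*' /tmp/kw_access.log \
  | sed 's/^q=//; s/+/ /g' \
  | sort | uniq -c | sort -rn
```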
Referrer Data
I check who's referring to me, and why. Sometimes it's a forum member somewhere asking what the best widget is; it's a good idea to go in there and "lend a hand." Occasionally the referrer is someone linking to you with outdated or mistaken information, and it's a good idea to head over and get that fixed. Sometimes you find ideas for link opportunities.
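A quick way to see who's sending traffic is to strip the referrer field down to just its host. A sketch, again with made-up sample lines standing in for a real access_log:

```shell
# Hypothetical sketch: list top referring hosts from a combined-format log.
cat > /tmp/ref_access.log <<'EOF'
1.2.3.4 - - [10/Oct/2004:15:00:00 -0700] "GET / HTTP/1.1" 200 512 "http://forum.example.com/thread/42" "Mozilla/4.0"
5.6.7.8 - - [10/Oct/2004:15:01:00 -0700] "GET / HTTP/1.1" 200 512 "http://forum.example.com/thread/42" "Mozilla/4.0"
9.8.7.6 - - [10/Oct/2004:15:02:00 -0700] "GET / HTTP/1.1" 200 512 "http://blog.example.org/widgets" "Mozilla/4.0"
EOF

# Splitting on '"' makes field 4 the referrer; then cut the URL down
# to just the hostname and count.
awk -F'"' '{ print $4 }' /tmp/ref_access.log \
  | sed 's|^[a-z]*://||; s|/.*||' \
  | sort | uniq -c | sort -rn
```

The forum thread in the output above is exactly the kind of referrer worth visiting in person.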
Unique Visitor Data
This one is pretty obvious.
Top Landing Pages
Kind of related to the keyword research.
This is just some of the data I look at, and how I use it to further my goals.
Anyone else care to share?
How do all of you find the time to check a gig or more of log files DAILY (our site's 24-hour logs are close to a gig...)?
We don't have time, we MAKE time. ;-)
Besides, there are a lot of stats programs that make this task easy.
Bull did say something about bots, which is understandable, but I'm under the impression there are other reasons as well.
I personally have "tail -f /path/to/access_log" running whenever I want to know something about a site, like looking for a bot or something similar. It also helps you adjust links according to click-paths, etc.
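For the bot-watching case, piping that tail through a grep filter keeps the noise down. A sketch (the bot-name list is illustrative, and the live tail is shown commented out since `tail -f` never exits):

```shell
# Live version (path as in the post) - follows the log as it grows:
#   tail -f /path/to/access_log | grep -i -E 'googlebot|slurp|msnbot'

# The same filter applied to a finished log; sample lines stand in
# for real traffic.
cat > /tmp/bot_access.log <<'EOF'
5.6.7.8 - - [10/Oct/2004:16:00:00 -0700] "GET /robots.txt HTTP/1.1" 200 68 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"
1.2.3.4 - - [10/Oct/2004:16:01:00 -0700] "GET / HTTP/1.1" 200 512 "-" "Mozilla/4.0"
EOF

# -i ignores case, -E enables the alternation; only bot hits survive.
grep -i -E 'googlebot|slurp|msnbot' /tmp/bot_access.log
```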
But I do it just a few times, mostly when I have new sites where I wait for GB. :)