I always see Tedster and other writers asking questions like "Do you know when Googlebot last visited your website?"
This gives me the impression that analyzing website logs is very easy, but I was always wondering how it can be that simple.
In my case, I have a website with about 90k unique visitors daily, hosted on a Windows server.
Our daily log file is 1 GB+, so how do you analyze a file that size? Even with software, wouldn't it be a long, time-consuming process?
I'm using statcounter dot com as my website statistics tool, since running a local tool on the server would cause a high CPU load while processing the logs.
Am I missing something? If I want to use the local logs only for security purposes (tracking scans, attacks, etc.), which fields should I be logging in IIS, and where do I edit the logging settings?
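For context on the file-size worry: a 1 GB+ W3C extended log doesn't have to fit in memory, since it can be scanned in one streaming pass, line by line. Here's a minimal sketch of the idea; the field names come from the standard IIS W3C format, but the suspicious-pattern heuristics are just illustrative assumptions, not a real security audit.

```python
# Minimal sketch: stream an IIS W3C extended log line by line, so a
# 1 GB+ file is processed without loading it all at once. The SUSPICIOUS
# patterns below are example heuristics only (an assumption for
# illustration), not a complete scan/attack detector.
from collections import Counter

SUSPICIOUS = ("cmd.exe", "..", "/etc/passwd", "xp_cmdshell")

def scan_iis_log(path):
    fields = []
    hits = Counter()  # count of flagged requests per client IP
    with open(path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            line = line.strip()
            if line.startswith("#Fields:"):
                # The W3C header row names the columns for the data lines
                fields = line.split()[1:]
                continue
            if not line or line.startswith("#"):
                continue  # skip other directives and blank lines
            row = dict(zip(fields, line.split()))
            status = row.get("sc-status", "")
            uri = row.get("cs-uri-stem", "") + row.get("cs-uri-query", "")
            # Flag 4xx/5xx responses and known-bad URL fragments
            if status.startswith(("4", "5")) or any(s in uri for s in SUSPICIOUS):
                hits[row.get("c-ip", "unknown")] += 1
    return hits
```

Because it reads one line at a time, CPU cost is spread out and memory stays flat regardless of file size, which is why a scheduled off-peak pass over even a large daily log is usually practical.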