
Analysing large log files!

How? With what software? And how do I decide what gets logged in IIS?

7:17 am on Oct 31, 2010 (gmt 0)

Full Member

5+ Year Member

joined:June 24, 2008
posts: 219
votes: 0


Hello

I always see Tedster and other members asking questions like "Do you know when Googlebot last visited your website?"

This gives me the impression that analysing website logs is very easy! I was always wondering how it could be that easy.

But in my case, I have a website with about 90k unique visitors daily, on a Windows server.

Our daily log file is over 1 GB, so how do you analyse a file that size? Even with software, wouldn't it be a long, time-consuming process?

I'm using statcounter dot com as our website statistics tool, since running a local tool on the server would cause a high CPU load while processing the logs.

Am I missing something? If I want to use the local logs just for security issues (tracking scans, attacks, etc.), what fields should I be logging in IIS, and where do I edit the logging settings?

Thanks
8:08 am on Oct 31, 2010 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Nov 15, 2001
posts:1436
votes: 0


I used to use TextPad to open big log files and then use its tools to find whatever I wanted within the log.

TextPad can open and work with massive files, and you can then do pretty much whatever you want with the data.
8:19 am on Oct 31, 2010 (gmt 0)

Moderator from US 

WebmasterWorld Administrator martinibuster is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Apr 13, 2002
posts:13973
votes: 123


Mach5 FastStats [mach5.com] has been around a while, but it gets the job done for files up to 1.7 GB (in their $99 paid version). Their free version only works on up to 5,000 lines of a log file. Here is a sample report [mach5.com]. In the left-hand nav, click the Search Engine Spiders link in the Visitor Information category to see how they display search spider visits. The software installs on your desktop computer. It's an oldie but a goodie at a reasonable price.
8:20 am on Oct 31, 2010 (gmt 0)

Full Member

5+ Year Member

joined:June 24, 2008
posts: 219
votes: 0


Thanks for the reply, Mark.

I use Notepad++ too, but, for example, I recently found indications that someone was scanning our website with the "acunetix_wvs_security Scanner". The scanner was sending some values to a web form, which is how I discovered it.

If I want to go through the logs to find any common marker this scanner uses, so I can block future scans, how can I do that with software such as TextPad and Notepad++?
5:20 pm on Oct 31, 2010 (gmt 0)

Preferred Member

5+ Year Member

joined:July 25, 2006
posts: 460
votes: 0


You can use a command line utility called "grep" to pull out and list only lines from your text file that match a pattern you define, and then inspect them for similarities. At least that would filter out all the lines you're not interested in. You can redirect the grep output into a separate text file, which you can then inspect in a text editor. There are free versions of grep for Windows.
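
For anyone without grep handy on Windows, here is a minimal Python sketch of the same filter-then-inspect idea. The file names and the "acunetix" pattern are just placeholders for illustration; point them at your own log and whatever marker you are hunting for.

import re

LOG_FILE = "ex101031.log"        # hypothetical IIS log file name
OUT_FILE = "scanner_hits.log"    # filtered lines end up here

# Pattern to match, e.g. a fragment of the scanner's user-agent string
pattern = re.compile(r"acunetix", re.IGNORECASE)

with open(LOG_FILE, "r", errors="replace") as src, open(OUT_FILE, "w") as dst:
    for line in src:              # one line at a time, so a 1 GB+ file is no problem
        if pattern.search(line):  # keep only the matching lines
            dst.write(line)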

For more advanced filtering and sorting, you could import your log data into a MySQL database, which won't have any trouble with 1GB or more of data.
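
As a rough sketch of that import step, here is some Python that assumes a standard IIS W3C log with a #Fields: directive. It uses Python's built-in sqlite3 as a stand-in for MySQL so the example is self-contained; the table and the queries translate to MySQL almost verbatim.

import sqlite3

LOG_FILE = "ex101031.log"        # hypothetical IIS log file name

conn = sqlite3.connect("weblog.db")
conn.execute("""CREATE TABLE IF NOT EXISTS hits
                (date TEXT, time TEXT, client_ip TEXT, method TEXT,
                 uri TEXT, status TEXT, user_agent TEXT)""")

fields = []
with open(LOG_FILE, "r", errors="replace") as src:
    for line in src:
        if line.startswith("#Fields:"):
            fields = line.split()[1:]     # IIS lists the column layout in this directive
            continue
        if line.startswith("#") or not line.strip():
            continue                      # skip other directives and blank lines
        row = dict(zip(fields, line.split()))
        conn.execute("INSERT INTO hits VALUES (?, ?, ?, ?, ?, ?, ?)",
                     (row.get("date"), row.get("time"), row.get("c-ip"),
                      row.get("cs-method"), row.get("cs-uri-stem"),
                      row.get("sc-status"), row.get("cs(User-Agent)")))
conn.commit()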
5:27 am on Nov 10, 2010 (gmt 0)

New User

10+ Year Member

joined:Feb 12, 2003
posts:26
votes: 0


Nihuo Web Log Analyzer is very fast and easy to use.
6:23 am on Nov 10, 2010 (gmt 0)

Senior Member from US 

WebmasterWorld Senior Member tangor is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month

joined:Nov 29, 2005
posts:6163
votes: 284


Don't forget that importing into MySQL or Access is another possibility. It's easy to write any custom reporting you want. (I've been using Access since 1997.)
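
Once the log data is in a database, a custom report really is just a query. A hedged example, assuming a table like the hypothetical "hits" table loaded in the earlier sketch: how many requests Googlebot made per day, which answers the "when did Googlebot visit?" question from the opening post.

import sqlite3

conn = sqlite3.connect("weblog.db")   # hypothetical database from the earlier sketch

# Custom report: number of Googlebot requests per day
query = """SELECT date, COUNT(*) AS requests
           FROM hits
           WHERE user_agent LIKE '%Googlebot%'
           GROUP BY date
           ORDER BY date"""

for day, requests in conn.execute(query):
    print(day, requests)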