Website Analytics - Tracking and Logging Forum

    
Analysing big logs!
How? What software? And how do you determine what gets logged in IIS?
alahamdan




msg:4224444
 7:17 am on Oct 31, 2010 (gmt 0)

Hello

I've always seen Tedster and other members asking questions like "Do you know when Googlebot last visited your website?"

That gives me the impression that analyzing website logs is very easy, but I've always wondered how it can be.

In my case, I have a website with about 90k unique visitors daily, on a Windows server.

Our daily log file is over 1 GB, so how do you analyze a file like that? Even with software, wouldn't it be a long, time-consuming process?

I'm using statcounter.com as my website statistics tool, since running a local analysis tool on the server would put a high CPU load on it while processing the logs.

Am I missing something? If I want to use the local logs just for security issues (tracking scans, attacks, etc.), which fields should I be logging in IIS, and where do I edit the logging settings?

Thanks

 

Mark_A




msg:4224453
 8:08 am on Oct 31, 2010 (gmt 0)

I used to use TextPad to open big log files and then use its tools to find whatever I wanted within the log.

TextPad can open and work with massive files, and you can do pretty much whatever you want with the data from there.

martinibuster




msg:4224457
 8:19 am on Oct 31, 2010 (gmt 0)

Mach5 FastStats [mach5.com] has been around a while, but it gets the job done for files up to 1.7 GB (with their $99 paid version). Their free version only works on up to 5,000 lines of a log file. Here is a sample report [mach5.com]. In the left-hand nav, click the Search Engine Spiders link in the Visitor Information category to see how they display search spider visits. The software installs on your desktop computer. It's an oldie but a goodie at a reasonable price.

alahamdan




msg:4224458
 8:20 am on Oct 31, 2010 (gmt 0)

Thanks for the reply, Mark.

I use Notepad++ too, but recently, for example, I found indications that someone was scanning our website with the "acunetix_wvs_security" scanner. The scanner was sending some values to a web form, which is how I discovered it.

If I want to go through the logs and find some common mark this scanner leaves, so I can block future scans, how can I do that with tools such as TextPad and Notepad++?

SteveWh




msg:4224576
 5:20 pm on Oct 31, 2010 (gmt 0)

You can use a command line utility called "grep" to pull out and list only lines from your text file that match a pattern you define, and then inspect them for similarities. At least that would filter out all the lines you're not interested in. You can redirect the grep output into a separate text file, which you can then inspect in a text editor. There are free versions of grep for Windows.
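For example (the file and pattern names here are just placeholders), something like grep -i acunetix ex101031.log > scanner_hits.txt would dump every matching line into a smaller file you can open in a normal editor. If you would rather have a script you can keep extending, here is a rough Python equivalent of the same filtering idea:

# Rough Python stand-in for grep: pull matching lines out of a big log file.
# The log name, output name and "acunetix" pattern are placeholders.
import re

pattern = re.compile(r"acunetix", re.IGNORECASE)

with open("ex101031.log", errors="ignore") as log, \
        open("scanner_hits.txt", "w") as out:
    for line in log:          # reads one line at a time, so 1 GB+ files are fine
        if pattern.search(line):
            out.write(line)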

For more advanced filtering and sorting, you could import your log data into a MySQL database, which won't have any trouble with 1GB or more of data.
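As a rough sketch of that route (everything here is an assumption: the connection details, the "weblogs" database, the table layout, and the field order, which must match the #Fields header line in your own IIS W3C log), a Python script using the mysql-connector-python package could look like this:

# Minimal sketch: load an IIS W3C extended log into MySQL for ad-hoc querying.
# Assumed field order: date, time, c-ip, cs-method, cs-uri-stem, sc-status,
# cs(User-Agent) -- check the #Fields line in your own log and adjust.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="loguser",
                               password="secret", database="weblogs")
cur = conn.cursor()
cur.execute("""CREATE TABLE IF NOT EXISTS hits (
                   logdate DATE, logtime TIME, client_ip VARCHAR(45),
                   method VARCHAR(10), uri VARCHAR(2048),
                   status SMALLINT, user_agent VARCHAR(512))""")

insert_sql = "INSERT INTO hits VALUES (%s, %s, %s, %s, %s, %s, %s)"
batch = []
with open("ex101031.log", errors="ignore") as log:
    for line in log:
        if line.startswith("#"):      # skip the #Software/#Date/#Fields header lines
            continue
        parts = line.split()
        if len(parts) < 7:
            continue
        batch.append(tuple(parts[:7]))
        if len(batch) >= 10000:       # insert in chunks so a 1 GB log never sits in memory
            cur.executemany(insert_sql, batch)
            batch = []

if batch:
    cur.executemany(insert_sql, batch)
conn.commit()
conn.close()

Once the lines are in a table, finding every request that matched the scanner, or every URL a given IP touched, is a single SELECT.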

cuiyj




msg:4228716
 5:27 am on Nov 10, 2010 (gmt 0)

Nihuo Web Log Analyzer is very fast and easy to use.

tangor




msg:4228728
 6:23 am on Nov 10, 2010 (gmt 0)

Don't forget that importing into MySQL or Access is another possibility. It makes it easy to write any custom report you want. (I've been using Access since 1997.)
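To give an idea of how short such a custom report can be: continuing the hypothetical "hits" table and "weblogs" database from the Python sketch earlier in the thread (names assumed, not from any particular setup), the top client IPs behind a given scanner signature is one query:

# Example "custom report": top client IPs whose user-agent matched a scanner
# signature. Assumes the hypothetical "hits" table from the earlier import sketch.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="loguser",
                               password="secret", database="weblogs")
cur = conn.cursor()
cur.execute("""SELECT client_ip, COUNT(*) AS requests
               FROM hits
               WHERE user_agent LIKE %s
               GROUP BY client_ip
               ORDER BY requests DESC
               LIMIT 20""", ("%acunetix%",))
for ip, total in cur.fetchall():
    print(ip, total)
conn.close()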
