

Reading IIS Logs into database

there must be an easier way?



6:19 pm on Sep 6, 2003 (gmt 0)

WebmasterWorld Senior Member suzyuk is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Hi All,

I'm trying to figure out how to do this, but is there a better way?

- Log files are deleted after 2 weeks, so I need a way to store them for analysis
- The web host puts the .log files in a "secure" folder only accessible by password
- Would it be easier to scrap the history and write my own stats?

Here's what I have so far, but I'm getting a script timeout, so I know it must be wrong.

Excuse this code, I'm really trying to figure it out :)

Option Explicit

Const ForReading = 1

Dim sBasePath, sFileName, fs, objLogFile
Dim bErrFlag, sErrDescription

' NB: FileSystemObject only understands local and UNC paths,
' so it cannot open an ftp:// URL directly; the logs would need
' copying onto the web server first
sBasePath = "ftp://[user]:[password]@[sitename].co.uk/Logs/"
sFileName = "[filename].log"
bErrFlag = 0

Set fs = CreateObject("Scripting.FileSystemObject")
On Error Resume Next
' OpenTextFile already returns a TextStream, so there is no need
' for a second OpenAsTextStream call on the result
Set objLogFile = fs.OpenTextFile(sBasePath & sFileName, ForReading)
If Err.Number <> 0 Then
    bErrFlag = 1
    sErrDescription = "Invalid file name or path. Please try again."
    Set fs = Nothing
End If
On Error Goto 0

If bErrFlag = 0 Then ' no file error
    Response.Write "is there a connection here?"

    ' clean up
    objLogFile.Close
    Set objLogFile = Nothing
    Set fs = Nothing
End If

Everything in [] is hard-coded at the minute.

I'd appreciate some guidance.



6:28 pm on Sep 6, 2003 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

Search for Microsoft Log Parser. Log Parser makes it easy to import your log files into SQL Server without all that scripting, and without the risk of script timeouts when your log files are large.
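For anyone without Log Parser to hand, the core of such an import (reading the `#Fields:` directive that heads a W3C extended log and loading the data rows into a table) can be sketched in a few lines. This is Python with SQLite purely for illustration, not Log Parser itself; the `iislog` table name and the helper function are my own invention:

```python
import sqlite3

def import_w3c_log(path, db_path="weblog.db"):
    """Load a W3C extended log (the IIS default format) into a SQLite table.

    Column names are taken from the log's own '#Fields:' directive.
    """
    fields, rows = [], []
    with open(path) as f:
        for line in f:
            line = line.rstrip("\n")
            if line.startswith("#Fields:"):
                fields = line.split()[1:]      # column names follow the directive
            elif line.startswith("#") or not line:
                continue                       # skip other directives and blanks
            elif fields:
                rows.append(line.split())      # data lines are space-separated
    cols = ", ".join('"%s"' % f for f in fields)
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS iislog (%s)" % cols)
    con.executemany(
        "INSERT INTO iislog VALUES (%s)" % ", ".join("?" * len(fields)),
        [r for r in rows if len(r) == len(fields)])
    con.commit()
    return con
```

Once the rows are in a table you can run ordinary SQL over them (top referrers, 404s by URL, and so on) instead of trawling the raw files.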


6:49 am on Sep 7, 2003 (gmt 0)

10+ Year Member

If you have problems with your script timing out, try this:

Server.ScriptTimeout = NumSeconds

If I remember right, the default value for this property is 90 seconds.



9:07 am on Sep 10, 2003 (gmt 0)

WebmasterWorld Senior Member suzyuk is a WebmasterWorld Top Contributor of All Time 10+ Year Member

Thanks Gary, I decided the Log Parser was way over my head (I'll leave it for another day, as I only have an NT server for testing/playing and it requires Windows 2000..)

However I managed to hack something together which got me the information I required ;)

"hack" being the operative word ;) I'm still very new to this ASP lark!



3:45 pm on Sep 14, 2003 (gmt 0)

10+ Year Member

If you want to import the log files into SQL Server you could always put together a DTS package to do it for you.

Even if you don't want to import the log files but just copy them to your local machine so that they're safe from deletion, you can use the File Transfer Protocol task to set up a scheduled copy.
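The scheduled-copy idea doesn't have to be DTS, either. As a rough illustration only, here is the same thing sketched with Python's standard ftplib module, assuming IIS's default exYYMMDD.log filename scheme; the host, folder, and function names are made up for the example:

```python
from datetime import datetime, timedelta
from ftplib import FTP

def log_date(name):
    """Date encoded in an IIS log filename such as 'ex030906.log' (exYYMMDD)."""
    return datetime.strptime(name[2:8], "%y%m%d")

def fetch_recent_logs(host, user, password, remote_dir="Logs", days=14):
    """Download every .log file newer than `days` old from the host's log
    folder; meant to run from a daily scheduled task, so copies are safely
    local before the host's two-week deletion window closes."""
    cutoff = datetime.now() - timedelta(days=days)
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.cwd(remote_dir)
        for name in ftp.nlst():
            if name.endswith(".log") and log_date(name) >= cutoff:
                with open(name, "wb") as out:
                    ftp.retrbinary("RETR " + name, out.write)
```

Run daily (Task Scheduler, cron, whatever is to hand), this keeps a rolling local archive regardless of what the host deletes.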


8:31 am on Sep 17, 2003 (gmt 0)

10+ Year Member

Suzy, just an alternative suggestion here that might be worth considering: rather than relying on the log files themselves, why not take direct control and create the log files yourself, particularly as you're using ASP anyway?

A technique I've used myself goes like this ..

(1) Each page to be logged looks at the ServerVariables collection to retrieve the required server variable values (typically you'd be interested in HTTP_REFERER, HTTP_USER_AGENT, REMOTE_ADDR, REMOTE_HOST, SERVER_SOFTWARE etc);
(2) Append these values to your own XML file (with XML tags that map onto the required server variables);
(3) Have your own ASP admin-type page to read this XML file (i.e. the equivalent of trawling through the original log files). Here you can use XSL to format the output how you want it, so you could have a filter that parses the XML file and only shows records for a specific HTTP_USER_AGENT or REMOTE_ADDR etc.

I just see this sort of technique as giving you more control over the details being logged, with XSL transforms on your ASP admin page giving you much greater control over how you actually interrogate your own customised XML log files.
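As a rough sketch of steps (1) to (3) above, here is what appending one hit record to an XML log might look like. This is Python's ElementTree rather than the ASP/MSXML code you'd actually write, and the `<hit>` element layout is just one possible mapping of server variables to tags:

```python
import xml.etree.ElementTree as ET

LOGGED_VARS = ("HTTP_REFERER", "HTTP_USER_AGENT", "REMOTE_ADDR")

def append_hit(log_root, server_vars):
    """Append one <hit> record, with a child element per server variable."""
    hit = ET.SubElement(log_root, "hit")
    for name in LOGGED_VARS:
        ET.SubElement(hit, name.lower()).text = server_vars.get(name, "")
    return hit

# building a tiny log with a single record
root = ET.Element("log")
append_hit(root, {"HTTP_REFERER": "http://example.com/",
                  "HTTP_USER_AGENT": "Mozilla/4.0",
                  "REMOTE_ADDR": "10.0.0.1"})
```

The admin page would then load the file and filter or transform the `<hit>` elements however it likes, which is where the XSL comes in.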

Worth a thought anyway ....

