| 5:28 pm on Aug 9, 2005 (gmt 0)|
You can't adjust the way that it saves the logs in IIS (other than selecting a different format and turning fields on and off).
How are you analyzing this information? You must be using something. You'll have better luck parsing it after it's been converted to a usable format.
| 5:17 am on Aug 10, 2005 (gmt 0)|
Thanks, Ben, for the reply. We are using both ClickTracks and WebTrends on this data. We have the querystring on only a few pages and never took it seriously. It was only recently, when we imported the logs into a SQL database and took a close look at the records, that we realized this issue. But if this is how it's done, we can live with it.
| 11:34 am on Aug 10, 2005 (gmt 0)|
I use ClickTracks as well, and I have rewritten URLs that cause havoc. Here's a trick that works, though I haven't implemented it 100% (time constraints):
Microsoft has a free product called LogParser. It is exceptionally fast at reading and rewriting log files; it can process a 100 MB log file in less than 30 seconds. You can parse out the URLs you want combined, rewrite them with regular expressions, or run SQL-like commands against the whole log.
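To illustrate the "SQL-like commands against the whole log" idea, here is a minimal Python sketch of the same kind of query (roughly SELECT cs-uri-stem, COUNT(*) ... GROUP BY cs-uri-stem). This is not LogParser itself; the sample log and its field layout are assumed, though real IIS W3C logs do declare their columns in a #Fields: directive:

```python
from collections import Counter

# Hypothetical IIS W3C extended log sample; real logs declare their
# column layout in a "#Fields:" directive line like the one below.
LOG = """\
#Fields: date time cs-method cs-uri-stem cs-uri-query sc-status
2005-08-09 17:28:01 GET /products.asp id=12 200
2005-08-09 17:28:05 GET /products.asp id=99 200
2005-08-09 17:28:09 GET /index.asp - 200
"""

def hits_per_stem(log_text):
    """Rough equivalent of: SELECT cs-uri-stem, COUNT(*) GROUP BY cs-uri-stem."""
    fields = []
    counts = Counter()
    for line in log_text.splitlines():
        if line.startswith("#Fields:"):
            fields = line.split()[1:]   # remember the column names
            continue
        if line.startswith("#") or not line.strip():
            continue                    # skip other directives and blanks
        row = dict(zip(fields, line.split()))
        counts[row["cs-uri-stem"]] += 1
    return counts

print(hits_per_stem(LOG))
```

LogParser does the same kind of grouping in one query and far faster, but the sketch shows what it is doing under the hood: split each line on the declared fields, then aggregate.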
If you're using ClickTracks, you could pre-process your logs to rewrite your URLs any way you'd like, or strip out particular users, whatever you need. Then analyze the resulting logs in ClickTracks and take advantage of its lovely user interface.
| 11:57 am on Aug 10, 2005 (gmt 0)|
Ben, thanks for that. I had actually downloaded LogParser last week but never got around to using it. We did, however, use the LogParser API in another small application we are building to subset a log file. At first we were handling the pre-processing with SQL queries, and it took a very long time; LogParser did the whole thing in less than a minute. It's an amazing piece of software.
| 12:32 pm on Aug 10, 2005 (gmt 0)|
Free, fast, simple, and with inadequate documentation. It's like a Linux app.