For those with 'home grown' solutions

How do you normalize?

DJOpie

8:31 pm on Feb 13, 2004 (gmt 0)

10+ Year Member



For those of you who have developed home-grown metrics solutions, I would love to hear some feedback. I have seen several threads that discuss the pluses and minuses of doing it, but how do you actually go about it?

We are developing an ASP.NET application that has approximately 500k-600k visitors a month. We are going to tie together web demographic information with user profile and transaction information.

We have looked at implementing some products to help, such as BrowserHawk. So, on to the questions:

- How do you normalize your data (or don't you)?
- Do you use OLAP data cubes?
- What do you use to display your metrics? A web page? Excel?

Thanks in advance..
Opie

DJOpie

4:09 am on Feb 17, 2004 (gmt 0)

10+ Year Member



Bump.. Sorry ;)

cfx211

6:55 pm on Feb 18, 2004 (gmt 0)

10+ Year Member



3 tables: 1 for the user cookie, 1 for the session cookie, and 1 that logs requests.

There is only 1 row in the user cookie table per cookie issued.

There is 1 row in the session cookie table per session cookie issued. This table has a foreign key back to the user cookie table. One user cookie can have multiple session cookies.

Every request or custom-coded transaction gets a row in the log table. One session can generate boatloads of requests. The log table stores the user and session cookie values on every request.

Accounts can be associated with multiple user cookies, and a user cookie can change its account value if someone new logs in from that computer. There is no way to normalize that relationship; you just have to overwrite the account_id on the user cookie table whenever it changes. I also tend to build a lot of account-to-cookie xref tables.
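Roughly, the layout looks something like this in T-SQL. The table and column names here are illustrative only, not an exact schema:

    CREATE TABLE user_cookie (
        user_cookie_id INT IDENTITY PRIMARY KEY,
        cookie_value   VARCHAR(64) NOT NULL,
        account_id     INT NULL,  -- overwritten whenever a different account logs in from that machine
        first_seen     DATETIME NOT NULL DEFAULT GETDATE()
    )

    CREATE TABLE session_cookie (
        session_cookie_id INT IDENTITY PRIMARY KEY,
        user_cookie_id    INT NOT NULL REFERENCES user_cookie (user_cookie_id),
        cookie_value      VARCHAR(64) NOT NULL,
        started_at        DATETIME NOT NULL DEFAULT GETDATE()
    )

    CREATE TABLE request_log (
        request_id           BIGINT IDENTITY PRIMARY KEY,
        user_cookie_value    VARCHAR(64) NOT NULL,  -- both cookie values are denormalized
        session_cookie_value VARCHAR(64) NOT NULL,  -- onto every single request
        url                  VARCHAR(500) NOT NULL,
        logged_at            DATETIME NOT NULL DEFAULT GETDATE()
    )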

Data gets presented in Excel, depending on what is asked for. Everything comes from SQL queries. I think OLAP is a waste of time unless you are in a huge organization with a bunch of business people who actually know what to request.
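For example, a typical Excel pull might be sessions per day, using the sketch tables above:

    SELECT CONVERT(VARCHAR(10), started_at, 120) AS session_day,
           COUNT(*) AS sessions
    FROM   session_cookie
    GROUP  BY CONVERT(VARCHAR(10), started_at, 120)
    ORDER  BY session_day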

Ben_Graham

1:56 am on Feb 24, 2004 (gmt 0)

10+ Year Member



I've had to develop our own in-house web analysis tool because our site uses URL rewriting to make dynamic pages appear static. Our site is all custom-coded in ASP.NET.

Our traffic is too high to log to the database directly, so I have a series of hacked-together batch jobs that import the web logs into a SQL Server DB nightly for analysis. There I have a primary table for visits with a child table of visititems. I have several other tables (pages, orders, etc.) to normalize the data.
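The import itself is nothing fancy; roughly something like this, where the path and staging table are placeholders, and the column list has to match the #Fields line in your own logs:

    -- staging table: one column per W3C log field of interest
    CREATE TABLE weblog_staging (
        log_date      VARCHAR(10),
        log_time      VARCHAR(8),
        c_ip          VARCHAR(15),
        cs_method     VARCHAR(10),
        cs_uri_stem   VARCHAR(500),
        cs_uri_query  VARCHAR(500),
        sc_status     VARCHAR(3),
        cs_user_agent VARCHAR(500)
    )

    -- load last night's IIS log (FIRSTROW skips the # header lines)
    BULK INSERT weblog_staging
    FROM 'D:\logs\ex040223.log'
    WITH (
        FIELDTERMINATOR = ' ',
        ROWTERMINATOR   = '\n',
        FIRSTROW        = 5
    )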

Generally, I use the ASP session ID to identify uniques. The real agony is in trying to work out what is a unique, what are spiders, etc. There are several cursors and fairly complex stored procedures required to make this happen.
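One of the simpler passes just flags visits that match the known-spider entries; the names are illustrative again:

    -- mark visits whose IP or user agent matches a known spider
    UPDATE visits
    SET    is_spider = 1
    WHERE  EXISTS (
        SELECT 1
        FROM   visitors s
        WHERE  s.is_spider = 1
          AND (s.ip = visits.ip
               OR visits.user_agent LIKE s.agent_pattern)  -- e.g. '%Googlebot%'
    )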

Before I implemented our system, I did a day-by-day comparison of my uniques against WebTrends and a couple of other tools. Eventually I found a very high correlation between my visits and WebTrends' unique visitors.

If you use other cookies, you can link those out to a Visitors table (I use Visitors to identify our internal IPs, known spiders by IP and user agent, etc.). The really good thing about an internal system is its flexibility. Want to know the most common search term used to find your site by customers who bought product A? Or the number of page views for customers who placed orders for more than $X? No off-the-shelf product can do this for you.
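To make that concrete, the product-A question becomes a simple join once everything lives in one database (schema simplified, product ID made up):

    -- top search terms for visitors who bought product 12345
    SELECT TOP 10
           v.search_term,
           COUNT(*) AS hit_count
    FROM   visits v
           JOIN orders o      ON o.visit_id  = v.visit_id
           JOIN orderitems oi ON oi.order_id = o.order_id
    WHERE  oi.product_id = 12345
      AND  v.search_term IS NOT NULL
    GROUP  BY v.search_term
    ORDER  BY COUNT(*) DESC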

I have a few standard queries, but most of what we do is ad hoc. I have the luxury of working in an environment where the decision makers can run Query Analyzer and write their own queries. If I had the luxury of time, though, I would likely use ASP.NET to pull the data back out and put it into a web form, although Excel can chart directly from queries (something I have not yet experimented with).