| 4:47 pm on May 16, 2006 (gmt 0)|
The tracking is appalling, but the number of pageviews should (in theory) be OK. However...
It is entirely possible that log files underestimate page views - especially with static pages. This is due to caching at various stages of the Internet. With a static page, when a person "refreshes" it, the browser may not need to re-request the page from the origin server, because it is cached on the user's machine... then on the company's proxy... then at the ISP level. Any of these could lower the page count.
The systems you are looking to buy, of course, track page views differently... :(
| 4:56 pm on May 16, 2006 (gmt 0)|
ThePhoenix... I like to have a few reports at hand and compare them against each other. You may want to try installing Webalizer.
It is free and pretty plain, with no fancy GUIs or anything like that, but it really does give you the meat and potatoes of what you are looking for at a glance.
I sometimes compare raw log files to my reports, and Urchin seems to be off, although it has some traps to discount bots, whereas Webalizer doesn't.
Just a suggestion
[edited by: Receptional at 5:12 pm (utc) on May 16, 2006]
[edit reason] no url drops please [/edit]
| 6:15 pm on May 16, 2006 (gmt 0)|
|Add about 20% for the missed cached pages |
Unless you've correctly instructed clients not to cache pages. IME, caching in violation of protocol has gone way down with the widespread implementation of HTTP/1.1.
For caching servers, anyway. Clients still flout the protocol left and right.
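For what it's worth, "instructing clients not to cache" is done with HTTP response headers. A minimal sketch of one way to do it, assuming an Apache server with mod_headers enabled (the file pattern is just an example - adjust it to your site):

```apache
# Ask browsers and proxies to revalidate HTML with the origin server
# before reusing a cached copy, so page views show up in the logs.
# Note: "no-cache" permits storing the response; it only forces revalidation.
<IfModule mod_headers.c>
    <FilesMatch "\.(html|htm)$">
        Header set Cache-Control "no-cache, must-revalidate"
    </FilesMatch>
</IfModule>
```

Leaving images and stylesheets cacheable keeps the bandwidth savings while making the page requests themselves loggable.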
| 6:41 pm on May 16, 2006 (gmt 0)|
Thanks for all of the input. We do not have time to implement a solution just to determine the number of page views. I am just trying to determine how accurate the current reports from the log files and WebTrends could be.
| 8:07 pm on May 16, 2006 (gmt 0)|
Read the analog [analog.cx] documentation section on "how the web works." It will explain what is in a log and how it might differ from the number of pages that are served out of a proxy and/or browser cache.
| 2:56 pm on May 18, 2006 (gmt 0)|
Thanks for the input. I am just trying to get an estimate of our number of page views, as that is how the ASP vendors charge for their services.
I am currently showing x page views per month. I am concerned that the activity from spiders/robots is being counted in that total. Does anyone know whether WebTrends counts spider/robot activity in the page view total?
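I can't speak for how WebTrends handles bots, but you can get a rough feel for the bot share yourself from the raw logs. A minimal sketch, assuming combined log format (user agent in the last quoted field) and an illustrative, far-from-exhaustive list of bot markers:

```python
import re

# Substrings that identify some well-known crawlers (illustrative only).
BOT_MARKERS = ("Googlebot", "Slurp", "msnbot", "bot", "spider", "crawler")

# In combined log format, the user agent is the last quoted field on the line.
UA_RE = re.compile(r'"([^"]*)"\s*$')

def is_bot(line: str) -> bool:
    """True if the log line's user-agent field matches a known bot marker."""
    m = UA_RE.search(line)
    if not m:
        return False
    ua = m.group(1).lower()
    return any(marker.lower() in ua for marker in BOT_MARKERS)

def split_counts(lines):
    """Count (bot, human) requests in an iterable of combined-format log lines."""
    bots = humans = 0
    for line in lines:
        if is_bot(line):
            bots += 1
        else:
            humans += 1
    return bots, humans
```

Run that over a day's log and compare the human count against what WebTrends reports - the gap tells you whether bots are being included.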
| 5:00 pm on May 18, 2006 (gmt 0)|
|I am just trying to get an estimate of our number of page views |
Take the HTTP page requests (from the log files) and quote that. You can at least say you honestly reported the best and most accurate information you have available.
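To pull a "page requests" number out of the logs, you have to decide what counts as a page. A minimal sketch that counts successful GETs of page-like URLs and skips images, stylesheets, and errors - the extension list is an assumption, so adjust it to your site's URL scheme:

```python
import re

# URL endings treated as "pages" (an assumption -- tailor to your site).
PAGE_EXTENSIONS = (".html", ".htm", ".shtml", ".php", ".asp")

# Matches the request line and status code in common/combined log format,
# e.g. "GET /index.html HTTP/1.1" 200
REQUEST_RE = re.compile(r'"GET ([^ "]+)[^"]*" (\d{3})')

def is_page_view(line: str) -> bool:
    """True for a successful (2xx) GET of a page-like URL."""
    m = REQUEST_RE.search(line)
    if not m:
        return False
    path, status = m.group(1), m.group(2)
    if not status.startswith("2"):
        return False
    path = path.split("?", 1)[0]  # ignore query strings
    return path.endswith("/") or path.lower().endswith(PAGE_EXTENSIONS)

def count_page_views(lines) -> int:
    """Total page views in an iterable of log lines."""
    return sum(1 for line in lines if is_page_view(line))
```

That gives you a defensible, reproducible number to quote to the ASP vendors, even if it still includes bot traffic.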