All of these vendors charge based on page views. I am trying to determine how many page views we could possibly have.
How accurate is page-view tracking with log files and WebTrends?
It is entirely possible that log files underestimate page views, especially with static pages. This is due to caching at various stages of the Internet. With a static page, when a person "refreshes" it, the browser may not need to re-request the page from the origin server at all, because it is cached on the user's machine... then on the company's proxy... then at the ISP level. Any of these caches can lower the page count.
The systems you are looking to buy, of course, track page views differently... :(
It is free and pretty plain, with no fancy GUIs or anything like that, but it really does give you the meat and potatoes of what you are looking for, at a glance.
I sometimes try to compare raw log files to my reports, and Urchin seems to be off, although it has some traps to discount bots, whereas webalizer doesn't.
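If you want to do that comparison yourself, here is a minimal sketch of counting page views from a raw access log while discounting obvious bots. The log line pattern assumes Apache's combined format, and the bot signature list is a small illustrative sample, not the full list tools like Urchin use:

```python
import re
from collections import Counter

# Illustrative bot user-agent substrings; real filter lists are much longer.
BOT_SIGNATURES = ("googlebot", "slurp", "bingbot", "spider", "crawler")

# Apache "combined" log format: the user agent is the last quoted field.
LINE_RE = re.compile(
    r'^(\S+) \S+ \S+ \[.*?\] "(?:GET|POST) (\S+) [^"]*" (\d{3}) \S+ "[^"]*" "([^"]*)"'
)

def count_page_views(lines):
    """Count successful HTML page views, split into bot and human traffic."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue  # malformed or unexpected line; skip it
        ip, path, status, agent = m.groups()
        # Only count successful requests for pages, not images/CSS/errors.
        if status != "200" or not (path.endswith(".html") or path.endswith("/")):
            continue
        kind = "bot" if any(sig in agent.lower() for sig in BOT_SIGNATURES) else "human"
        counts[kind] += 1
    return counts
```

Comparing `counts["human"]` against a report's page-view total gives a rough idea of how aggressively that report filters robot traffic.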
Just a suggestion
Add about 20% for the missed cached pages
Unless you've correctly instructed clients not to cache pages. IME, caching in violation of protocol has gone way down with the widespread implementation of HTTP/1.1.
For caching servers, anyway. Clients still flout the protocol left and right.
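For reference, "instructing clients not to cache" usually means sending response headers like these. This is a minimal sketch of the headers involved, assuming you control the server configuration; the exact mechanism (Apache directives, application code, etc.) varies:

```python
def no_cache_headers():
    """HTTP response headers that tell HTTP/1.1 clients and proxies to
    revalidate on every request, so each page view reaches the origin
    server's log instead of being served silently from a cache."""
    return {
        "Cache-Control": "no-cache, must-revalidate",
        "Pragma": "no-cache",  # honored by some legacy HTTP/1.0 caches
        "Expires": "0",        # extra belt-and-braces for old clients
    }
```

As the posts above note, well-behaved HTTP/1.1 caching servers respect `Cache-Control`; misbehaving clients may still ignore it.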
I am currently showing x number of page views per month. I am concerned that the activity from spiders/robots is being counted in that total. Does anyone know whether WebTrends counts spider/robot activity in the page-view total?