Forum Moderators: DixonJones
I'm more concerned with month to month changes in analyzing web server logs, yet I want some decent absolute accuracy too.
Here's my current strategy to try for some accuracy:
(My analysis software lets me filter out "agents", "clients", "URLs or filenames" and "referrers").
1. I don't want visits by my client's staff to their own site to show in results, so I do filter them out as [exclude client = myclientsdomain.com].
2. But I do want visitors who are referred from another page of the client's site to count, so I don't filter by [exclude referrer = myclientsdomain.com].
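The two filters above can be sketched as follows. This is a minimal illustration, assuming Apache "combined" format log lines; the field layout and the staff gateway hostname are assumptions, not details from the thread:

```python
# Sketch of the two filters above, assuming Apache "combined" log format;
# the staff gateway hostname is a made-up example.

STAFF_HOSTS = {"gateway.myclientsdomain.com"}  # hosts the client's staff browse from

def is_counted(log_line: str) -> bool:
    """Return True if the request should appear in the traffic report."""
    remote_host = log_line.split()[0]  # first field: remote host or IP
    # 1. Exclude requests made by the client's own staff.
    if remote_host in STAFF_HOSTS:
        return False
    # 2. Deliberately no referrer filter: requests referred from another
    #    page of myclientsdomain.com still count.
    return True

staff_hit = ('gateway.myclientsdomain.com - - [10/Oct/2023:13:55:36 +0000] '
             '"GET /page.html HTTP/1.0" 200 2326 '
             '"http://myclientsdomain.com/index.html" "Mozilla/5.0"')
print(is_counted(staff_hit))  # False - staff request is filtered out
```

A real analyser would also match the gateway's IP address, not just its hostname, as noted further down the thread.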
Does this make sense?
Or am I inflating the traffic?
Do "typically reported" traffic stats include requests inside a site? Or only requests for the entry page?
This makes a pretty big difference in the visit numbers.
But,
>>so I do filter them out as [exclude client = myclientsdomain.com].
The domain you use for this filter must be the domain of the host your client's staff use to access the Internet, e.g. a firewall/proxy: gateway.myclientsdomain.com (make sure to filter both the hostname AND the corresponding IP address).
>> so I don't filter by [exclude referrer = myclientsdomain.com].
Exactly. This filter is for excluding requests from specific (referring) URLs. If you excluded myclientsdomain.com, it would also exclude requests from one page to another within the same domain.
>> Do "typically reported" traffic stats include requests inside a site?
Yes.
>> This makes a pretty big difference in the visit numbers.
Actually no. The visit numbers (read: number of user sessions) are normally determined by users' IP addresses. It does not matter whether a user makes 1 or 50 requests during a session - it is still one visit.
It does, however, affect the number of page views.
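A quick sketch of the distinction: visits counted per unique IP, page views per request. This follows the simplified IP-based session definition above; a real analyser would also apply an inactivity timeout, and the sample requests are made up:

```python
# Sketch: visits = unique IPs, page views = total requests.
# Simplified IP-based sessions (no inactivity timeout); sample data is made up.

requests = [
    ("10.0.0.1", "/"), ("10.0.0.1", "/about.html"), ("10.0.0.1", "/contact.html"),
    ("10.0.0.2", "/"),
]

page_views = len(requests)               # every request counts
visits = len({ip for ip, _ in requests})  # one visit per distinct IP
print(page_views, visits)  # 4 2
```

So internal requests inflate the page-view count, but not the visit count.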
/Hannu
OTOH, log schedules can also inflate the number of visitors, e.g. a visit from 23:55 to 00:05 counts as 2 visits (the log analyser splits sessions at midnight).