Forum Moderators: DixonJones
Perhaps this is very basic and has been discussed before, but I couldn't find it here after several searches.
I have several database-generated sites with several thousand pages each. At this point, visits from web bots are so numerous that I cannot get accurate pageview stats.
I want to create a simple JavaScript plus PHP or Perl counter that writes pageviews to a text file: only hits to a page (just the date and time of each hit), so I can query that file for near-accurate pageview numbers (as far as I understand, there is no such thing as a totally accurate pageview count). Something along the lines of the sketch below is what I have in mind.
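This is only a rough sketch, assuming a JavaScript-triggered beacon plus a PHP logger; the log path, the "p" parameter, and the bot list are placeholders I made up, not anything definitive:

<?php
// count.php -- rough sketch, not production code.
// Embed it from each page so that only clients executing
// JavaScript request it, e.g.:
//   <script>document.write('<img src="/count.php?p='
//       + encodeURIComponent(location.pathname)
//       + '" width="1" height="1">');</script>

// Belt and braces: also skip requests from known bots by
// user-agent substring (this list is illustrative, far from complete).
$bots = array('Googlebot', 'Mediapartners-Google', 'Slurp', 'msnbot');
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
foreach ($bots as $bot) {
    if (stripos($ua, $bot) !== false) {
        exit; // don't log bot hits
    }
}

// Append one line per hit: date/time, tab, page requested.
$page = isset($_GET['p']) ? $_GET['p'] : 'unknown';
$page = substr(str_replace(array("\r", "\n", "\t"), ' ', $page), 0, 200);
$line = date('Y-m-d H:i:s') . "\t" . $page . "\n";
// FILE_APPEND|LOCK_EX keeps concurrent hits from clobbering each other.
file_put_contents('/path/to/hits.txt', $line, FILE_APPEND | LOCK_EX);

// Return a 1x1 transparent GIF so the <img> request completes cleanly.
header('Content-Type: image/gif');
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');
?>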
Am I going in the wrong direction? Is there a log analyzer that can differentiate most web bots (including the AdSense crawler)? I use Funnellweb and I like it a lot, but its bot list is extremely small.
Any input will be helpful.
Enrique
Also, bots don't accept and return cookies, so if you can set a cookie for each unique visitor and then read the cookie data on a subsequent page view, you'll more accurately tell 'people' traffic from bot traffic that way, too.
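For instance, a rough sketch of the idea in PHP, where the cookie name, lifetime, and log path are just placeholders:

<?php
// Sketch: count a view as 'human' only after the visitor
// has returned a cookie set on an earlier page view.
if (isset($_COOKIE['visitor_id'])) {
    // The cookie came back, so this client accepts and returns
    // cookies -- almost certainly a real browser, not a bot.
    $line = date('Y-m-d H:i:s') . "\t" . $_SERVER['REQUEST_URI'] . "\n";
    file_put_contents('/path/to/human_hits.txt', $line, FILE_APPEND | LOCK_EX);
} else {
    // First view (or cookies disabled): set the cookie and wait.
    // This view is not counted until the cookie round-trips.
    setcookie('visitor_id', uniqid('', true), time() + 86400 * 365, '/');
}
?>

The trade-off is that a genuine visitor's first pageview always goes uncounted, and visitors who disable cookies are never counted at all.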
The cookie method is interesting, but which approach is better?
Some users choose to disable JavaScript, but others choose to disable cookies.
Which group is larger?
Also, do I have to post a privacy statement on my site if I place cookies on my visitors' machines?