Forum Moderators: coopster
I'm tracking the number of times a page has been viewed by my visitors, using the following code:
<?php
$ip = $_SERVER["REMOTE_ADDR"];
$write_rec = 'Y';          // default: count this view
if ($ip == '66.249.66.242') {
    $write_rec = 'N';      // skip this known spider IP
}
if ($write_rec != 'N') {
    // ** Write record to database **
}
?>
The '66.249.66.242' IP is one of Google's spiders. Obviously this is a bad way of doing things because I would need the IP of every spider to successfully differentiate between a human visitor and a spider.
What's the best way of doing this?
if(!preg_match("/Googlebot/", $_SERVER['HTTP_USER_AGENT'])) {
# not Googlebot
} ;)
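Building on that reply, the single `Googlebot` check generalizes to a list of bot signatures matched against the user agent. A minimal sketch, assuming a hand-picked signature list (illustrative, not exhaustive) and a helper name `is_bot` of my own choosing:

```php
<?php
// Match the user agent against known bot signatures instead of
// hard-coding IPs. Extend the list as new bots show up in your logs.
function is_bot($user_agent)
{
    $bot_signatures = array('Googlebot', 'bingbot', 'Slurp', 'Baiduspider');
    foreach ($bot_signatures as $signature) {
        // stripos() = case-insensitive substring search
        if (stripos($user_agent, $signature) !== false) {
            return true;
        }
    }
    return false;
}

$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (!is_bot($ua)) {
    // ** Write record to database **
}
?>
```

Well-behaved spiders identify themselves in the user agent, so this catches them without maintaining IP lists; bots that fake a browser user agent will still slip through.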
It is for tracking but only for my reference. My site is built from a database, so for example I have the following in the database:
widget0001 ¦ widgetinformation ¦ view count
widget0002 ¦ widgetinformation ¦ view count
...
widget9323 ¦ widgetinformation ¦ view count
I have a stats package but I'd like this count just so that I know which is the most popular page when I'm reading my raw data.
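For a per-widget counter like that, the write can be a single atomic `UPDATE`. A self-contained sketch using an in-memory SQLite database via PDO; the table and column names (`widgets`, `widget_id`, `view_count`) are my assumptions, not the actual schema:

```php
<?php
// Demo schema mirroring the widget rows described above.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE widgets (
    widget_id  TEXT PRIMARY KEY,
    info       TEXT,
    view_count INTEGER NOT NULL DEFAULT 0
)');
$db->exec("INSERT INTO widgets (widget_id, info) VALUES ('widget0001', 'widgetinformation')");

// One atomic increment per qualifying (non-bot) page view.
$stmt = $db->prepare('UPDATE widgets SET view_count = view_count + 1 WHERE widget_id = ?');
$stmt->execute(array('widget0001'));

$count = $db->query("SELECT view_count FROM widgets WHERE widget_id = 'widget0001'")
            ->fetchColumn();
```

Doing `view_count = view_count + 1` in SQL avoids the read-modify-write race you'd get by fetching the count into PHP and writing it back.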
Insert a row for each hit, then see what needs to be filtered once you have enough base data.
use a table that stores
ip
pagename
user agent
then you can select counts from mysql and start excluding bots that you find in there, you could also maintain a botlist as well to load into your array for comparison and tighten up your pageview counting over time