Forum Moderators: DixonJones


Tracking total unique IPs served?

         

AgmLauncher

9:24 pm on Nov 7, 2005 (gmt 0)

10+ Year Member



Is there a free tool or script that you can use to track the total number of unique first-time IP visits served? (one which doesn't include spider hits)

I've seen a lot that just record the daily unique visits and add them to a running total every day. This isn't helpful. I'd like to know how many different people have visited the site starting from the day we install the script. (obviously :P)

ronburk

6:19 am on Nov 8, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Is there a free tool or script that you can use to track the total number of unique first time IP visits served?

How about:

cat *.log | awk '{print $1}' | sort | uniq | wc -l

Since you didn't specify language, platform, or planet, I'll assume my favorite for all three will work just fine :-)
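If your old logs get rotated away, the same idea works incrementally: keep one sorted file of every IP ever seen and merge each day's log into it, so the total is distinct IPs since install rather than a sum of daily uniques. A sketch, assuming GNU sort and hypothetical filenames (`today.log`, `seen_ips.txt`):

```shell
# Fold today's client IPs into the running set of every IP seen so far.
# seen_ips.txt is the cumulative store; today.log is the current access log.
touch seen_ips.txt
awk '{print $1}' today.log | sort -u seen_ips.txt - -o seen_ips.txt

# The line count of the store is the total distinct IPs served to date.
wc -l < seen_ips.txt
```

Run it once per day (e.g. from cron after log rotation) and the count only ever grows as genuinely new addresses appear.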

(which doesn't include spider hits)

Ah, well now you have to decide what defines a spider. If you think that's "obvious", then you have some studying to do.
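As a crude first pass, one common approach is to drop requests whose user-agent matches known crawler strings before counting. A sketch, assuming combined-format logs; `access.log` and the pattern list are assumptions, and the list is deliberately incomplete:

```shell
# Count unique client IPs, skipping requests whose user-agent matches
# a (very incomplete) list of crawler patterns. This only catches bots
# that identify themselves honestly; a starting point, not a definition.
grep -viE 'googlebot|slurp|msnbot|crawler|spider' access.log \
  | awk '{print $1}' \
  | sort -u \
  | wc -l
```

Anything that fakes a browser user-agent sails straight through this, which is exactly the "what defines a spider" problem.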

I'd like to know how many different people have visited the site starting the day we install the script. (obviously :P)

And I would like a pony for Christmas, but we're both going to have to settle for somewhat less than what we want. For example, when my wife and I use different computers to access your website, you'll only see one IP address, because I run NAT on outgoing connections. Likewise, if your website is up for a couple of years (or you're a very high-traffic site), you can count on a great many different AOL users having visited you via the exact same IP address.

Try googling for "analog webworks" and see if that page makes it clear why counting unique IP addresses is not identical (and sometimes far from it) to counting real, breathing, unique people.

ogletree

8:17 am on Nov 8, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



That information is not that useful unless you really know a lot about bots and do some research. You can run 4 different apps and get 4 very different reports. No program is going to do what you want out of the box. Most programs offer a feature to filter the results. You need to filter out the bots one by one, or use wildcards in some apps. Then you have to go look at the hosts that hit your site and copy the ones that are obvious bots. Then go to the referrer section. I run a report that shows just the non-normal browsers' IPs and then filter those; some apps will let you filter specific referrers.

Then, when you're all done, you still have bogus information, because an AOL user has a different IP practically every time they click on something. Many security apps make it so the visitor leaves hardly any info in your stats, and some of these apps come standard on new computers as a value-added feature.

You always take stats with a grain of salt: the more traffic you have, the more inaccurate they are. You should only compare stat reports from the same program; if you run the same report every day, you can tell up and down trends. About the only thing I find useful in my apps is search terms. I can use that info to see where I rank and build new pages.
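The "go look at the hosts that hit your site" step can be started from the raw logs: tally the user-agent strings so the obvious bots jump out and can be added to a filter list by hand. A sketch, assuming combined-format logs (`access.log` is an assumed filename; the user-agent is the sixth `"`-delimited field):

```shell
# Tally user-agent strings in a combined-format access log, most common
# first, so obvious crawlers can be spotted and filtered by hand.
awk -F'"' '{print $6}' access.log | sort | uniq -c | sort -rn | head -20
```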

ldylion214

5:32 pm on Nov 8, 2005 (gmt 0)

10+ Year Member



I use StatCounter. It works fine for me. Nicci