Forum Moderators: DixonJones


What methods do you use to determine "unique" clicks/views


moheybee

2:30 am on Dec 13, 2004 (gmt 0)

10+ Year Member



Obviously there is no way to completely ensure unique clicks/views, but...

What means do you use in your scripts other than tracking unique IP addresses to attempt to filter unique clicks/views?

Some argue that tracking by IP is risky -- people connecting through a proxy server or a corporate gateway (where hundreds of private IPs sit behind one public IP) will be treated as 1 person.

Some people just use sessions, but then, how would you deal with someone who doesn't accept cookies or deletes/edits the cookie(s) you are setting?

If a user clicks any given ad and you record his click in a cookie/via a session, how long before you would count a click again from him for that ad (a day, a month, etc.)?

Do you check user-agent to filter out any bots/crawlers? Obviously you can fake it, but every "fake" click you can filter out makes it worth it.

When dealing with CPC advertising I think it's better to have a low estimate of "unique" clicks as opposed to an over estimate, but that's just my opinion.

I'm not currently trying to implement something like this, but I may be in the near future and I'm honestly just curious how people have specifically confronted this issue.

moheybee

7:58 pm on Dec 13, 2004 (gmt 0)

10+ Year Member



I am surprised by the lack of responses on this forum and others. Is this something most people don't have to deal with or worry about?

jatar_k

8:24 pm on Dec 13, 2004 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



Thing is, not many people use custom stats analyzers or tracking; there are just too many packages available for free or for purchase.

I have multiple tools that I built and use: some for keyword analysis, custom tracking throughout my site, and some very customized traffic analysis tools. I leave all the gross traffic numbers to Webtrends.

"Unique" is never really a very good number in regards to traffic because there is always AOL who doesn't really have many external ip's for the sheer volume of users that come through on them. Also the corporate environment as you mentioned.

>> When dealing with CPC advertising I think it's better to have a low estimate of "unique" clicks

only if you are the one buying, not so much if you are selling.

>> Do you check user-agent to filter out any bots/crawlers?

You could use user agent and IP together; there are enough databases of known spiders around.
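To make the idea concrete, here is a minimal sketch of filtering by user agent and IP against a known-spider list. The bot fragments and IP prefix below are illustrative assumptions, not a real spider database -- in practice you would load them from a maintained list:

```python
# Illustrative only: real deployments would load these from a
# maintained spider database, not hard-code them.
KNOWN_BOT_UA_FRAGMENTS = ["googlebot", "slurp", "msnbot", "crawler", "spider"]
KNOWN_BOT_IP_PREFIXES = ["66.249."]  # hypothetical example of a crawler range

def is_probable_bot(user_agent, ip):
    """Return True if the request looks like a known spider."""
    ua = (user_agent or "").lower()
    if any(fragment in ua for fragment in KNOWN_BOT_UA_FRAGMENTS):
        return True
    return any(ip.startswith(prefix) for prefix in KNOWN_BOT_IP_PREFIXES)
```

Clicks that fail this check would simply be logged but excluded from billable counts.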

>> not currently trying to implement

What specifically are you looking at? Are you talking about an ad-click program? A full PPC engine? Your question is a little vague, and there are so many intricacies to tracking all the various types of traffic on a site and drawing all that info together.

moheybee

9:46 pm on Dec 13, 2004 (gmt 0)

10+ Year Member



Thanks for your response.

There are two things specifically I may be dealing with in the near future:

1) Tracking, as accurately as is possible, the number of daily "unique" views for each product listed on a website.

2) Creating a completely automated PPC engine, as in, advertisers pay by credit card and little interaction is needed. For this, I am curious not only how you would handle filtering out duplicate clicks, but also how you would define a duplicate click -- both for yourself and for your advertisers (which these days is a legitimate concern of most). For example, would you "want" to count two clicks from the same individual if they were made more than 24 hours apart, or would more time need to pass? What cutoff would you use?

Even from a seller's perspective I believe a lower estimate is better than one that could overshoot the actual number. That is just my opinion, though, and it could always change.

I appreciate your insight.

cfx211

5:54 pm on Dec 15, 2004 (gmt 0)

10+ Year Member



I would suggest reading through this thread, because a lot of the same concepts apply to click tracking, especially in terms of building out an in-house backend.

[webmasterworld.com...]

We use a redirect logging system that happens to write session and user cookie values to the table. The session and user cookie values also get written once to master tables that store important info like user agent and client IP address.

We then have a second table of known automated agents that we use to eliminate bots from those two tables, and then do our PPC counts by joining the redirect table to the cleaned-up cookie tables.
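The scheme described above can be sketched with an in-memory SQLite database. All table and column names here are assumptions for illustration, not the poster's actual schema:

```python
import sqlite3

# Hypothetical schema: a redirect log keyed by user cookie, a master
# table holding user agent and IP per cookie, and a known-agents table
# used to exclude automated traffic from the PPC count.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE redirect_log (click_id INTEGER PRIMARY KEY,
                           user_cookie TEXT, link_id TEXT, clicked_at TEXT);
CREATE TABLE user_master  (user_cookie TEXT PRIMARY KEY,
                           user_agent TEXT, ip TEXT);
CREATE TABLE known_agents (user_agent TEXT PRIMARY KEY);
""")

# Sample data: two humans and one spider click the same ad.
conn.executemany("INSERT INTO user_master VALUES (?,?,?)", [
    ("c1", "Mozilla/4.0", "1.2.3.4"),
    ("c2", "Mozilla/5.0", "5.6.7.8"),
    ("c3", "ExampleBot/1.0", "9.9.9.9"),
])
conn.execute("INSERT INTO known_agents VALUES ('ExampleBot/1.0')")
conn.executemany(
    "INSERT INTO redirect_log (user_cookie, link_id, clicked_at) VALUES (?,?,?)", [
    ("c1", "ad_A", "2004-12-15 10:00:00"),
    ("c2", "ad_A", "2004-12-15 10:01:00"),
    ("c3", "ad_A", "2004-12-15 10:02:00"),
])

# PPC count: join the redirect log to the cleaned-up cookie table,
# excluding any cookie whose user agent is a known automated agent.
billable = conn.execute("""
SELECT COUNT(*) FROM redirect_log r
JOIN user_master u ON u.user_cookie = r.user_cookie
WHERE u.user_agent NOT IN (SELECT user_agent FROM known_agents)
""").fetchone()[0]
```

With the sample data above, `billable` comes out to 2: the spider's click is logged but never billed.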

In terms of what to count, that's a definition for you to decide. I personally think it's a good idea to eliminate counting any additional clicks from a cookie/session/link combination that take place within 10 seconds of the first click.

moheybee

12:19 am on Dec 16, 2004 (gmt 0)

10+ Year Member



cfx211, thanks for the link and insight.

I am very concerned about fraudulent clicks. Would your 10-second check apply to all the ads on the site or just to the specific ad clicked? If an individual tracked by a session clicked a specific ad and then clicked it again a minute later, would you count it?

cfx211

5:54 am on Dec 16, 2004 (gmt 0)

10+ Year Member



I see enough back-and-forth behavior that, while the same cookie/session clicking a link within a minute could be fraud, it's also very likely someone who is using that link as navigation and hit it twice.

For instance, a person on your site sees two ads for widgets. They click on Ad A and look at that site. They then hit the back button and click on Ad B. After seeing the site behind Ad B, they decide that the Ad A website was better. How do they get there? By hitting the back button and clicking on Ad A again.

If you really don't want to count both clicks on Ad A then change your pricing model to be per referred visit and not per click. That way you don't have this problem.
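Per-referred-visit pricing sidesteps the back-button problem by billing each distinct visitor/ad pair once per period, however many times it is clicked. A minimal sketch, assuming a once-per-day definition of "visit" and an illustrative tuple layout:

```python
from datetime import datetime

def count_referred_visits(clicks):
    """Count distinct (visitor, ad, day) combinations.
    clicks: iterable of (visitor_id, ad_id, iso_timestamp) tuples.
    """
    visits = set()
    for visitor, ad, ts in clicks:
        day = datetime.fromisoformat(ts).date()
        visits.add((visitor, ad, day))  # repeat clicks collapse into one visit
    return len(visits)
```

In the back-button scenario above, the two clicks on Ad A and one on Ad B would bill as two visits, not three clicks.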