Forum Moderators: coopster


Improving Click Tracking

How to prevent automated scripts from clicking away


nickCR

2:00 pm on Jun 19, 2008 (gmt 0)

10+ Year Member



Hello All!

I have a new problem. Last night an automated script hit my site and ran up over 240 clicks. The links it clicked will eventually be PPC-based, and if that had been live it would have cost our advertisers money for no reason.

I already check the IP and UserAgent, which are recorded in the DB. If the same IP/UserAgent combination clicks the same ad twice within a 30-minute period, the second click doesn't deduct payment.

The problem with the automated script that hit the site is that it went through all the different links and clicked each one once.

I was thinking I should add some logic to check for excessive clicks within a time period and then somehow "ban" that user. How would you ban them? Just in the PHP, or with a ban in the .htaccess? If the latter, how would you insert the IPs to ban into the .htaccess automatically?
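A minimal sketch of the "excessive clicks in a time period" check. The function name, thresholds, and array-of-timestamps input are all assumptions; in practice the timestamps would come from your existing click-tracking table.

```php
<?php
// Returns true when there are more than $maxClicks clicks within the
// last $window seconds. $clickTimes are Unix timestamps for one
// IP/UserAgent pair; $now is the current time. The default limits
// (20 clicks in 10 minutes) are placeholders -- tune them to your traffic.
function is_excessive(array $clickTimes, int $now,
                      int $window = 600, int $maxClicks = 20): bool {
    $recent = 0;
    foreach ($clickTimes as $t) {
        if ($now - $t <= $window) {
            $recent++;
        }
    }
    return $recent > $maxClicks;
}
```

When it returns true you'd flag the IP in the DB and/or push it into the .htaccess deny list.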

Best Regards,

Nick

cameraman

3:52 pm on Jun 19, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You can perform file operations on an .htaccess file just like any other file. Put an allow line at the bottom of your deny list to serve as a marker (maybe allow from 127.0.0.1).
To add a deny, read the file, explode it on 'allow from 127.0.0.1', append the new deny line to the first exploded element, then reassemble and write the .htaccess back out. explode() will 'swallow' your marker, so you'd reassemble with something like $parts[0] . 'allow from 127.0.0.1' . $parts[1], plus \n characters as appropriate.

d40sithui

4:09 pm on Jun 19, 2008 (gmt 0)

10+ Year Member



If you have access to /etc/hosts.deny, you can place the IP(s) you want banned in there too.
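For reference, a hosts.deny entry is just a daemon list and a client address; the IP below is an example placeholder. Note this only blocks services that consult TCP wrappers, which typically does not include Apache itself.

```
# /etc/hosts.deny
ALL: 192.0.2.10
```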

nickCR

5:04 am on Jun 21, 2008 (gmt 0)

10+ Year Member



Thanks guys for your replies! I'll check those methods out!