
php to .htaccess

ip logging

tanx

7:54 pm on May 27, 2007 (gmt 0)

10+ Year Member



Hi!

I log all visits to my site in a MySQL database, and if a visitor requests more than, say, 50 pages within a 24-hour time frame, the script below writes to .htaccess, denying him further access:

$result = mysql_query("SELECT ip, visits FROM iplogging
    WHERE ip = '" . mysql_real_escape_string($ip) . "'");
$row = mysql_fetch_array($result);
if ($row && $row['visits'] >= 50) {
    // append a deny rule for this visitor's IP to .htaccess
    $filename = ".htaccess";
    $fp = fopen($filename, 'a');
    fwrite($fp, "deny from $ip\r\n");
    fclose($fp);
}

There are, however, some IP addresses that I'd like to whitelist (e.g. Google, Slurp, myself) - how can this be done most effectively? How would I change the script above so that it doesn't tell .htaccess to blacklist Google IP addresses in the range 66.249.64.0 - 66.249.95.255?

Thanks for your help.

Best
Tanx

jatar_k

12:19 pm on May 28, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



you could have a table that contains all the whitelisted ips

check against that table first
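
something like this, off the top of my head - untested, and it assumes a table called whitelist with one ip per row:

// untested sketch - the whitelist table name and ip column are assumptions
$result = mysql_query("SELECT ip FROM whitelist
    WHERE ip = '" . mysql_real_escape_string($ip) . "'");
if (mysql_num_rows($result) == 0) {
    // not whitelisted - only now run the visit counting / .htaccess code
}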

joelgreen

12:32 pm on May 28, 2007 (gmt 0)

10+ Year Member



Check the user agent string instead of the IP to see if it is Google, Yahoo, etc.
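
For example (a rough, untested sketch - this header is trivial to fake, so treat it as a hint rather than proof):

// crude check on the user agent header
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (stripos($ua, 'Googlebot') !== false || stripos($ua, 'Slurp') !== false) {
    // claims to be a known crawler - skip the blocking code
}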

jatar_k

12:46 pm on May 28, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



a combination is best; user agent strings are spoofed all the time
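
i.e. something along these lines (untested) - require both signals to agree before trusting the visitor:

// rough idea: the user agent must look like a crawler AND the ip must
// already be on your whitelist before you skip the blocking code
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$uaLooksLikeCrawler = (stripos($ua, 'Googlebot') !== false
    || stripos($ua, 'Slurp') !== false);
$ipIsTrusted = false; // set this from your whitelist table lookup
if ($uaLooksLikeCrawler && $ipIsTrusted) {
    // both agree - treat as a real crawler and skip the checks
}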

tanx

8:06 pm on May 28, 2007 (gmt 0)

10+ Year Member



Hey guys

Thanks for your thoughts
I have a number of regularly returning visitors that present a fake Google user agent string:

Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

All they do is spam my online mail form with useless garbage and junk. I would hate to include these IPs in my whitelist based on the user agent alone - therefore I opted for the IP solution a long time ago.

The optimal solution for me would be to handle the IP range '66.249.64.0 - 66.249.95.255' inside the PHP script above itself. Can this be done somehow?

Best regards,
Tanx

jatar_k

9:23 pm on May 28, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member



as I said, keep a list of the whitelisted IPs and check the visitor's IP against that first; if it is in that list, skip your visit checking code
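
and for a whole range like the google one you mentioned, you don't need a row for every address - ip2long() turns a dotted quad into an integer you can compare. rough, untested sketch:

// untested - ip2long() returns false on a malformed address (PHP 5)
$long       = ip2long($ip);
$rangeStart = ip2long('66.249.64.0');
$rangeEnd   = ip2long('66.249.95.255');
if ($long !== false && $long >= $rangeStart && $long <= $rangeEnd) {
    // inside the google range - treat as whitelisted and skip the checks
}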

barns101

10:13 pm on May 28, 2007 (gmt 0)

10+ Year Member



One crude solution would be to list the whitelisted IPs in an array and then use in_array() [php.net] to check the visitor's address.
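
For example (rough and untested - fine for single addresses, though it won't cover a whole range like Google's on its own):

// the addresses here are placeholders - substitute your own list
$whitelist = array('66.249.66.1', '127.0.0.1');
if (in_array($ip, $whitelist)) {
    // known-good visitor - skip the blocking code
}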

londrum

7:55 pm on May 29, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



there is a great script in webmasterworld's library already which will do everything that you need (click library at the top of the page, whilst inside the php forum, and the thread in the library is called 'Blocking Badly Behaved Bots #3') -- and you don't even have to log stuff in a database.

you just have to put some code at the top and bottom of every page, and then it writes a 0-byte file into a directory, which it checks against the user's IP. you can amend the settings to block visitors out after however many page views (and let them back in after however many minutes or hours, if you want). you can also set it to allow so many page views within however many seconds. so there are two ways to block them.

and... it's got a whitelist too. it already contains the values to allow googlebot, slurp and all the others, and it is a simple matter to add new ones as needed.

it is a very good little script. you should check it out. and because it's written by the people here at webmasterworld, you shouldn't have any problem getting help with it if it plays up.