Ban certain IPs from auto-downloading my website while still allowing good bots like Googlebot to crawl it
Imaster msg:3289508 12:33 pm on Mar 22, 2007 (gmt 0)
I have a huge website which is dynamically generated. I have noticed over a period of time that web downloaders and similar tools have been aggressively downloading my website, enormously increasing the load on the server and slowing down my site.
I would like to automatically add such IPs to my .htaccess file with a "deny from xx.xx.xx.xx" statement.
But I would still like to allow specific bots like Googlebot to keep crawling the site.
How do I do this?
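For reference, the "deny from" approach looks like this in .htaccess (a minimal sketch for the classic Apache 1.3/2.x access syntax; the IP addresses below are placeholders, not real offenders):

```apache
# Allow everyone by default, then deny individual abusive IPs.
# Replace the placeholder addresses with the IPs you want to ban.
Order Allow,Deny
Allow from all
Deny from 203.0.113.45
Deny from 198.51.100.0/24
```

An automated script would append new `Deny from` lines to this block as offenders are detected.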
jdMorgan msg:3289575 1:16 pm on Mar 22, 2007 (gmt 0)
Two suggestions from previous threads here at WebmasterWorld:
Install AlexK's modified version of xlcus' PHP script to ban runaway crawlers [webmasterworld.com]. This script detects excessively fast consecutive page requests.
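The core idea of that rate-limiting approach can be sketched as follows. This is a hypothetical Python illustration of the technique, not the actual PHP script, and the window and threshold values are made up:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # look-back window (illustrative value)
MAX_REQUESTS = 20     # requests allowed per window (illustrative value)

_hits = defaultdict(deque)   # ip -> timestamps of recent requests
banned = set()

def record_request(ip, now=None):
    """Record one page request; return True if the IP should be banned."""
    if ip in banned:
        return True
    now = time.time() if now is None else now
    q = _hits[ip]
    q.append(now)
    # Drop timestamps that have fallen out of the look-back window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) > MAX_REQUESTS:
        banned.add(ip)   # a real script would append a "deny from" line here
        return True
    return False
```

A real deployment would persist the ban (for example by appending to .htaccess), but the sliding-window count above is the heart of detecting "excessively fast consecutive page requests".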
Key_master's bad-bot script [webmasterworld.com] (Perl), or birdman's PHP version [webmasterworld.com] of it. These scripts trap malicious visitors based on robots.txt violations.
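The robots.txt-violation trap works by disallowing a URL that no well-behaved crawler will ever fetch, then banning any client that requests it. A minimal sketch of the idea, in Python rather than the original Perl/PHP, with a hypothetical trap path:

```python
# robots.txt would contain:
#   User-agent: *
#   Disallow: /bot-trap/
# and a page links to /bot-trap/ invisibly; compliant bots never fetch it.

TRAP_PATH = "/bot-trap/"   # hypothetical honeypot path
banned_ips = set()

def handle_request(path, ip):
    """Ban any client that requests the disallowed trap URL; 403 banned IPs."""
    if path.startswith(TRAP_PATH):
        banned_ips.add(ip)   # a real script would append a deny rule instead
        return 403
    if ip in banned_ips:
        return 403
    return 200
```

Because the trap URL is disallowed in robots.txt, only clients that ignore robots.txt ever trip it.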
Adding exclusions to avoid banning *any* major 'bot is a good idea.
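A simple way to implement that exclusion is to check the request's user-agent against a whitelist before adding any ban. Note that user-agent strings can be spoofed; the robust check is a reverse-then-forward DNS lookup on the requesting IP, which is omitted from this sketch. The token list below is illustrative:

```python
# Substrings identifying major bots we never want to auto-ban (illustrative).
WHITELISTED_BOT_TOKENS = ("googlebot", "slurp", "msnbot")

def should_ban(user_agent):
    """Return False for whitelisted major bots, True for everything else."""
    ua = (user_agent or "").lower()
    return not any(token in ua for token in WHITELISTED_BOT_TOKENS)
```

The auto-banning script would call this check before writing a `deny from` line, so a misfiring rate limit cannot lock out a major search engine.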
Imaster msg:3289613 1:27 pm on Mar 22, 2007 (gmt 0)
Wow, man. You are awesome. Thanks, I will check it out.