I currently check my access logs daily and manually block the IPs I find up to mischief, using .htaccess like this:
<Files *>
order deny,allow
# China
deny from 14. 27.8.0.0/13
# etc....
</Files>
That entails a delay of up to 24 hours, leaving the site exposed to nasties until I wake up and check.
I was wondering if you think it wise to try to automate this IP deny process, and if so how would you go about it?
My concern is that any such mechanism would involve auto-writing to my .htaccess file, and we all know how the slightest error there invokes a 500 and brings the whole site down, so the process would have to be dependable and bomb-proof. My other concern is giving my site's password to a script so that it can rewrite the .htaccess.
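On the 500 risk: one pattern that avoids in-place edits is to rebuild only a marked block of .htaccess in a temp file, sanity-check the ban list first, and then swap the file in with mv, which on the same filesystem is all-or-nothing, so Apache never sees a half-written file. This is only a sketch under my own assumptions: the marker comments, the `banned_ips.txt` file, and where you place the markers are all illustrative, not a standard, and the demo setup at the top is just so the sketch runs standalone.

```shell
#!/bin/sh
# Demo setup so the sketch runs standalone; on the live site these
# files would already exist.  All names here are illustrative.
printf '%s\n' 'order deny,allow' '# BEGIN auto-deny' '# END auto-deny' > .htaccess
printf '%s\n' '1.2.3.4' '27.8.0.0/13' > banned_ips.txt

HTACCESS=.htaccess
BANLIST=banned_ips.txt         # one IP, partial IP, or CIDR per line
TMP="$HTACCESS.tmp.$$"

# 1. Refuse to touch anything if the ban list holds an entry that is
#    not digits/dots with an optional /prefix -- one malformed
#    "deny from" line is exactly what triggers the 500.
if grep -Evq '^[0-9][0-9.]*(/[0-9]+)?$' "$BANLIST"; then
  echo "malformed entry in $BANLIST, aborting" >&2
  exit 1
fi

# 2. Copy everything outside the marker block into a temp file,
#    then regenerate the block from the ban list.
awk '/^# BEGIN auto-deny/{skip=1} !skip{print} /^# END auto-deny/{skip=0}' \
    "$HTACCESS" > "$TMP"
{
  echo '# BEGIN auto-deny'
  sed 's/^/deny from /' "$BANLIST"
  echo '# END auto-deny'
} >> "$TMP"

# 3. Atomic swap: readers get either the old file or the new one,
#    never a partial write.
mv "$TMP" "$HTACCESS"
```

Run from cron (or triggered by a trap script) this never needs your site password in a web-facing script, since it runs locally against the filesystem; the validation step means a garbage entry aborts the run rather than taking the site down.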
I know some of you do this with confidence in various bot-traps posted here at WebmasterWorld, but I'm still wary of the complicated bot-trap methods.
The main IPs I want to auto-block belong to visitors who try copying pages to email or MS FrontPage.
For example, I currently block some email copiers and MS FrontPage with:
RewriteEngine On
RewriteCond %{HTTP_REFERER} compose [NC,OR]
RewriteCond %{REQUEST_METHOD} ^(OPTIONS|TRACE|DELETE|TRACK) [NC]
RewriteRule ^(.*)$ - [F,L]
This blocks them, but I see in my logs they try repeatedly to save the page, and of course it is still in their cache.
So, I'd like to auto-block them by IP as soon as they invoke OPTIONS or a compose referrer, AND somehow cause their cache to refresh so the full-text page is flushed out and replaced by the 403 Forbidden page they received when they were auto-blocked.
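One caveat on the cache part: nothing you send later can reach into a copy a visitor's browser has already stored. The closest you can get is to mark pages non-cacheable up front, so the copier's next fetch (which now hits your 403) replaces the stored copy instead of being served from cache. A sketch, assuming mod_headers is enabled on your shared host:

<IfModule mod_headers.c>
# Discourage client-side caching so a banned visitor's next
# request replaces the stored page with the 403 response.
Header set Cache-Control "no-store, no-cache, must-revalidate"
Header set Pragma "no-cache"
</IfModule>

The trade-off is that legitimate repeat visitors also lose caching, so you may want to limit this to the pages that are actually being scraped.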
I'd prefer a non-php method, unless that is the ideal route.
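If your host permits CGI, one non-PHP route is to point the RewriteRule at a small shell CGI instead of returning [F] directly: the script records the offender's IP (for the rebuild job to pick up) and answers 403 itself. Everything here is hypothetical naming on my part: `trap.cgi`, the `banned_ips.txt` file, and the demo default for REMOTE_ADDR (which the server normally sets for CGI) exist only so the sketch runs from a terminal.

```shell
#!/bin/sh
# Hypothetical trap.cgi -- the name and ban-list file are my own.
# The web server sets REMOTE_ADDR for CGI; the demo default below
# lets the sketch run outside the server too.
REMOTE_ADDR="${REMOTE_ADDR:-203.0.113.9}"
BANLIST=banned_ips.txt

# Record the IP once.  flock isn't reliably available on shared
# hosts, so this tolerates a rare duplicate line instead of locking.
grep -qxF "$REMOTE_ADDR" "$BANLIST" 2>/dev/null ||
  echo "$REMOTE_ADDR" >> "$BANLIST"

# Minimal CGI response: the offender gets the 403 immediately.
printf 'Status: 403 Forbidden\r\n'
printf 'Content-Type: text/plain\r\n'
printf '\r\n'
printf 'Forbidden\n'
```

The corresponding rule change would be something like `RewriteRule ^(.*)$ /trap.cgi [L]` in place of the bare [F], with a condition excluding trap.cgi itself so the script can't loop.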
Btw: Shared server, so no access to server config file.
Good or bad idea, too risky? Your thoughts please.