Forum Moderators: coopster
Assuming I already have the bad IP to ban (let's call it $ip), how would I write the IP to the appropriate line in .htaccess, and safely?
I'd like to use a relatively simple script like this one:
<?php
$file = $_SERVER['DOCUMENT_ROOT'] .'/.htaccess';
$fp = fopen($file, 'a');
fwrite($fp, "Deny from ".$ip."\n");
fclose($fp);
?>
But again, I'm not sure how safe this is, and it doesn't write the IP to the correct line. Any suggestions? I know the other anti-bot scripts do this, but it is very (needlessly?) complicated and I have not been able to block the IP in the right line. Please help out a newbie trying to defend his sites vulnerable forms! Thanks.
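For reference, a slightly hardened sketch of the same append -- deny_line() and append_ban() are invented names, and it assumes $ip arrives as a plain string -- would validate the address first and take an exclusive lock so two simultaneous requests can't interleave their writes:

```php
<?php
// Sketch only: deny_line() refuses anything that is not a literal
// IPv4/IPv6 address, so nothing unexpected ever reaches .htaccess.
function deny_line($ip) {
    if (filter_var($ip, FILTER_VALIDATE_IP) === false) {
        return false;
    }
    return "Deny from " . $ip . "\n";
}

// Append under an exclusive lock to serialize concurrent writes.
function append_ban($file, $ip) {
    $line = deny_line($ip);
    if ($line === false || ($fp = fopen($file, 'a')) === false) {
        return false;
    }
    flock($fp, LOCK_EX);
    fwrite($fp, $line);
    flock($fp, LOCK_UN);
    fclose($fp);
    return true;
}

// Usage: append_ban($_SERVER['DOCUMENT_ROOT'] . '/.htaccess', $ip);
?>
```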
Instead of writing

fwrite($fp, "Deny from ".$ip."\n");

write

fwrite($fp, "SetEnvIf %{REMOTE_ADDR} \"^".$ip."$\"getout\n");

and put a single static line after those records in .htaccess:

Deny from env=getout
Eliminating the need to parse through many existing .htaccess records to find the insertion point should result in a measurable performance improvement in the 'insert new record' function, and it simplifies both the PHP and the "manual" .htaccess coding.
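As a sketch of what that append looks like in PHP -- setenvif_line() is an invented helper name, using the SetEnvIf Remote_Addr form from the full .htaccess example, with the literal periods escaped because the second argument to SetEnvIf is a regex:

```php
<?php
// Sketch: build the one SetEnvIf line to append for a banned address.
function setenvif_line($ip) {
    // Escape the literal periods so e.g. 66.249.0.101 matches only itself.
    $pattern = str_replace('.', '\\.', $ip);
    return "SetEnvIf Remote_Addr ^" . $pattern . "$ getout\n";
}

// Appending is then a plain one-line write; the matching
// "Deny from env=getout" line in .htaccess never changes.
// fwrite($fp, setenvif_line($ip));
?>
```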
Jim
# The following lines are written by the bad-bot scripts
SetEnvIf Remote_Addr ^66\.249\.***.101$ getout
SetEnvIf Remote_Addr ^211\.231\.**\.13$ getout
#
# ...other config directives...
#
# Block bad-bots using lines written above by bad bot script, but
# always allow robots.txt and 403.html error page to be fetched
SetEnvIf Request_URI "(403\.html|robots\.txt)$" allow
<Files *>
Order Deny,Allow
Deny from env=getout
Allow from env=allow
</Files>
When you get a server error, look at your server error log file. It will often tell you exactly what is wrong.
Note that my script escapes the literal periods in the $ip variable. Octets shown as *** were intentionally obscured to comply with the WebmasterWorld TOS.
Jim
Just a couple of unimportant questions. I have a custom 403 error page, but ironically the banned can't see it--they're forbidden from the forbidden page! I can still send them the message I want by dropping the .html code for the 403 page right into the .htaccess, but it would be cleaner if I could just let the banned see the banned page. Any way to do this?
Also, if the banned go to example.com, it comes up with the "Red Hat Enterprise Linux Test Page", while every other page, even example.com/index.html, comes up with the normal 403 page. Why does this happen, and how can I prevent it?
Also, I've prevented my getout.php page from being spidered through robots.txt. Is there any way a good crawler could still find it, even though I haven't even put in a link to it yet?
Maybe I'm just overthinking things! Anyway, thanks for helping me accomplish my main objective!
I have a custom 403 error page, but ironically the banned can't see it--they're forbidden from the forbidden page!
Did you modify this line
SetEnvIf Request_URI "(403\.html|robots\.txt)$" allow
Did you note this?
Change all broken pipe "¦" characters in code on WebmasterWorld to solid pipes before use; Posting here modifies that character.
No idea why you'd see the default server page, unless it's just a by-product of this first problem.
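For completeness, the two directives involved in letting banned visitors see the error page might look like this in .htaccess -- the /403.html path is an assumption, and note the solid pipe:

```apache
# Serve the custom page on 403s (assumes the page lives at /403.html
# under the document root).
ErrorDocument 403 /403.html

# Mark requests for the error page itself (and robots.txt) as allowed,
# so even "getout" visitors can fetch them. This must use a solid pipe
# "|" -- the board display mangles it into a broken pipe.
SetEnvIf Request_URI "(403\.html|robots\.txt)$" allow
```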
Jim
As for the server page, it has some weird instructions on it referring to a nonexistent file (welcome.conf) in a directory that doesn't exist (/etc/httpd/conf.d/), or at least isn't accessible by FTP; looks like I'll have to talk to my host about that one. Fortunately it's not too pressing.
There's a minor problem in the script that should be corrected lest newbies like myself get hung up on it.
As the code appears:
fwrite($fp, "SetEnvIf %{REMOTE_ADDR} \"^".$ip."$\"getout\n");
As the code should be:
fwrite($fp, "SetEnvIf Remote_Addr \"^".str_replace('.', '\\.', $ip)."$\" getout\n");
The former literally writes "%{REMOTE_ADDR}" to .htaccess, and the missing space before "getout" runs the variable name into the pattern.
EDIT: The script still does not work. With both versions, it bans ALL IPs! There must still be a problem with my .htaccess file, and I've written it exactly as your example, Jim...