Forum Moderators: phranque


Custom 405 page

bad idea?

         

keyplyr

10:26 am on Nov 5, 2003 (gmt 0)

I'm using a custom 403 page. Don't know why, just seemed like a good idea at the time:

ErrorDocument 403 /forbidden.html

I ban lots of UAs with:

RewriteRule !^forbidden\.html$ - [F]
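
(For context, the complete block is roughly this - the UA names below are just placeholders for whatever is actually on the list:)

# placeholder patterns - substitute the real UA list
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BadBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilScraper
RewriteRule !^forbidden\.html$ - [F]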

So now when a UA gets a 403, I get 2 errors: 1 for whatever they requested and another for the custom 403 page they are forbidden to get - LOL

Solution: I guess I'll remove my custom 403 page. Hardly anyone sees it anyway, and those who do see it I don't like.

Question: Do I then change:


RewriteRule !^forbidden\.html$ - [F]

To this?

RewriteRule .* - [F]

<added> correction in subject line, meant to say "403"</added>

Thanks

jdMorgan

3:36 pm on Nov 5, 2003 (gmt 0)

keyplyr,

WHY are they forbidden to fetch the custom 403 page? That is the intent of the exclusion in the RewriteRule - to *allow* them to fetch the custom 403 page.


RewriteRule !^forbidden\.html$ - [F]

The above line allows anyone to fetch "forbidden.html", but redirects all other requests to the 403-forbidden processing.

However, if your custom 403 page is not named "forbidden.html" or is not in the web root directory, then it won't work.
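
For example (just a sketch - the /errors/ path is hypothetical), if the custom 403 page lived in a subdirectory, both the ErrorDocument and the exclusion pattern would need to point at it:

ErrorDocument 403 /errors/forbidden.html
RewriteRule !^errors/forbidden\.html$ - [F]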

Jim

keyplyr

8:00 pm on Nov 5, 2003 (gmt 0)

WHY are they forbidden to fetch the custom 403 page?

Hi Jim,

Well then maybe I'm misinterpreting this:


[Wed Nov 5 05:52:14 2003] [error] [client 211.151.XX.XXX] client denied by server configuration: /www/p/path/htdocs/images/file.gif
[Wed Nov 5 05:52:14 2003] [error] [client 211.151.XX.XXX] client denied by server configuration: /www/p/path/htdocs/forbidden.html

I see now that it's this:


SetEnvIf Remote_Addr ^211\.151\.XX\.XXX$ ban
<Files *>
order deny,allow
deny from env=ban
</Files>

<added>
I'm using these SetEnvIf rules as temporary deterrents until the bad guys go away. So I will just need to tolerate the double errors I guess (hundreds of them though!)
</added>

jdMorgan

8:20 pm on Nov 5, 2003 (gmt 0)

Nah, don't put up with it - fix it!

This is essentially the same fix as used in your RewriteRules, but implemented using SetEnvIf and the allow/deny precedence control provided by the Order directive (Note that this construct will fail badly if "Allow,Deny" is used!).


SetEnvIf Remote_Addr ^211\.151\.XX\.XXX$ ban
SetEnvIf Request_URI ^/(forbidden\.html|file\.gif)$ allowit
<Files *>
Order Deny,Allow
Deny from env=ban
Allow from env=allowit
</Files>
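
To illustrate the warning (a deliberately-broken variant, don't use it): with "Allow,Deny" the default becomes "deny" and Deny overrides Allow, so your banned visitors would be refused even for forbidden.html, and everyone who matches neither environment variable would be locked out of the whole site:

<Files *>
# broken example - shown only to illustrate why Allow,Deny fails here
Order Allow,Deny
Deny from env=ban
Allow from env=allowit
</Files>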

Enjoy, :)
Jim

keyplyr

9:11 pm on Nov 5, 2003 (gmt 0)

Thanks,

So just to be clear (LOL), since there are potentially thousands of files that could be requested, it's not realistic to "allow" each one by name, so I removed that part and am using (example):


SetEnvIf Referer members\.portal\.com/badguy ban
SetEnvIf Remote_Addr ^23\.345\.XXX\.XX$ ban
SetEnvIf Remote_Addr ^12\.123\.XXX\.XX$ ban
SetEnvIf Request_URI ^/forbidden\.html$ allowit
<Files *>
Order Deny,Allow
Deny from env=ban
Allow from env=allowit
</Files>

Note that this construct will fail badly if "Allow,Deny" is used

Really - Ah Ha... never even thought about that before.

Thanks

jdMorgan

10:55 pm on Nov 5, 2003 (gmt 0)

keyplyr,

I don't really understand the bit about thousands of files that could be requested. So, I'll forge ahead, possibly stumbling over my own feet... :)

The code as you re-posted it bans those three bad guys from accessing all files on your site, except that they are permitted to access the forbidden.html page.

The code has no effect whatsoever on anybody who's not on your 'bad guy' list.

Also, I should note that I consider it "only fair" to allow bad guys to access robots.txt as well as the custom 403 pages and anything associated with the 403 pages (like a logo for the 403 page, for example).

I presumed (I think incorrectly) that your "file.gif" was an image or logo that displayed on your forbidden.html page. That may be the cause of my confusion. You will definitely still see the 403-Forbidden errors in your log as these guys try to access files, but you won't get a second 403 on the redirect to forbidden.html for each of those requests.

A trick I use to save bandwidth is to make my custom 403 page tiny. It contains a basic "403-Tough luck, bud" message, and a text link to a secondary 403 page with a lot more info. Bad bots rarely follow that link, so having the small 403 page can save a lot of bandwidth. Bad guys are allowed to access both of my 403 pages, though.
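
In the SetEnvIf terms above, that just means listing both 403 pages (the secondary page name here is only an example) along with robots.txt:

SetEnvIf Request_URI ^/(forbidden\.html|forbidden-info\.html|robots\.txt)$ allowit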

Jim

keyplyr

11:03 pm on Nov 5, 2003 (gmt 0)

Thanks, all is well then :)

I presumed (I think incorrectly) that your "file.gif" was an image or logo that displayed on your forbidden.html page.

Yeah, file.gif was just an example I used to show that for every failed request, there was a subsequent failure to get the forbidden.html page as well.
The way I understood your example led me to think that I was to do that for every file, hence my edit.

My error logs continue, but now without the extra error, thanks again.

<added>
Yes, I agree about allowing all access to robots.txt:

SetEnvIf Referer members\.portal\.com/badguy ban
SetEnvIf Remote_Addr ^23\.345\.XXX\.XX$ ban
SetEnvIf Remote_Addr ^12\.123\.XXX\.XX$ ban
SetEnvIf Request_URI ^/forbidden\.html$ allowit
SetEnvIf Request_URI ^/robots\.txt$ allowit
<Files *>
Order Deny,Allow
Deny from env=ban
Allow from env=allowit
</Files>

I use no logo on any of my custom error pages; I keep them very small, but just large enough that MSIE doesn't replace them with its own "friendly" error pages.
</added>