
Error document for status 403

adb64

8:25 pm on Feb 21, 2006 (gmt 0)

10+ Year Member



In my .htaccess I block access to my site for certain spiders and IP addresses. When one of these spiders or IP addresses requests a page, a 403 is returned. I also have a 403 error document, but it isn't served to these spiders and IP addresses.

Below is the relevant part of my .htaccess file:


# Set error documents

ErrorDocument 403 /error403.php

# block certain IP addresses

deny from 81.247.***.***
deny from 80.200.***.***

# block certain user agents

SetEnvIfNoCase User-Agent "^BAD_SPIDER1|^BAD_SPIDER2" AC_FORBIDDEN
Order Allow,Deny
Allow from all
Deny from env=AC_FORBIDDEN

When any of the blocked UAs or IPs requests a page, the default Apache error page is returned instead of my own 403 document. The returned error page carries an extra message:


Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.

How can I return my own error page? Of course this only matters for the blocked IP addresses, not for the blocked UAs.

Thanks,
Arjan

stapel

9:19 pm on Feb 21, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I think the problem is that you're denying access last, and denying to the entire site (which includes your custom 403 page).

Instead, have a secondary environment, called something like "allowbad", in which you allow the "bad guys" to view things like your "terms of use" page and your custom 403 page. So you'd have something like:

    # Ban known "bad guy" IPs and domains:
    SetEnvIf Remote_Addr ^ww\.xx\.yyy\.zzz$ badguy
    SetEnvIf Remote_Host \.badguy_domain\.com$ badguy

    # Make a list of files the "bad guys" can access:
    SetEnvIf Request_URI "^(/403\.php|/robots\.txt|/terms_of_use\.html)$" allowbad

    # Ban the baddies, except for the files on the list:
    <Files *>
    Order Deny,Allow
    Deny from env=badguy
    Allow from env=allowbad
    </Files>

So only the "bad guys" are banned (from the "deny"), except for the few pages you specify (from the "allow").

This has worked fairly well for me, but "your mileage may vary".
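Putting this together with the ErrorDocument line from the original post, the whole arrangement might look something like the sketch below. This is an illustration, not a drop-in config: the IP prefixes are taken from the original post, and the file list is an assumption you'd adjust to your own site. The key point is that the custom 403 page itself must appear on the allow list, or Apache can't serve it to the blocked visitor and falls back to its default page.

```apache
# Serve a custom page for 403s:
ErrorDocument 403 /error403.php

# Tag the blocked IP ranges (prefixes from the original post):
SetEnvIf Remote_Addr ^81\.247\. badguy
SetEnvIf Remote_Addr ^80\.200\. badguy

# Files the "bad guys" may still fetch --
# the 403 page itself MUST be on this list:
SetEnvIf Request_URI "^(/error403\.php|/robots\.txt)$" allowbad

<Files *>
Order Deny,Allow
Deny from env=badguy
Allow from env=allowbad
</Files>
```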

Eliz.

adb64

10:25 pm on Feb 21, 2006 (gmt 0)

10+ Year Member



Hi Eliz,

Thanks, that works great. The only thing is that my .htaccess also prevents image hotlinking, as follows:


#
# Prevent image hotlinking
#
<FilesMatch "\.(gif|jpg|png)$">
SetEnvIfNoCase Referer ^$ allow_image
SetEnvIfNoCase Referer ^http://(.+\.)?mydomain\.com allow_image
Order Deny,Allow
Deny from all
Allow from env=allow_image
</FilesMatch>

In the .htaccess this comes after the "bad guy" blocking, but it still allows the "bad guys" to retrieve the images.
When I put it before the "bad guy" blocking, the hotlink prevention no longer works. Any ideas how to combine the two?

Thanks,
Arjan

stapel

11:25 pm on Feb 21, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have a separate section, below the "bad guy" blocking section, for hotlink protection. It goes something like this:

    # The following lines let certain parties through the hotlink protection.

    # Allow "blank" referrers through,
    # so people can come from bookmarks, etc.
    RewriteCond %{HTTP_REFERER} !^$

    # Let people see my pages correctly
    # in the search-engine caches.
    RewriteCond %{HTTP_REFERER} !^http://google_cache_IP/.*$ [NC]
    RewriteCond %{HTTP_REFERER} !^http://yahoo_cache_IP/.*$ [NC]

    # Allow "friendlies" through.
    RewriteCond %{HTTP_REFERER} !^http://allow_my_avatar_on_this_forum/.*$ [NC]
    RewriteCond %{HTTP_REFERER} !^http://my_own_domain/.*$ [NC]
    RewriteCond %{HTTP_REFERER} !^http://a_friend's_site/.*$ [NC]

    # Everybody else gets a substitute graphic.
    RewriteRule .*\.(jpg|gif)$ /hotlink.png [R,NC]

(The above is "anonymized". Copy the basic structure, but not the actual coding!)

I think you may need some "escape" characters before the "dots", in order for the above to be completely and technically correct, but the above has worked just fine for me.
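For reference, a tidied-up sketch of that structure with the dots escaped might read as below. Everything here is illustrative: example.com stands in for your own domain, hotlink.png is the assumed name of the substitute image, and the RewriteEngine On line is only needed if it isn't already enabled elsewhere in the file. Note the condition excluding the substitute image itself; without it, the [R] redirect would match its own target and loop.

```apache
RewriteEngine On

# Let empty referrers through (bookmarks, some proxies):
RewriteCond %{HTTP_REFERER} !^$
# Let the site's own pages through (example.com is a placeholder):
RewriteCond %{HTTP_REFERER} !^http://(.+\.)?example\.com/ [NC]
# Don't rewrite the substitute image itself, or the redirect loops:
RewriteCond %{REQUEST_URI} !^/hotlink\.png$
# Everybody else gets the substitute graphic:
RewriteRule \.(gif|jpe?g|png)$ /hotlink.png [R,NC]
```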

Eliz.