
Combining .htaccess files

making one file out of 2

Conard

4:18 pm on Aug 27, 2004 (gmt 0)

10+ Year Member



This may sound strange, but I have a couple of .htaccess files: one in the root for blocking user agents and another in an image directory to block hot-linking.
Here are some examples:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
# A whole load of other agents
RewriteCond %{HTTP_REFERER} ^http://www.iaea.org$
RewriteRule ^.* - [F,L]

and this:

RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^htt(p|ps)://(www\.)?mysite.com.*$ [NC]
RewriteRule .(gif|jpg)$ [mysite.com...] [R,L]

If I want to combine these files, would it look something like this:

RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
# A whole load of other agents
RewriteCond %{HTTP_REFERER} ^http://www.iaea.org$
RewriteRule ^.* - [F]
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^htt(p|ps)://(www\.)?mysite.com.*$ [NC]
RewriteRule .(gif|jpg)$ [mysite.com...] [R,L]

TIA.........

jdMorgan

4:41 pm on Aug 27, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yes, that's fine, but let me suggest a few tweaks to the second rule:

RewriteCond %{HTTP_REFERER} .
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?mysite\.com [NC]
RewriteRule \.(gif|jpg)$ /another-image.gif [L]

First, "." (which matches any single character) is equivalent to "!^$" (not blank), but shorter.
Second, "https?" is more efficient than "htt(p|ps)" -- the question mark simply makes the "s" optional, and accomplishes exactly the same thing.
Third, periods meant as literal characters (e.g. in the "mysite.com" of the 2nd RewriteCond) should be escaped as "\." -- otherwise they match any character.
Fourth, there is no need to end-anchor the referrer, and it's more efficient not to. So leave off ".*$".
Fifth (and ignoring the second instance of an unescaped period), using an internal rewrite instead of an external redirect to serve your alternate image to hotlinkers has two advantages: it's more efficient, and it does not require the cooperation of the client, who is free to ignore a redirect.
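To see the difference, here is what the external-redirect version of the same rule would look like (using your mysite.com placeholder):

# External redirect (for comparison): sends the client a 302
# pointing at the substitute image
RewriteRule \.(gif|jpg)$ http://www.mysite.com/another-image.gif [R,L]

Because the substitution is a full URL with the [R] flag, the server answers with a 302 and leaves it to the hotlinking browser to fetch the substitute image -- an extra round trip, and one the client can simply skip. The internal rewrite above just serves /another-image.gif directly in place of the requested file.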

For the first rule, you might also consider using something like this:


RewriteRule !^(robots\.txt|custom_403_page\.html)$ - [F]

This allows all user-agents to fetch your robots.txt file, which gives them fair warning if you block them. It also prevents an infinite 403 loop after a forbidden response is returned: if your custom 403 page (if you use one) cannot itself be fetched, the forbidden client will get a second 403 error trying to fetch that page... and a third trying to fetch it again... making a mess of your server log.
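Putting that together, the user-agent-blocking part of your combined file might end up looking something like this -- "custom_403_page.html" is just a stand-in name; if you use a custom error page, match it to whatever your ErrorDocument directive points at:

RewriteEngine On
# Block listed agents and the unwanted referrer, but always allow
# robots.txt and the custom 403 page itself to be fetched
RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
# A whole load of other agents
RewriteCond %{HTTP_REFERER} ^http://www\.iaea\.org$
RewriteRule !^(robots\.txt|custom_403_page\.html)$ - [F]
# Hypothetical custom 403 page; the filename must match the
# exclusion in the RewriteRule above
ErrorDocument 403 /custom_403_page.html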

Hope this helps!

Jim

Conard

4:50 pm on Aug 27, 2004 (gmt 0)

10+ Year Member



Thanks again JD, I'll do some re-working.