Forum Moderators: phranque

RewriteCond in httpd.conf more than once

Can I just do an overall, single copy?

peterg22

1:58 pm on Jun 5, 2008 (gmt 0)

10+ Year Member



Folks,

I'm trying to move away from .htaccess and ideally want to put a single copy of what was in there before so that I can apply it to more than one directory at once. This is what I'm using right now:

<Directory />
Options FollowSymLinks
RewriteEngine on
RewriteCond %{REQUEST_METHOD} !^(GET|HEAD|OPTIONS|POST)$
RewriteRule .* - [F]
RewriteCond %{THE_REQUEST} .*prx1.* [NC,OR]
RewriteCond %{THE_REQUEST} ^(GET|HEAD|POST)\ /?http:// [NC]
RewriteCond %{THE_REQUEST} !^(GET|HEAD|POST)\ /?http://(www\.)?mydomainname\.com/
RewriteCond %{THE_REQUEST} !^(GET|HEAD|POST)\ /?http://nnn\.nnn\.1\.nnn/
RewriteCond %{THE_REQUEST} !^(GET|HEAD|POST)\ /?http://192\.168\.nnn\.nnn/
RewriteRule .* - [F]
<Limit GET POST>
Order allow,deny
Allow from all
Deny from env=bad_bot
</Limit>
</Directory>

<Directory "/usr/local/www/data">

... same as above

</Directory>

But can I just do this at the top of httpd.conf so that it'll apply to anything that I add below?

<Directory *>
Options FollowSymLinks
RewriteEngine on
RewriteCond %{REQUEST_METHOD} !^(GET|HEAD|OPTIONS|POST)$
RewriteRule .* - [F]
... etc
</Directory>

jdMorgan

9:48 pm on Jun 5, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Have you tried it? Moving the directives from within your <Directory> containers up into the <VirtualHost> container should work. Check the "Context" listed for each and every directive you want to use; it will save you time.
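
Something along these lines should work at the virtual-host level (untested sketch only -- the ServerName is just a placeholder, and the DocumentRoot is taken from your second container):

<VirtualHost *:80>
ServerName www.mydomainname.com
DocumentRoot "/usr/local/www/data"

Options FollowSymLinks
RewriteEngine on
RewriteCond %{REQUEST_METHOD} !^(GET|HEAD|OPTIONS|POST)$
RewriteRule .* - [F]
# ... the rest of your RewriteCond/RewriteRule lines go here unchanged
</VirtualHost>

Note that "RewriteEngine on" has to be repeated inside each <VirtualHost>; mod_rewrite settings from the main server config are not inherited by virtual hosts unless you add "RewriteOptions Inherit".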

One note on your "bad-bot" deny: Do NOT deny bad-bots from fetching either your robots.txt file or your custom 403 error page (if you use one) -- If you do, you will create a logical problem in the first case, and a server loop in the second.

For a bot to be judged fully "bad", you have to let it fetch robots.txt so that it can demonstrate that it either won't fetch it or won't abide by it. And if you block access to your custom 403 error document, then when a denied client requests a page and gets the 403 response, serving the 403 error page itself generates another 403 because access to that page is also denied, and you have a loop.

You can use mod_setenvif to set a second variable based on the requested URI, which you can then use to override the "Deny from env=bad_bot" if you use "Order Deny,Allow" instead of the other way round:


SetEnvIf Request_URI "^/robots\.txt$" always-allow
SetEnvIf Request_URI "^/custom-403-page\.html$" always-allow
Order Deny,Allow
Deny from env=bad_bot
Allow from env=always-allow
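
For completeness (again only a sketch -- the User-Agent strings below are just examples, not your actual rules): the custom 403 page is declared with ErrorDocument, and the bad_bot variable itself is normally set earlier with SetEnvIfNoCase lines keyed on the User-Agent:

ErrorDocument 403 /custom-403-page.html
SetEnvIfNoCase User-Agent "libwww-perl" bad_bot
SetEnvIfNoCase User-Agent "WebCopier" bad_bot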

Jim

peterg22

10:05 pm on Jun 5, 2008 (gmt 0)

10+ Year Member



Ah, thanks for that - I haven't used <VirtualHost> before but will give it a try.

Thanks also for the robots.txt info - I've been under a mass attack lately and have been a bit trigger-happy with my banning...