Forum Moderators: phranque
It's the files section
Here's what I have one line above the section:
order deny,allow
<Files ~ "\.htaccess$">
deny from all
</Files>
<Limit GET POST>
</Limit>
<Limit PUT DELETE>
deny from all
</Limit>
<Files *>
header append X-robots-tag "noarchive"
deny from (long list fully obfuscated)
deny from env=ban
</Files>
Am I correct that there can only be one Files section, and that anything I want to do with files has to be contained in it? Obviously I want to protect .htaccess, and I also want to deny about a dozen persistent bad actors. Is the above structure incorrect? If so, what should I do to get things on track?
Probably pull the Limit out of the middle. I'm also wondering whether the "Deny from env=ban" needs to be inside the Files section, or whether I need several of them (since my SetEnvIf statements are working with two in place).
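For reference, here's a minimal sketch of how "Deny from env=ban" pairs with SetEnvIf. The user-agent patterns and the "ban" variable name are only placeholders inferred from the snippet above; substitute your own:

```apache
# Hypothetical patterns -- replace with your own bad-actor signatures.
SetEnvIfNoCase User-Agent "BadBot"      ban
SetEnvIfNoCase User-Agent "EvilScraper" ban

Order deny,allow
# Denies any request for which the "ban" variable was set above.
Deny from env=ban
```

A single "Deny from env=ban" is enough no matter how many SetEnvIf lines set the variable; each SetEnvIf that matches simply sets the same "ban" flag.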
I think you can simplify somewhat, and then add allowances for robots.txt and your custom 403 error page (if you have one). Neither of those should ever be restricted, to avoid two problems: having your server repeatedly hammered by a robot that should be Disallowed by robots.txt, but that is coming from a banned IP address range, cannot fetch robots.txt, and so doesn't know it is Disallowed from spidering; or having your server attempt to serve a 403 response using your custom 403 error page, only to encounter a second 403 error because the 403 page itself is denied, and then another, and another, and so on. (I call these two cases "self-inflicted denial-of-service attacks.")
Try something like:
Order deny,allow
#
<Files ~ "\.htaccess$">
Deny from all
</Files>
#
<Limit PUT DELETE>
Deny from all
</Limit>
#
Deny from (long list fully obfuscated)
Deny from env=ban
#
<Files ~ "^(robots\.txt|your-custom-403-error-page-if-any\.html)$">
Allow from all
</Files>
#
Header append X-Robots-Tag "noarchive"
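If you do use a custom 403 page, you would pair the Allow above with an ErrorDocument directive so Apache knows where that page lives. This is just a sketch using the placeholder filename from the ruleset above; use your page's real path:

```apache
# Serve a custom page for 403 (Forbidden) responses.
# The path is a placeholder -- point it at your actual error page.
ErrorDocument 403 /your-custom-403-error-page-if-any.html
```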
Jim