Where to Place WP Related htaccess Directives
# HTACCESS DEFENSE
<FilesMatch ^\.ht>
Require all denied
</FilesMatch>
# WP CONFIG and XMLRPC DEFENSE
<FilesMatch ^(wp-config|xmlrpc)\.php>
Require all denied
</FilesMatch>
"above" or "outside" the public_html directory's htaccess file
<Files "robots.txt">
Order Deny,Allow
Allow from all
</Files>
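In 2.4 terms the same allowance would read:

<Files "robots.txt">
Require all granted
</Files>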
or

<FilesMatch "\.(js|txt|xml)$">
Header set X-Robots-Tag "noindex"
</FilesMatch>
Options -Indexes +Includes
(same for all sites, but it didn't work at the higher level)

SSIErrorMsg "<!-- SSI error -->"
(don't remember if I even tried this on the higher level, but it made intuitive sense to put it right after the +Includes statement)

AddType text/html .html
AddOutputFilter INCLUDES .html .php
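With the INCLUDES output filter in place, .html and .php pages can pull in shared fragments via SSI; a minimal sketch (the include path is invented):

<!--#include virtual="/ssi/header.html" -->

If an include fails, the SSIErrorMsg comment above is what gets emitted instead of the default error text.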
AddOutputFilterByType DEFLATE text/css text/javascript
AddCharset UTF-8 .html .php .js .css
(Heh, I seem to say that last thing twice-- in both htaccess files-- but I think it works in both places.)

ExpiresActive On
ExpiresDefault blahblah
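Spelling the placeholder out with invented values, not the poster's actual settings:

# test site: nothing is cached, so changes show up immediately
ExpiresDefault "access plus 0 seconds"
# a live site might instead keep images around for a month
ExpiresByType image/png "access plus 1 month"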
(Again, site-specific stuff for various types of files: for example, the test site has instant expiration so I never need to refresh/reload after making changes.)

ErrorDocument 403 blahblah
(This overrides the host's built-in ErrorDocument directives-- but by using the same filename while changing the error documents' location, I can still use the host's built-in "Allow from all" directive that lets everyone get the 403 page. I haven't seen that directive, of course, but I have to assume it's in the config file.)

<Files "banner-icon.png">
Order Deny,Allow
Allow from all
</Files>
(This file is only used by one site, but I want all humans to be able to see it even if it's invoked by the 403 page.)

AllowOverride on
.htaccess files should be used in a case where the content providers need to make configuration changes to the server on a per-directory basis, but do not have root access on the server system.
There are two main reasons to avoid the use of .htaccess files.
The first of these is performance. When AllowOverride is set to allow the use of .htaccess files, httpd will look in every directory for .htaccess files. Thus, permitting .htaccess files causes a performance hit, whether or not you actually even use them! ....
And so, for each file access out of that directory, there are 4 additional file-system accesses, even if none of those files are present. (Note that this would only be the case if .htaccess files were enabled for /, which is not usually the case.)....
The second consideration is one of security. You are permitting users to modify server configuration, which may result in changes over which you have no control.
<Directory /blahblah/onesite>
AllowOverride blahblah
</Directory>
<Directory /blahblah/othersite>
AllowOverride blahblah
</Directory>
immediately followed (in config) by

<Directory /blahblah/onesite/*>
AllowOverride none
</Directory>
<Directory /blahblah/othersite/*>
AllowOverride none
</Directory>

(Apache only accepts AllowOverride in plain <Directory> sections: shell wildcards are fine, but regex containers like <DirectoryMatch> would be rejected at startup.)
so the server only has to check one place. But it's only worth it if each of your sites contains many deeply nested directories, none of which would ever need an htaccess of its own for things like allowing auto-indexing in one directory, or setting a different expiration time or custom 404 page.
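Spelled out with invented paths:

<Directory /home/account/public_html/onesite>
AllowOverride All
</Directory>
<Directory /home/account/public_html/onesite/*>
AllowOverride None
</Directory>

httpd then reads onesite's own htaccess but never even looks for one inside wp-admin, wp-content, or anything deeper.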
2 existing htaccess files for each WP site
RewriteCond %{HTTP_USER_AGENT} ^-?$
RewriteRule ^ - [F]
BrowserMatch ^-?$ keep_out
Deny from env=keep_out
replacing "Deny from" with whatever locution 2.4 prefers. I kinda think the simplest way, if you're blacklisting rather than whitelisting, is to collect all the things you don't want, and shove them inside a <RequireNone> envelope, which can go anywhere. So Require env keep_out
only don't quote me, because I don't have any personal experience with 2.4.
...whatever locution 2.4 prefers...
Require all denied
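Putting those two answers together, a 2.4 rendering of the BrowserMatch pair above might be (a sketch only-- untested, per the caveat above):

BrowserMatch ^-?$ keep_out
<RequireAll>
Require all granted
<RequireNone>
Require env keep_out
</RequireNone>
</RequireAll>

The <RequireNone> envelope can only veto; it's the Require all granted line that lets everyone else through.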
For example, we had a customer who received an attack that was 500Gbps. This is highly uncommon and was reported as the largest DDoS in history - CloudFlare mitigated the attack for our customer.
I have configured my VPS to run mod_security
Shri, since you raised the topic, would you care to share a bit of your experience? Do hack attempts manage to penetrate CF's "defenses"? Has CF had times when its network / solution slowed things down a bit? Has CF been hit with a DDoS attack in retribution? Has using a CDN created new issues for you or for others? Such as?
A user typically connects to Cloudflare, which examines the request and sends it over to your server if the response is not cached. Most pages created by CMS-driven sites are not cacheable by default - some work has to be done to achieve this. We send headers and set rules that make Cloudflare cache WordPress pages for a few minutes.
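The origin-side half of that can be as simple as a Cache-Control header (a sketch, assuming mod_headers; the five-minute figure echoes the quote above, and Cloudflare still needs a matching rule such as "Cache Everything" before it will hold HTML):

<IfModule mod_headers.c>
Header set Cache-Control "public, max-age=300"
</IfModule>

Applied like this it covers every response, so in practice you'd scope it with <FilesMatch> or similar.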
If you're a newbie running on shared hosting / VPS and don't know how to admin servers / Apache and all that stuff, consider Cloudflare as a front end.