Forum Moderators: phranque
If I want a RewriteCond to match domain.com, subdomain.domain.com, and domain.com/filename.ext, what is the best way to do that?
For instance, will the example below work as a "catch all" because it doesn't have a starting anchor or ending anchor?
RewriteCond %{HTTP_REFERER} domain\.com
Or would it have to be something more like this? (Forgive me, I'm rather new to .htaccess files.)
RewriteCond %{HTTP_REFERER} *.domain\.com*.
Please correct me on any syntax issues :)
THANKS!
1) Can I have multiple rules for different referring domains, as illustrated below? and
2) Is it really heavy on system resources to have 3-4 rules like the ones I have below in my .htaccess file?
# Internally rewrite visitors referred by badsite1 to the error403.html page,
# where we will log the IP address and redirect them somewhere else
RewriteCond %{HTTP_REFERER} badsite1\.com
RewriteCond %{REQUEST_URI} !^/fake-error1\.html$
RewriteCond %{REQUEST_URI} !^/fake-error2\.html$
RewriteRule .* /error403.html [L]
#
# Externally redirect visitors referred by badsite2 back to badsite2's own site
# (note: the substitution is not a regex, so the dots are not escaped here)
RewriteCond %{HTTP_REFERER} badsite2\.com
RewriteCond %{REQUEST_URI} !^/fake-error1\.html$
RewriteCond %{REQUEST_URI} !^/fake-error2\.html$
RewriteRule .* http://www.badsite2.com/ [R,L]
"Heavy" is three or four hundred rules... Three or four thousand rules would be "very heavy." :)
How many is too many? That depends on your overall server speed, and how many sites you share it with, so that question is akin to "how long is a piece of string" -- Only you can tell from the evidence at hand.
Just as a point of comparison, if you consider .htaccess directives to be similar to PHP statements (both are interpreted from text to executable routines on-the-fly, rather than being compiled), then .htaccess directives might justifiably be equal in number to PHP statements... And many PHP programs are thousands of lines long.
You'll know you have too many when your server responds sluggishly, and recovers when you trim your .htaccess file down. :)
I'm not sure whether I gave this warning in your previous thread, so I'll repeat it: Do not give too much information when blocking or restricting access; doing so reveals the level of your technical proficiency and invites more-targeted exploits.
For example, don't put "Access Denied: Invalid User-agent" on one of your 403-Forbidden error pages. That tells the scraper he needs to use a valid UA string to get in! Or it tells him that the clever UA string he picked or created to spoof you has a reputation problem, or that he mis-typed it.
Worse yet is to gloat -- "Ha ha! -- I banned you!" This only invites retribution. So, take a look at why you think you need all of these 'special' error pages, and consider if any of these warnings apply...
If you end up with several more similar rule-sets, then take a look at the [S=nn] flag for RewriteRule. It allows you to take the common parts of the similar rule-sets, such as the URL exclusions, and move them into a rule that [S]kips the following rules if it matches. So in this case, if the requested URL-path is fake-error1 or fake-error2, then skip the following two rules, which then no longer need to check those two conditions.
Also, let me introduce the in-line OR. This one RewriteCond line replaces both URL-path-exclusion RewriteConds:
RewriteCond %{REQUEST_URI} ^/(fake-error1|fake-error2)\.html$
RewriteRule .* - [S=2]
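Putting the pieces together, the whole arrangement might look like the sketch below. This is only an illustration using the example hostnames and filenames from your post above, not tested code; note that the count in [S=2] must equal the number of RewriteRules being skipped, so it has to be updated if you add more rule-sets.

```apache
# If the requested URL-path is one of the fake error pages,
# skip the next two rules entirely (the "-" means "no substitution")
RewriteCond %{REQUEST_URI} ^/(fake-error1|fake-error2)\.html$
RewriteRule .* - [S=2]
#
# Rule 1: visitors referred by badsite1 -> internal rewrite to logging page
RewriteCond %{HTTP_REFERER} badsite1\.com
RewriteRule .* /error403.html [L]
#
# Rule 2: visitors referred by badsite2 -> external redirect back to their site
RewriteCond %{HTTP_REFERER} badsite2\.com
RewriteRule .* http://www.badsite2.com/ [R,L]
```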
Jim