Forum Moderators: phranque
... about a year ago, this thread said "no":
[webmasterworld.com...]
You could first try it on a test server before deploying it onto a production system.
Regards,
R.
1) Break the file into logical sections and test them one-at-a-time.
2) Test the code on a test server, or even in a test subdirectory on a live server.
3) Use WannaBrowser to check user-agent blocking.
4) If you have a problem with some bad guys getting through, scan through the list of blocked user-agents and/or IP addresses, and make sure you have [OR] flags where you want them, and none where you don't want them.
5) Make sure you have "{", "}", "(", and ")" in the correct places; using "(" and ")" instead of "{" and "}" around server variables can lead to a silent failure. Example of the broken form:
RewriteCond %(HTTP_USER_AGENT) ^Larbin [NC]
Jim
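For comparison, here is a sketch of the correct form, with curly braces around the server variable and [OR] on every condition except the last (the user-agent names are just examples, not a recommended blocklist):

```apache
# Block several robots by User-Agent.
# Note the curly braces around HTTP_USER_AGENT and the [OR]
# flag on every RewriteCond except the final one.
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^Larbin [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebCopier [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^HTTrack [NC]
RewriteRule .* - [F]
```

Leaving [OR] off a condition in the middle of the list silently turns the rest of the list into an AND, which is exactly the kind of failure that is hard to spot by eye.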
I think my issues can be cleared-up with a few questions if you'll indulge me.
This is one section that troubles me. Is there a problem with the extra "deny from" statements below?
<Limit GET POST PUT>
order allow,deny
allow from all
deny from 24.4.253.122 24.4.253.123 209.26.224.45 towtimes 24.183.144 216.34.244 24.73.161.18 www.protow.com mail.protow.com 24.73.204.166 proxy-server.southeast.rr.com 216.160.95.136 idslppp136.sttl.uswest.net static-64-65-139-27.dsl.sea.eschelon.com transport-development-group.london.cw.net primeinc.com 208.242.194.183 80.58.52.170 64.179.23.178 66.73.48.200 63.64.41.58 66.73.48.200 207.31.249.220 12.151.162.21 63.148.99.247 209.149.201.5 Allresearch.com
# Npbot
deny from 12.148.209.192/26
# cyveillance.com
deny from 63.148.99.224/27
deny from 65.118.41.192/27
# branddimensions.com user-agent: BDFetch
deny from 204.92.59.0/24
# www.markwatch.com user-agent: markwatch
deny from 204.62.224.0/22
deny from 204.62.228.0/23
deny from 206.190.160.0/19
# rocketinfo.com
deny from 209.167.132.224/28
</Limit>
No, you can have as many Allows and Denys as you like.
You might actually want to split your first Deny into a few more Deny lines to shorten it and make it easier to read, grouping IPs and remote hosts separately.
Note that you can block both www.protow.com and mail.protow.com by using Deny from protow.com
However, this would also block xyz.protow.com, so you may not want to do that.
If *none* of these Denys work, try removing the <Limit GET POST PUT> and </Limit> and surrounding the code with <Files *> and </Files> instead.
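As a sketch of that alternative, the same directives wrapped in &lt;Files&gt; instead of &lt;Limit&gt; (the addresses shown are taken from the list above; &lt;Files *&gt; applies the restriction to every request method, not just GET, POST, and PUT):

```apache
# Same access control as above, but no longer limited to
# GET, POST, and PUT requests.
<Files *>
order allow,deny
allow from all
deny from 24.4.253.122
deny from 12.148.209.192/26
</Files>
```

One practical difference: a &lt;Limit GET POST PUT&gt; block leaves other methods (HEAD, OPTIONS, and so on) unrestricted, which is sometimes the reason a block appears not to work.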
Jim
My last questions are rather simple. Can these statements (below) go anywhere in the htaccess file? They currently reside after my "rewrite engine on" command and before my Rewrite statements.
RedirectMatch 301 /cgi-bin/ikonboard/* /cgi-bin/ib3/ikonboard.cgi$1
RedirectMatch 301 /toc.htm$ /toc2.htm$1
RedirectMatch 301 /links.htm$ /resources.htm$1
And lastly; Is the syntax of this line correct?
RewriteRule ^.*$ [google.com...] [R,L]
Thank you all for the help!
Can these statements (below) go anywhere in the htaccess file? They currently reside after my "rewrite engine on" command and before my Rewrite statements.
RedirectMatch 301 /cgi-bin/ikonboard/* /cgi-bin/ib3/ikonboard.cgi$1
RedirectMatch 301 /toc.htm$ /toc2.htm$1
RedirectMatch 301 /links.htm$ /resources.htm$1
Is the syntax of this line correct?
RewriteRule ^.*$ [google.com...] [R,L]
I'd suggest specifying either R=301 or R=302, just to make it obvious whether a permanent or a temporary redirect is being done (the default is 302 Moved Temporarily).
You can shorten the pattern to just ".*" (or even to just ".") if you like:
RewriteRule .* http://www.google.com [R=301,L]
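One more thing worth double-checking in the RedirectMatch lines quoted above: $1 expands to a parenthesized capture group in the pattern, so a pattern without parentheses leaves $1 with nothing to expand to. A sketch of the ikonboard redirect with an explicit capture group (paths taken from the question):

```apache
# Capture everything after /cgi-bin/ikonboard/ and append it
# to the new script path via $1.
RedirectMatch 301 ^/cgi-bin/ikonboard/(.*)$ /cgi-bin/ib3/ikonboard.cgi$1
```

The toc.htm and links.htm lines don't capture anything either, but since they match a single fixed page, the $1 there can simply be dropped.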
I have both of these statements in my htaccess:
Options Includes
Options All
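If both of those lines are in the same scope, my understanding is that the second simply replaces the first rather than merging with it; to add to an inherited or earlier setting you'd use the +/- syntax instead. A sketch:

```apache
# A bare Options list replaces any earlier Options setting,
# so "Options All" makes the earlier "Options Includes" moot
# (All covers Includes anyway, except MultiViews).
# To add one option on top of whatever is inherited, use +:
Options +Includes
```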
See Apache core (for Options), mod_alias, and mod_rewrite at [httpd.apache.org...] for details.
Jim