Forum Moderators: phranque


mod_rewrite exceptions to a blanket statement


summerg

11:07 pm on Dec 30, 2008 (gmt 0)

10+ Year Member



I have the following statement redirecting anything that's not a physical file to index.php, and it's working great.

RewriteCond %{REQUEST_FILENAME} -s [OR]
RewriteCond %{REQUEST_FILENAME} -l [OR]
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^.*$ - [NC,L]
RewriteRule ^.*$ index.php [NC,L]

However, I need to make about 2 dozen exceptions to this rule. For instance, my old site had the following rule:

RewriteRule ^products/(resources|documents)/([0-9]*)(/?)(.*)(/?)$ /products/detail.php?tab=resources&ProductID=$2&hbxCMP=$4 [NC,L]

that's now moved to something like:

RewriteRule ^products/(resources|documents)/([0-9]*)(/?)(.*)(/?)$ /products/$2#documents [NC,R=301]

but I have a conflict between that statement and the block above that I can't seem to resolve. Do I have to change the conditions above with "not" statements for the 20-odd exceptions I want to make? There must be a better way?

thanks

g1smd

11:42 pm on Dec 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The NOT condition will do the job.

You might not need 20 if you can use (this|that) notation, where the pipe means OR.

You don't have any redirects in that code, but you do have two rewrites. There is a difference.

Rewriting "everything" and doing a "physical file check" for every request is a big inefficiency. You should limit the rewrite so that it doesn't need to do any slow filesystem checks for CSS and JS files, robots.txt, SE tool account verification files, and so on, and only rewrites valid types of requested URL.
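A restricted version might look something like this. This is only a sketch of the idea; the extension list and the lowercase URL pattern are examples, not something from this thread, and would need adjusting to the actual site:

```apache
# Pass requests for static assets straight through,
# without any filesystem checks
RewriteRule \.(css|js|jpe?g|gif|png|ico|txt|xml)$ - [NC,L]

# Only rewrite the URL shapes the CMS actually serves
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^[a-z0-9/_-]*$ index.php [L]
```

The first rule means a request for style.css never reaches the -f and -d disk checks at all.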

g1smd

11:58 pm on Dec 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



A pattern like this is dangerous:
^products/(resources|documents)/([0-9]*)(/?)(.*)(/?)$

Using * after the [0-9] means that it could be empty. Using ? on the following slash makes it optional.

That means the pattern will match when there is a double // in the URL - making a duplicate URL.

There are other inefficiencies in that pattern, not least the .* part.

In fact, with .* and then an optional slash after it, the pattern would also match three /// slashes.
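A tighter pattern along the lines g1smd describes might be something like the following. This is a sketch, not a drop-in replacement; note the [NE] flag, which mod_rewrite needs in order to send a literal # fragment in a redirect target, since by default it escapes # to %23:

```apache
# [0-9]+ requires at least one digit, and (/[^/]+)? allows one
# optional non-empty trailing segment, so // can never match
RewriteRule ^products/(resources|documents)/([0-9]+)(/[^/]+)?/?$ /products/$2#documents [NE,R=301,L]
```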

[edited by: g1smd at 12:01 am (utc) on Dec. 31, 2008]

jdMorgan

11:59 pm on Dec 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Put the most-specific rules first, and end them with [L].

In this way, later general rules won't execute until after the specific rules have already rewritten the URL to files that exist.

You need to get rid of this rule, or make it more-specific:

RewriteRule ^.*$ index.php [NC,L]

For one thing, it will rewrite requests for index.php to index.php, in an 'infinite' rewriting loop.

Add additional exclusions as necessary using RewriteCond and the NOT "!" operator, as g1smd stated above.
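Put together, the ordering jdMorgan describes might look something like this. It is only an illustration of the structure; the /products/ exclusion is a hypothetical example of one of the 20-odd exceptions:

```apache
# Most-specific rules first: one redirect per old URL pattern,
# each ending in [L] so processing stops when it matches
RewriteRule ^products/(resources|documents)/([0-9]+) /products/$2 [R=301,L]

# General catch-all last, with NOT conditions for the exceptions,
# and a negated pattern so index.php can never rewrite to itself
RewriteCond %{REQUEST_URI} !^/products/
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule !^index\.php$ index.php [L]
```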

Jim