Forum Moderators: phranque


mod_rewrite: redundant rewriting


gergoe

5:41 pm on May 4, 2004 (gmt 0)

10+ Year Member


Does anyone have any idea how I can use an already-rewritten URL in a RewriteCond? I've tried the following, but the RewriteCond doesn't work because it matches against the original request URI, not the rewritten one.

>>>>>>>>>>
RewriteRule ^/vvg/?$ /some_dir/vvg.html
.
.
.

#none of these dirs is being requested:
RewriteCond %{REQUEST_URI} !^/(IMG|SCRIPTS|XML)/

#no file with the following extensions is requested:
RewriteCond %{REQUEST_URI} !\.(ihtml|css|pdf|swf|fla)$

#the requested resource is not in the root dir
RewriteCond %{REQUEST_URI} ^/.*/.*

#then pass the request through a server-side script
RewriteRule ^/(.*)$ /proc.pl?page=$1
<<<<<<<<<<

If I put the "/vvg" URI through this, it is rewritten to "/some_dir/vvg.html" nicely, but the last rule is then not applied, because the conditions match against the original URI. If I put "/vvg/" through instead, it works, because that original URI is not in the root.

So the question is really: which variable can I use to get the URI as rewritten by the previous rules? I've tried the N and PT flags, but neither did the trick...

Thanks

jdMorgan

8:33 pm on May 4, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You could try %{SCRIPT_FILENAME} and see if that works.

Otherwise, take a look at the [C] (chain) flag - you can use that to make rule #2 require rule #1 to have been invoked.
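A minimal sketch of the chain flag, reusing the paths from your example (target paths are illustrative): if the first rule fails to match, the chained rule after it is skipped; if it matches, the second rule operates on the rewritten URL.

```apache
# [C] chains this rule to the next one: rule 2 only runs if rule 1 matched.
RewriteRule ^/vvg/?$ /some_dir/vvg.html [C]
# This now sees the rewritten URL "/some_dir/vvg.html", not "/vvg".
RewriteRule ^/some_dir/(.*)$ /proc.pl?page=$1
```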

Another method is to combine the functions of both rules, so that no info has to be passed from one to the next.

Yet another method is to create your own environment variable in the first RewriteRule, using the [E=varname:value] flag, and test it using
RewriteCond %{ENV:varname} ^value$
in the second ruleset.
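Putting that together, a sketch (the variable name "rewritten" is just illustrative):

```apache
# Rule 1 rewrites and plants a marker variable via the E flag.
RewriteRule ^/vvg/?$ /some_dir/vvg.html [E=rewritten:1]

# Rule 2 fires only if the marker was set earlier in this pass.
RewriteCond %{ENV:rewritten} ^1$
RewriteRule ^/(.*)$ /proc.pl?page=$1
```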

Jim

gergoe

9:28 pm on May 4, 2004 (gmt 0)

10+ Year Member


The 2nd and 3rd were not options for me, but the others were perfect, especially the first one, which was exactly what I was looking for (I was just too afraid to try it without knowing it was what I needed, since this is a production website and server).

Thanks

jdMorgan

9:46 pm on May 4, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



A quick solution for a production webserver is to create a "test" subdomain, point that to a subdirectory on the server, lock it down by putting a robots.txt file in that subdirectory, and then put a redirect in place to steer requests for test.example.com to that subdirectory. You can then test code in there without too much danger of interfering with commerce.

You have to test new mod_rewrite code in .htaccess rather than httpd.conf, but for basic idea development, it works. Cheap and dirty, too.
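A hedged sketch of that setup in .htaccess (the hostname test.example.com and the /testarea directory are made-up names):

```apache
# Steer requests for the test hostname into a locked-down subdirectory.
RewriteCond %{HTTP_HOST} ^test\.example\.com$ [NC]
RewriteCond %{REQUEST_URI} !^/testarea/
RewriteRule ^/?(.*)$ /testarea/$1 [L]
```

With this in place, a robots.txt containing "Disallow: /" dropped into /testarea/ will be served for test.example.com/robots.txt, keeping crawlers out of the test area.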

Jim

gergoe

9:53 pm on May 4, 2004 (gmt 0)

10+ Year Member


I don't find mod_rewrite so complicated that I'd be afraid of changing it on the fly, and actually all the changes I've made are just minor ones. Anyway, none of the visitors knows my phone number, so what's the harm? ;-)

Don't you think that enabling .htaccess files decreases performance? Or is that only true for heavily used servers?

jdMorgan

12:13 am on May 5, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I like the phone number idea... :)

On heavily-accessed servers, mod_rewrite can and will slow things down if you are not careful to limit the scope of your RewriteRules. For example, don't check all files for hotlinking; just check images, scripts, and multimedia filetypes. Better yet, put all of these filetypes in a subdirectory, and put the access control rules in .htaccess in that subdirectory. That way, the .htaccess code is not even run if it doesn't apply to the request.

The same effect can be achieved in httpd.conf by enclosing the code in a <Directory> container.
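For example, a hotlink check scoped to a single media directory might look like this (the path and domain are illustrative):

```apache
<Directory "/var/www/html/media">
    RewriteEngine On
    # Forbid requests whose referer is set but is not our own site.
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
    RewriteRule \.(gif|jpe?g|png|swf)$ - [F]
</Directory>
```

Requests outside that directory never touch these rules at all, which is the performance win.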

To get some perspective on .htaccess performance considerations, bear in mind that the average .htaccess file is under 20 kilobytes in size, and that it is interpreted by server-native code. On the other hand, there are many sites out there running huge Perl and PHP scripts -- they're much larger than .htaccess files, and require non-server-native interpreters in order to function.

Jim