Forum Moderators: phranque

internal redirects ride again


lucy24

4:08 am on Dec 24, 2011 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Took me a few months, but I figured out what is happening and why it's happening and even how to prevent it from happening. And then, after working it all out on my own, I did a Forums search and found the identical explanation. Oops. I even found a post from the one and only jdMorgan showing a better way to prevent it. But I've got a couple of residual questions.

Situation: Error Logs periodically say
Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace.

It only happens with requests that are blocked in mod_rewrite for user-agent or referer or whatnot, not with core-level Deny from directives.

Changing the LogLevel is out, since I'm on shared hosting and that's a config-file setting. And the requests are supposed to get a 403, so they do end up in the right place. They just take a nanosecond longer and make a little more work for the server.
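For reference, `LimitInternalRecursion` is only valid in the main server config or a virtual-host block, which is exactly why it is out of reach on shared hosting. A sketch of what the host would have to put in httpd.conf (the value 20 is just an illustration; the default ceiling is 10):

```apache
# Server config or <VirtualHost> context only -- not allowed in .htaccess.
# Raises the ceiling on internal redirects and nested subrequests
# from the default of 10.
LimitInternalRecursion 20
```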

So I'm left with testing on myself like an old-fashioned medical researcher. What I see is
Forbidden

You don't have permission to access {pagename} on this server.

Additionally, a 500 Internal Server Error error was encountered while trying to use an ErrorDocument to handle the request.


My first thought was some glitch in mod_rewrite so the [F] flag didn't carry its implied [L]. But changing them all to [F,L] didn't make any difference.

:: pause here for inspiration to strike ::

A 403 is not only a slam-the-door-in-your-face. It's also a special kind of rewrite, because the user gets sent to the 403 page, which happens to be called "forbidden.html" (host's default name, requiring no ErrorDocument directive). Along the way to pick up their 403 page, they run into the rule that sent them there in the first place:

RewriteRule (\.html|/)$ - [F]

Fresh request for html file, fresh door-slamming, fresh request for "forbidden.html" file, fresh door-slamming... until Apache catches on and says Enough Is Enough.

D'oh!

I've got a cluster of <Files> exemptions, including robots.txt and forbidden.html-- but they don't do any good here, because the request never makes it as far as the core.
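For anyone following along, the <Files> exemptions described here would look something like this (Apache 2.2-era syntax; the file names follow the post, everything else is illustrative). The point is that these live at the core authorization stage, which a mod_rewrite [F] never reaches:

```apache
# Core-level exemption: lets these files through Deny-based blocking.
# It does NOT help against a RewriteRule [F], which fires earlier,
# before the request ever reaches this stage.
<FilesMatch "^(robots\.txt|forbidden\.html)$">
    Order Allow,Deny
    Allow from all
</FilesMatch>
```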

My original fix was to add another Condition to each of the affected Rules:

RewriteCond %{REQUEST_FILENAME} !forbidden
(or)
RewriteCond %{REQUEST_URI} !forbidden

Either one works. The much better fix, courtesy jdMorgan, is to put a conditionless Rule at the very beginning of your Rewrites:

RewriteRule forbidden\.html - [L]

(Modified to fit my own naming, duh.)
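Putting the two pieces together, the whole arrangement is roughly as follows (using the .html-blocking rule from earlier in the post; in a real ruleset that rule would carry its user-agent or referer conditions):

```apache
RewriteEngine On

# Pass-through for the error document itself: [L] stops rewrite
# processing, so the blocking rule below never re-matches it during
# the ErrorDocument subrequest.
RewriteRule forbidden\.html - [L]

# Blocking rule. Without the exemption above, the internal redirect
# to forbidden.html would match here too, looping until Apache's
# internal-recursion limit (default 10) is hit and a 500 is returned.
RewriteRule (\.html|/)$ - [F]
```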
______

Residual questions: How come this doesn't happen all the time, everywhere? I would expect every single question about using mod_rewrite for lockouts to be followed with a plaintive post saying "Now I'm getting a bunch of internal-redirect errors".

And, odder still: These new improved redirectless 403s don't show up in the Error Log at all. Not even as the usual "client denied by server configuration". They're in the Access Log as 403. But what's keeping them out of the error log?

g1smd

12:04 pm on Dec 24, 2011 (gmt 0)




They are not in the error log because the 403 status is now successfully served. Previously you had a 500 error when this was attempted.

There is one major flaw in doing the
RewriteRule foo - [L]
thing. For THOSE requests, NONE of your other (later) rules get to run, so any and all exploits that later rules are supposed to block don't get blocked for those requests.

For this reason, I prefer adding the RewriteCond on the single rule that needs to avoid matching for this request.

lucy24

5:58 pm on Dec 26, 2011 (gmt 0)




They are not in the error log because the 403 status is now successfully served. Previously you had a 500 error when this was attempted.

My Error Logs show all 403s. (I counted. Or rather, asked the text editor to count.) Maybe yours are set for a lower detail level? I share a server with gazillions of other sites-- I think they group us based on size-- so the logs have to be set at a level that will be most useful to the most people.

There is one major flaw in doing the RewriteRule foo - [L] thing. For THOSE requests, NONE of your other (later) rules get to run, so any and all exploits that later rules are supposed to block don't get blocked for those requests.

In the spirit of the Apache forums I have tried to understand this, but no luck :(

There are only two ways to get at a custom 403 page, right? Either as a consequence of a 403-- which means the would-be visitor has already been stopped from whatever it is they hoped to do-- or by asking for it explicitly. "I've heard so much about your pretty 403 page, can I please come in and look at it?" Hmm. Got a vague idea Moscow was once captured in the middle ages by someone using a similar line. But I can't put in a condition looking at THE_REQUEST because then we're potentially right back where we started with the infinite loop.

For now I do have it in separate-condition form. That is, any Rule involving .html and [F] has an exception for !forbidden. And the same for !goaway, since the logs suddenly decided to make a fuss about that too.
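In that form, each blocking rule carries its own exemptions, roughly like this (the user-agent/referer conditions that make the rule a lockout in the first place are omitted here):

```apache
# Exempt both error documents from this particular blocking rule,
# while leaving every later rule free to run against the request.
RewriteCond %{REQUEST_URI} !forbidden
RewriteCond %{REQUEST_URI} !goaway
RewriteRule (\.html|/)$ - [F]
```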

I still haven't figured out why the same thing doesn't happen with image requests from hotlinkers-- but I know it doesn't, because I have looked at a few random pages. They ask for an image, they get an image. Just not the one they asked for.
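One plausible explanation (an assumption, not something established in this thread): a typical hotlink rule rewrites to a substitute image and exempts that substitute from its own conditions, so the internal rewrite never re-matches. A sketch, with example.com and hotlink.png as placeholder names:

```apache
# Serve a substitute image to foreign referers. The third condition
# exempts the substitute itself, so the internal rewrite to it does
# not match this rule a second time -- no loop, unlike the 403 case.
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
RewriteCond %{REQUEST_URI} !/hotlink\.png$
RewriteRule \.(gif|jpe?g|png)$ /hotlink.png [L]
```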

g1smd

9:38 pm on Dec 26, 2011 (gmt 0)




Let's say you have a rule which blocks a nasty exploit that uses the parameter
?nasty=messupyourserver


This code will block the request:

RewriteCond %{QUERY_STRING} (^|&)nasty=messupyourserver(&|$)
RewriteRule .* - [F]


If you precede that rule with:

RewriteRule ^specialpath - [L]


Now, a request for
www.example.com/specialpath?nasty=messupyourserver
will be able to exploit your server at will.
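A sketch of the safer arrangement described above: no blanket pass-through rule, so the exploit blocker still runs for every request, and only the one rule that would otherwise loop carries an exemption:

```apache
# Exploit blocker stays unconditional: it sees every request,
# including /specialpath?nasty=messupyourserver.
RewriteCond %{QUERY_STRING} (^|&)nasty=messupyourserver(&|$)
RewriteRule .* - [F]

# Loop avoidance is scoped to the single rule that would match the
# error document, instead of exempting a whole path from all rules.
RewriteCond %{REQUEST_URI} !forbidden
RewriteRule (\.html|/)$ - [F]
```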