
apache 2 issues: Request exceeded the limit of 10 . etc.


whoo

7:37 pm on Apr 4, 2008 (gmt 0)

10+ Year Member



I have two things going on that are stumping me:

The first is this:

I have a set of very simple deny from blah blah rules.

If I test any of those rules by going to the front page of my site, I get the dreaded:

Forbidden
You don't have permission to access / on this server.
Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.

message. A loop, I presume, since I'm also forbidden from looking at the 403 page.

The thing is, if I change that URL to anything else, the custom 403 page shows up as it should.

--

Actually, I can see now what the problem is; I just don't know how to fix it. If I tack index.php onto my front-page URL, the normal 403 shows up. If I don't call a file name specifically, it does not.

I have this in my .htaccess:

<Files 403.shtml>
order allow,deny
allow from all
</Files>
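
For context, the rest of the file is along these lines -- the specific Deny values here are placeholders, not my real ones:

ErrorDocument 403 /403.shtml

# Very simple host/IP bans; actual values replaced with placeholders
Deny from 192.0.2.0/24
Deny from example-badhost.net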

Any suggestions on how I might fix this?

The other thing I am seeing is the "Request exceeded the limit of 10 internal redirects due to probable configuration error. Use 'LimitInternalRecursion' to increase the limit if necessary. Use 'LogLevel debug' to get a backtrace." message.

I've read everything on here related to it and am still confused. It occurs in situations similar to the above: it's a banned visitor who isn't requesting an actual file name.
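
For reference, both directives named in that message go in the main server configuration (httpd.conf or a <VirtualHost> block), not in .htaccess -- something like:

# Raise the internal-redirect limit (it defaults to 10) and turn on debug logging.
# Raising the limit usually just postpones the symptom, so the debug log is the
# more useful part here.
LimitInternalRecursion 20
LogLevel debug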

Any and all help with this is greatly appreciated.

whoo

9:06 pm on Apr 4, 2008 (gmt 0)

10+ Year Member



I can add some more info to this. I turned on debugging, and now see this:

[Fri Apr 04 16:05:28 2008] [debug] core.c(3042): [client xx.xx.xx.36] redirected from r->uri = /403.php

That happens nine times, and then I see:

[Fri Apr 04 16:05:28 2008] [debug] core.c(3042): [client xx.xx.xx.36] redirected from r->uri = /

when I am emulating one of the banned visitors.

My 403.php is not doing any redirecting.

whoo

9:11 pm on Apr 4, 2008 (gmt 0)

10+ Year Member



Actually, I don't even use a 403.php... hmm, OK, well that tells me something. :)
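
In other words, the ErrorDocument line was apparently pointing at a file that doesn't exist on the server; something along these lines (the "before" line is a guess at what was there):

# Before (assumed) -- points at a file that isn't on the server:
ErrorDocument 403 /403.php
# After -- point it at the error page that actually exists:
ErrorDocument 403 /403.shtml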

whoo

9:16 pm on Apr 4, 2008 (gmt 0)

10+ Year Member



OK, so I fixed that, so the right file is actually being called.

Now I see this:

[Fri Apr 04 16:14:17 2008] [debug] core.c(3042): [client 0] redirected from r->uri = /403.shtml
[Fri Apr 04 16:14:17 2008] [debug] core.c(3042): [client 0] redirected from r->uri = /403.shtml
[Fri Apr 04 16:14:17 2008] [debug] core.c(3042): [client 0] redirected from r->uri = /403.shtml
[Fri Apr 04 16:14:17 2008] [debug] core.c(3042): [client 0] redirected from r->uri = /403.shtml
[Fri Apr 04 16:14:17 2008] [debug] core.c(3042): [client 0] redirected from r->uri = /403.shtml
[Fri Apr 04 16:14:17 2008] [debug] core.c(3042): [client 0] redirected from r->uri = /403.shtml
[Fri Apr 04 16:14:17 2008] [debug] core.c(3042): [client 0] redirected from r->uri = /403.shtml
[Fri Apr 04 16:14:17 2008] [debug] core.c(3042): [client 0] redirected from r->uri = /403.shtml
[Fri Apr 04 16:14:17 2008] [debug] core.c(3042): [client 0] redirected from r->uri = /403.shtml
[Fri Apr 04 16:14:17 2008] [debug] core.c(3042): [client 0] redirected from r->uri = /

The only edit was removing my own IP.

whoo

9:24 pm on Apr 4, 2008 (gmt 0)

10+ Year Member



So, OK, rethinking this: I understand the problem, that I'm banned from seeing the file, but doesn't the <Files> directive above solve that problem?

It worked on Apache 1.3.x :(

jdMorgan

10:26 pm on Apr 4, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The server is looping because access to 403.shtml is denied, as you specified.

The likely problem is that you've got another "Order" directive later that's conflicting with this one.

Here's another way to do it, using only one Allow/Deny block for all URLs/files:


SetEnvIf Request_URI "^/(403\.shtml|robots\.txt)$" allowsome
#
Order Deny,Allow
Deny from bad-bots
Allow from env=allowsome

Similarly, if you use mod_rewrite to generate Forbidden responses, you'll need to add a rule at or near the top of your rewrite rules:


RewriteRule ^(403\.shtml|robots\.txt)$ - [L]

This will bypass all following rules if the request is for either of those files.
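
Put together with your error-document setup, the whole thing might look roughly like this -- the ErrorDocument line and the specific Deny value are just examples, adjust to your own:

ErrorDocument 403 /403.shtml

# Requests for the error page and robots.txt are always allowed through
SetEnvIf Request_URI "^/(403\.shtml|robots\.txt)$" allowsome
Order Deny,Allow
Deny from comcast.net
Allow from env=allowsome

RewriteEngine On
# Exclusion rule first, so any [F] rules below can't block the 403 page itself
RewriteRule ^(403\.shtml|robots\.txt)$ - [L]
# ... your access-control RewriteRules go here ...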

Note: this forum sometimes converts solid pipe characters into broken pipes ("¦"), so make sure any patterns you copy use solid pipes before putting them into use.

Jim

whoo

11:02 pm on Apr 4, 2008 (gmt 0)

10+ Year Member



Hmm, OK, same error. This is very frustrating.

whoo

11:06 pm on Apr 4, 2008 (gmt 0)

10+ Year Member



... and it doesn't seem to matter whether I specify a filename. index.php behaves the same as not including it.

Ten errors similar to the above, and a standard 403 that also tells me there was a 500 error. I am **almost** ready to ditch Apache 2 and go back to 1.3. One shouldn't have to rewrite a previously working .htaccess. I thought moving from PHP 4 to 5 would be the real troublemaker. Surprise, surprise.

whoo

12:21 am on Apr 5, 2008 (gmt 0)

10+ Year Member



OK, I've decided to approach this differently. On another site, using the same httpd.conf, I have this .htaccess:

ErrorDocument 404 /404.php
ErrorDocument 403 /403.shtml
Deny from comcast.net
<Files 403.shtml>
order allow,deny
allow from all
</Files>

I am shown the custom 403.shtml, as I should be. Now I'll keep adding "stuff" until I see what breaks it.

whoo

1:05 am on Apr 5, 2008 (gmt 0)

10+ Year Member



OK :) I managed to solve this, I believe, using another one of jd's replies:

[webmasterworld.com...]

Changing all occurrences of:

RewriteRule ^.*$ - [F]

to:

RewriteRule !^403\.shtml$ - [F]

has seemingly fixed this.
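
So a typical blocking rule now reads something like this (the RewriteCond is just an illustrative example, not my actual pattern):

RewriteEngine On
# Block the matching visitor everywhere except the 403 page itself
RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC]
RewriteRule !^403\.shtml$ - [F]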

God only knows what this forum would be like without his wealth of knowledge. Thanks Jim, again :)

jdMorgan

2:31 pm on Apr 5, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The rule I posted in my previous reply, when placed at the top of your rewrite rules, should have done the same thing with a lot less work. I'd suggest re-testing it, so that you don't have to remember to add the exclusion to every one of your access-control rules in the future, and so that you don't block robots.txt as well. If you block robots.txt, some primitive robots will take that as carte blanche to spider your entire site, which can lead to a lot of wasted bandwidth.

Be sure to completely flush your browser cache after changing any server-side code, and note the warning about the pipe characters in my previous post.

Jim