Forum Moderators: phranque


Blocking CONNECT attempts with 403

Tried recommendations but not seeing a 403


billegal

12:16 pm on Dec 9, 2004 (gmt 0)

10+ Year Member



I've looked here:
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]

There was some suggested code:

RewriteCond %{THE_REQUEST} ^(GET|HEAD|POST)\ /?http:// [NC]
RewriteCond %{THE_REQUEST} !^(GET|HEAD|POST)\ /?http://(www\.)?mydomain\.com/ [NC]
RewriteCond %{THE_REQUEST} !^(GET|HEAD|POST)\ /?http://192\.168\.0\.1/ [NC]
RewriteRule .* - [F]

I've tried this without the third line and it works for http:// type requests.

The logs still show some 200s for, e.g.:
CONNECT 11.11.11.11:25 HTTP/1.0

I confirmed this by telnet.
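
A raw-socket probe equivalent to that telnet test can be sketched in Python (HOST is a placeholder for your own server; this is a sketch, not part of the original post):

```python
import socket

# Placeholder host; substitute your own server.
HOST, PORT = "www.mydomain.com", 80

def build_probe(target="11.11.11.11:25"):
    # The same raw request line typed into telnet.
    return ("CONNECT %s HTTP/1.0\r\n\r\n" % target).encode("ascii")

def status_code(status_line):
    # "HTTP/1.1 403 Forbidden" -> 403
    return int(status_line.split()[1])

def probe(host=HOST, port=PORT):
    # Open a TCP connection, send the raw CONNECT line, read the status.
    with socket.create_connection((host, port), timeout=10) as s:
        s.sendall(build_probe())
        reply = s.recv(1024).decode("latin-1")
        return status_code(reply.split("\r\n")[0])
```

Once the block works, probe() should return 403 rather than 200.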

In this instance, I still see the first few lines of my index.htm page. I'm not sure why the whole page is not returned. I suppose I could try a RewriteRule for an IP address, but it seems I should be able to Limit the use of CONNECT and avoid the use of Rewrite altogether.

I've tried:
<Files *>
<LimitExcept POST GET>
deny from all
</LimitExcept>
</Files>

This returns the Apache test page.

What should I look at next to ensure a 403 response for any CONNECT attempts?

I also have ProxyRequests Off.

Thanks,
Bill

jdMorgan

4:44 pm on Dec 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Bill,

This code was not intended to prevent CONNECT attempts. The first RewriteCond allows the rule to be invoked only for GET, HEAD, or POST requests for a URL-path that includes http://
The next two RewriteConds further require that the domain or IP address in the requested URL does not match your own domain or IP address.
These requests would appear in your access log file as
GET /http://www.yahoo.com
for example. These are attempts to use your server as a proxy, and the probes often include attempted connections to Yahoo, Intel, and other reputable sites.

Note that these attempts will stand out in your log files, since most requests will be in the form
GET /some_page_on_your_site.html, and won't include [<domain>...]

However, the HTTP protocol allows requests of that form (called absolute URIs) and it is possible someone might send your server a request like
GET /http://www.yourdomain.com/some_page_on_your_site.html
So, the last two RewriteConds specifically allow such requests, but only if the domain or IP in the request belongs to your site.
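
For anyone who wants to trace that logic outside Apache, here is a minimal Python sketch of the three conditions (mydomain.com and 192.168.0.1 are the placeholders from the posted code; [NC] becomes re.IGNORECASE):

```python
import re

# The three RewriteConds, translated to plain regexes.
PROXY_ATTEMPT = re.compile(r"^(GET|HEAD|POST) /?http://", re.I)
OWN_DOMAIN = re.compile(r"^(GET|HEAD|POST) /?http://(www\.)?mydomain\.com/", re.I)
OWN_IP = re.compile(r"^(GET|HEAD|POST) /?http://192\.168\.0\.1/", re.I)

def is_proxy_probe(the_request):
    """True if the RewriteRule would answer this request with 403."""
    return (PROXY_ATTEMPT.match(the_request) is not None
            and OWN_DOMAIN.match(the_request) is None
            and OWN_IP.match(the_request) is None)

# is_proxy_probe("GET /http://www.yahoo.com HTTP/1.0")            -> True (blocked)
# is_proxy_probe("GET /http://www.mydomain.com/x.html HTTP/1.1")  -> False (allowed)
# is_proxy_probe("CONNECT 11.11.11.11:25 HTTP/1.0")               -> False (not caught)
```

The last example shows why a CONNECT probe sails past these rules: the first condition only matches GET, HEAD, or POST.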

So the code should have a comment line describing it:


# Block attempts to use our server as a proxy, but allow absolute URIs specifying our site
RewriteCond %{THE_REQUEST} ^(GET|HEAD|POST)\ /?http:// [NC]
RewriteCond %{THE_REQUEST} !^(GET|HEAD|POST)\ /?http://(www\.)?mydomain\.com/ [NC]
RewriteCond %{THE_REQUEST} !^(GET|HEAD|POST)\ /?http://192\.168\.0\.1/ [NC]
RewriteRule .* - [F]

Now, to address your question: An effective way to prevent CONNECT attempts is to use something like:


# BLOCK unsupported HTTP methods
RewriteCond %{REQUEST_METHOD} !^(GET|HEAD|POST|OPTIONS|PROPFIND|TRACE)$
RewriteRule .* - [F]

This allows all useful and harmless request types, and blocks the others.
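
Outside of Apache, the whitelist logic can be sanity-checked with an ordinary regex; a minimal Python sketch, with the method list copied from the RewriteCond above:

```python
import re

# The whitelist from the RewriteCond; anything else (CONNECT, PUT,
# DELETE, TRACK, ...) fails the test and draws the [F] 403 response.
ALLOWED = re.compile(r"^(GET|HEAD|POST|OPTIONS|PROPFIND|TRACE)$")

def is_blocked(method):
    """True if the RewriteRule would answer this method with 403."""
    return ALLOWED.match(method) is None

# is_blocked("GET")     -> False (allowed)
# is_blocked("CONNECT") -> True  (blocked)
```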

If any pipe characters above display as broken pipes ("¦"), change them to solid pipes ("|") before use.

Jim

billegal

6:08 pm on Dec 9, 2004 (gmt 0)

10+ Year Member



Thanks for the information. Since I have ProxyRequests Off, I will use only the latest code with the REQUEST_METHOD.

One followup question I have is: Why does the LimitExcept not work as expected? Is there something I should check?

FOLLOW UP:
The REQUEST_METHOD code also returns the Apache test page instead of a 403. I put the code in the default server and in a couple of virtual servers to test it. Any thoughts on what could be going on?

jdMorgan

5:13 pm on Dec 10, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> Since I have ProxyRequests Off, I will use only the latest code with the REQUEST_METHOD.

Be careful... Others have reported that this does not protect you. Test and be sure.

Not sure why you're getting the Apache default page. Try testing both 404 and 403 responses. If the problem is common to both, it may be a config issue. If not, then check any ErrorDocument directives you may have, and be sure they are correct; see the ErrorDocument documentation for important warnings about the form of the custom page URL.

Also, if you do use custom error documents, you must exclude them from being redirected or rewritten under any circumstances. The easiest way to do that is to precede all of your access-control code with a "bypass" RewriteRule like:


RewriteRule ^(custom403page|custom404page)\.html$ - [L]

This basically says, "if either of my custom error pages is requested, then stop processing RewriteRules."
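
The ordering is the important part; here is a minimal Python sketch of the flow, using the hypothetical page names from the rule above:

```python
import re

# The bypass rule must run before the access-control rules, or the
# custom 403 page would itself be blocked when Apache tries to serve it.
BYPASS = re.compile(r"^(custom403page|custom404page)\.html$")

def process(url_path, access_denied):
    if BYPASS.match(url_path):      # the [L] rule: stop processing here
        return "serve"
    if access_denied:               # later access-control rules fire
        return "403"
    return "serve"

# process("custom403page.html", access_denied=True) -> "serve"
# process("somepage.html", access_denied=True)      -> "403"
```

The error page escapes the block even for a request that would otherwise be forbidden, which is exactly what lets Apache deliver it as the 403 body.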

I'm not sure why your LimitExcept is not working. I recall trying it in the past, and having some problems, too.

Jim

billegal

6:57 pm on Dec 10, 2004 (gmt 0)

10+ Year Member



I will test and try to be sure. Basically, I want no 200s for strange entries in the access log. When I have made some progress, I will report back.

Bill