
Forum Moderators: ocean10000


UrlScan 3.0 Beta Release

Prevent SQL Injection Attacks

3:03 am on Jun 30, 2008 (gmt 0)


WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month

joined:Jan 14, 2004
votes: 3

For those who have not been paying attention over the last couple of months: there have been automated attacks on many servers that exploit bad data handling in web applications to insert data into, or damage, the production databases those applications have access to. These attacks are often used to stage further attacks later on. Microsoft took a great deal of heat because a large number of the affected websites were running the Microsoft web application stack (IIS/ASP/.NET/SQL Server), even though the vulnerable applications were not written by Microsoft. (Recent related threads: "Under Attack! [webmasterworld.com]", "No New IIS Or Microsoft SQL Server Vulnerabilities [webmasterworld.com]".)

Microsoft released UrlScan 3.0 [learn.iis.net] in public beta this week. It is an updated tool that filters out harmful request data before the web application itself can process it. The main focus of this release is to combat the automated injection attacks on websites by blocking malicious requests before the application can act on them. Web hosts can use it to protect hosted sites from these attacks and give their clients time to update their web applications.

New Features

* Deny rules can now be independently applied to query string, all headers, a particular header, URL or a combination of these.
* A global DenyQueryString section in the configuration lets you add deny rules for query strings, with the option of checking the un-escaped version of the query string as well.
* Escape sequences (like %0A%0D) can now be used in deny rules, so it is possible to deny CRLF and other sequences involving non-printable characters.
* Multiple UrlScan instances can now be installed as site filters, each with its own configuration and rules (urlscan.ini).
* Configuration (urlscan.ini) change notifications will be propagated to IIS worker processes so you won’t have to recycle your worker processes after making a configuration change. Logging settings are the only exception to this.
* Enhanced logging to give descriptive configuration errors.
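To illustrate how the features above fit together, here is a sketch of a urlscan.ini fragment based on the published UrlScan 3.0 rule syntax (a custom rule registered in [RuleList] with its own deny-strings section). The specific strings and extensions are illustrative only, not a recommended ruleset - tune them for your own applications before use:

```ini
[Options]
; New in 3.0: also check the un-escaped form of the query string
UnescapeQueryString=1

[RuleList]
SQL Injection

[SQL Injection]
; Custom rule: apply only to ASP/ASPX requests, scan the query string
AppliesTo=.asp,.aspx
DenyDataSection=SQL Injection Strings
ScanUrl=0
ScanAllRaw=0
ScanQueryString=1

[SQL Injection Strings]
--
%3b             ; escaped semicolon
%0A%0D          ; CRLF, expressed as escape sequences
xp_
declare
cast
exec
```

Note that broad keywords like "cast" or "exec" can block legitimate query strings, which is one reason such rules are a stopgap while the application itself is fixed.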

Common UrlScan Scenarios
[learn.iis.net] covers example configurations that can be used to stop attacks on web applications from succeeding.

[edited by: Ocean10000 at 3:04 am (utc) on June 30, 2008]

5:30 pm on July 1, 2008 (gmt 0)

Senior Member

joined:Nov 12, 2005
votes: 0

The real lesson here is to develop your apps correctly. A tool like this encourages sloppy coding techniques and doesn't solve the real problem at hand. Education is what is needed :)
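To make the "develop your apps correctly" point concrete, the standard fix for SQL injection is parameterized queries, where user input is passed as data and can never change the structure of the SQL statement. A minimal sketch, using Python's sqlite3 module for brevity (the same idea applies to ADO parameterized commands in classic ASP, or SqlParameter in ASP.NET):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

def find_user(name):
    # The ? placeholder makes the driver treat `name` strictly as a value,
    # so an injected payload like "' OR '1'='1" stays a literal string
    # instead of rewriting the WHERE clause.
    cur = conn.execute("SELECT id, name FROM users WHERE name = ?", (name,))
    return cur.fetchall()

print(find_user("alice"))        # matches the real row
print(find_user("' OR '1'='1"))  # matches nothing: the payload is inert
```

Compare that with string concatenation ("... WHERE name = '" + name + "'"), where the same payload would return every row.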

Other than that, it's pretty neat, although I feel like it would just create problems for me if I were to use it. :-P

12:58 am on July 2, 2008 (gmt 0)


WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month

joined:Jan 14, 2004
votes: 3

I agree with you about developing the applications correctly, with this used as an added protection layer on top of that.

But in the rest of the world there are too many bad applications that never get updated; this will at least help protect them from the current crop of attacks.

12:30 pm on July 6, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:June 13, 2002
votes: 0

I would think it has some performance overhead, checking every URL, logging, etc.

If I needed this functionality, I'd put it into a non-Windows solid-state device just inside the firewall, with a bunch of other protections at this layer.

What about cache and cookie poisoning, or form field manipulation? If you are going to do IIS app-layer protection, remember that the attackers you are protecting against know *all* the tricks.

Also, this doesn't look like it has built-in best-practice templates, so misconfiguration is going to be an issue for non-security-savvy users.

8:33 pm on Oct 16, 2008 (gmt 0)

New User

10+ Year Member

joined:Sept 17, 2008
posts: 11
votes: 0


After doing some research on IIS security, I am thinking of using UrlScan 3.0 to filter out the flood of HTTP requests from bots (with bogus URLs) that overloads the server.

Is this a good way to stop this? Or would I be better off using some other type of software/tool? I am really looking to block malicious requests that end up taking a lot of resources and ultimately bog down the server. However, I am new to security and would like an easy-to-use tool.