Forum Moderators: phranque


My Site is getting DoS'd for days now


l008comm

9:36 am on Nov 5, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



So as I type this, my server is under attack. It's been getting attacked for days, and it keeps getting worse. Today alone, my Apache access log is 1 GB. I loaded every IP into a database so I could get a clearer picture of what's going on. The number of IPs attacking me as of right now is over 100,000. Even with user-agent blocking, it's still enough to take the whole server down. We're talking 400-600 page requests per second. As a temporary fix I've made the domain name point to 127.0.0.1 to salvage the rest of the sites on my server. But this just keeps getting worse and I'm out of ideas. New hosts are joining in way faster than I can type in firewall rules, and I don't know how to write firewall rules that block hundreds of thousands of IPs. The user agents are all these:

Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.8.1.8) Gecko/20071008 Firefox/2.0.0.8
Java/1.6.0_01
Java/1.6.0_02
Java/1.6.0_03

The "attack" is page loading. They are requesting the page every second. I had a system to block out people who load my page more than 20 times in a half hour, but even that system can't handle this much traffic.
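
The threshold system described above (block anyone who loads the page more than 20 times in a half hour) is essentially a per-IP sliding-window rate limiter. A minimal sketch in Python -- the class name and limits here are illustrative, not the poster's actual system:

```python
from collections import defaultdict, deque


class SlidingWindowLimiter:
    """Track request timestamps per IP; flag IPs that exceed the limit."""

    def __init__(self, limit=20, window_seconds=1800):
        self.limit = limit
        self.window = window_seconds
        self.hits = defaultdict(deque)  # ip -> deque of hit timestamps

    def allow(self, ip, now):
        """Record a hit at time `now`; return False once the IP is over the limit."""
        q = self.hits[ip]
        # Drop hits that have aged out of the window
        while q and now - q[0] > self.window:
            q.popleft()
        q.append(now)
        return len(q) <= self.limit
```

Under an attack with 100,000+ source IPs, this kind of per-IP counting is exactly what breaks down: each individual client can stay under the threshold while the aggregate traffic still saturates the server.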

So I'm thinking there are 3 possibilities as to what this actually is.
1) Some new virus that is targeting my site? (I'm thinking probably not.)
2) Some web site that causes the visitor to unknowingly run some sort of Java applet that DoS's my site by simply loading my site every second until they navigate to a different page. (Possible?)
3) Someone released some great new script or program that "automatically tells you your IP," but it does it by grabbing my web site every second or so, bringing my server to its knees.

Any other thoughts on what might be the cause of this? And of course, any ways I can possibly stop this, other than just waiting it out?

jdMorgan

2:31 pm on Nov 5, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sounds like someone may not like you or your site --for personal or business reasons-- and has set a botnet loose to DOS you. Or perhaps it is being done 'for profit' -- Check your webmaster and admin e-mail carefully for "pay us to make it stop" offers, and contact authorities as appropriate.

Tying in with your other thread, it often helps to make your custom 403 page very small in these cases: include only enough header code to make the page valid, then provide a very short error message containing a text link to another (secondary) 403 info page with more details.

Jim

l008comm

10:08 pm on Nov 5, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



I checked through my whole junk mail box. My #*$! is going to be huge, and some guy named Ed isn't going to bother me anymore. But that's all.
Plus, judging from the shape of the traffic -- the way it shoots up at the same time during the day, then tails off at night -- I really think it's a bot. The home page gives you your IP, and I have people set up scripts like this a lot. I really suspect that's what this is. Of course the other issue is that even with all this traffic blocked with a 403, it's still increasing by a significant amount every day. At this rate, even blocking traffic, I've only got a few more days before my server goes down again.

9 Hours:
[img155.imageshack.us...]

5 Days:
[img233.imageshack.us...]

jdMorgan

10:37 pm on Nov 5, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Well, block what you can in the firewall based on the user-agent if the firewall supports that function --better ones do, while inexpensive ones don't-- and then use a tiny custom 403 page or even a blank one to save on response bandwidth.

Also, set the special "nokeepalive" environment variable so the connection is closed after your custom 403 page is served, and the server threads are released immediately. Note that it has to be set with mod_setenvif (SetEnvIf or BrowserMatch); a plain SetEnv is processed too late in the request cycle to affect keep-alive:


# Disconnect client after 403 response
SetEnvIf Request_URI "^/403\.html$" nokeepalive

Jim

l008comm

11:02 pm on Nov 5, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



:-D How do I configure a custom 403? It's been a while since I've gotten knee-deep in Apache config...

plumsauce

11:34 pm on Nov 5, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The home page gives you your IP, and I have people set up scripts like this a lot.

Unless this is central to the existence of your site, you may want to consider taking it off. Alternatively, break it by giving out 127.0.0.1 to discourage this kind of use.

The attraction may be that it lets people using dynamic IPs update their dynamic DNS records by first reading the IP and then updating it at their DNS provider. If someone wrote a script to do this every few minutes and distributed it, then lots of home boxes could be using this as a means of reading their IP.

Returning 127.0.0.1 does not harm their system, but it effectively black holes access from outside until they find another alternative to determining their own ip.

Now, you say that it is a very limited number of user agents. Then those user agents "could" be bots that need to figure out their own external IP behind a NAT as part of their phone-home process.
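
One way to implement the "give out 127.0.0.1" idea without touching the real page is to route the suspect user-agents to a static file. A sketch using mod_rewrite -- the Java/1.6 pattern is taken from the user-agents quoted earlier, and /fake-ip.txt is a hypothetical static file containing just "127.0.0.1":

```apache
# Serve a dummy IP to the misbehaving clients (sketch)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^Java/1\.6
RewriteRule ^/?$ /fake-ip.txt [L]
```

A static text file is far cheaper to serve than the dynamic page, so even if the scripts keep hammering away, the per-request cost drops sharply.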

l008comm

11:53 pm on Nov 5, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



The [site's purpose is to display visitors' IP addresses, so] showing the IP is pretty important.
I just tried to make a custom 403 page that was nice and short. But my custom 403 page was itself 403'd, so I got a full automatically generated 403 page telling me that I got a 403 error, and that I got another 403 error while trying to handle the first one.
Now I'm very confused.

[edited by: jdMorgan at 12:41 am (utc) on Nov. 6, 2007]
[edit reason] Obscured site identity per TOS. [/edit]

jdMorgan

12:43 am on Nov 6, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Check the path in your ErrorDocument directive, 'cause it sounds like it's incorrect. Also be sure to specify only a local filepath, not a URL.

ErrorDocument 403 /local-path-to-403-page.html

Jim

l008comm

3:26 am on Nov 6, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



But if I'm blocking someone from my site, doesn't it make sense that I'll get a 403 while trying to give them a custom 403?

l008comm

4:14 am on Nov 6, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



ErrorDocument 403 /local-path-to-403-page.html

That's exactly what I did, with the path to my file of course, and it still doesn't work. Does the 403 file have to be outside of this virtual host's directory?

jdMorgan

5:15 pm on Nov 6, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



But if I'm blocking someone from my site, doesn't it make sense that I'll get a 403 while trying to give them a custom 403?

You should exclude both your-403-page.html and robots.txt from ever returning a 403 response. Let anybody, anywhere fetch those files. Otherwise, you open the door to an even simpler kind of DOS attack, since each request causes your server to loop, in effect, 403ing/DOSing itself.

Jim

l008comm

11:29 pm on Nov 6, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



How do I enable access to my 403 page if I'm blocking by user agent?

jdMorgan

11:49 pm on Nov 6, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you block using mod_access, then "SetEnvIf" and "Allow from env=" directives come in handy. If using mod_rewrite, then add a RewriteCond based on REQUEST_URI.
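
Putting the two suggestions together, a sketch of the mod_access approach (Apache 2.2-era directives; the Java/1.6 pattern and the /403.html path are assumptions for illustration):

```apache
# Tag the attacking user-agents
SetEnvIf User-Agent "^Java/1\.6" bad_bot
# Never block the 403 page or robots.txt, even for tagged clients
SetEnvIf Request_URI "^/(403\.html|robots\.txt)$" !bad_bot

Order Allow,Deny
Allow from all
Deny from env=bad_bot

ErrorDocument 403 /403.html
```

The second SetEnvIf clears the bad_bot variable for those two paths, so a blocked client can still fetch the custom error page instead of triggering the 403-on-403 loop described earlier.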

Jim

l008comm

12:04 am on Nov 7, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



OK, I actually just used this:

ErrorDocument 403 "Error 403 - forbidden

Now everyone gets a really short 403 message. Works for me.

So back to the keepalive. I'm setting a variable if the user agent matches a 'bad' one. Can I use that same variable to turn off keepalive for those user agents?

Achernar

12:30 am on Nov 7, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



The 403 page doesn't have to be publicly accessible, because it's not served at its own URL. The server just needs to be able to read it and display it in the reply to a URL that is actually 403'd.

Example:

Your 403 error text is located in a folder with other error messages
'/errors/403.html'

but is used by the server on a URL like this one:
http://www.example.com/private/message.php?msgid=145668

jdMorgan

12:34 am on Nov 7, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> Can I use that same variable to turn off keepalive for those user agents?

No, the 'best' available containers are <Files> and <FilesMatch>, neither of which will work with the "text-only" ErrorDocument method.
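
That said, since the variable in question is set by user-agent rather than by file, mod_setenvif can attach the special nokeepalive variable directly to the matching clients, independent of any container and of which ErrorDocument form is used. A sketch -- the Java/1.6 pattern is an assumption based on the user-agents quoted earlier:

```apache
# Tag the bad user-agents and drop keep-alive for them in one step
BrowserMatch "^Java/1\.6" bad_bot nokeepalive
```

BrowserMatch is shorthand for SetEnvIf on the User-Agent header, and it can set several variables in a single directive.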

Jim

l008comm

6:25 am on Nov 7, 2007 (gmt 0)

10+ Year Member Top Contributors Of The Month



OK, I've got my server handing out 403's pretty well. So moving on to the next problem I'm having:
Since this started, I've been getting lots of errors every time I load a page that requires PHP to run an external program -- PHP running uptime, for example. From what I can tell, though, it should be working. I set my keep-alive timeout to 2 seconds, with a max of 450 clients. These 403's I'm sending out, about 200 per second, shouldn't be running PHP or any other extra process. So I'm not sure what is causing this, but it's definitely somehow related to the DoS. Also, when I restart Apache, it will run for a little while, then shortly go back to not being able to fork any processes. I'm not getting any error numbers or anything like that, and nothing at all in the Apache error log. I don't see what could be causing this... there are usually 100-200 requests per second, and I don't see how they could clog up the server like this. Also my OS has a limit of 1000 processes per user, so 'www' shouldn't be getting anywhere close to that.
Since this started, I've been getting lots of errors every time I load a page that requires PHP to run an external program. PHP running uptime, for example. From what I can tell though, it should be working. I set my php keepalive timeout to 2 seconds, with a max of 450 clients. These 403's i'm sending out, about 200 per second, shouldn't be running php or any other extra process. So I'm not sure what is causing this, but it's definitely somehow related to the DoS. Also, when I restart apache, it will run for a little while, then it shortly goes back to not being able to fork any processes. I'm not getting any error numbers or anything like that, and nothing at all in the apache error log. I don't see what could be causing this... there are usually 100-200 requests per second, I don't see how they could clog up the server like this. Also my os has a limit of 1000 processes per user, so 'www' shouldn't be getting anywhere close to that.