Forum Moderators: phranque
Mozilla/5.0 (Windows; U; Windows NT 6.0; de; rv:1.8.1.8) Gecko/20071008 Firefox/2.0.0.8
Java/1.6.0_01
Java/1.6.0_02
Java/1.6.0_03
The "attack" is page loading: they are requesting the page every second. I had a system to block people who load my page more than 20 times in half an hour, but even that system can't handle this much traffic.
So I'm thinking there are three possibilities as to what this actually is.
1) Some new virus that is targeting my site? (I'm thinking probably not.)
2) Some web site that causes the visitor to unknowingly run some sort of Java applet that DoS's my site simply by loading it every second, until they navigate to a different page. (Possible?)
3) Someone released some great new "script" or program that "automatically tells you your IP," but it does it by grabbing my web site every second or so, bringing my server to its knees.
Any other thoughts on what might be the cause of this? And of course, any ways I can possibly stop it, other than just waiting it out?
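One stopgap, given that the requests come from the handful of Java/1.6 user agents listed above, would be to deny those agents outright. A sketch in Apache 2.2-style access control (the directory path is hypothetical; adjust to your DocumentRoot):

```apache
# Tag requests from the scripted Java clients, then refuse them with a 403
SetEnvIf User-Agent "^Java/1\.6" bad_bot
<Directory "/var/www/html">
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Directory>
```

This only works as long as the scripts keep sending their default user agent, but it moves the blocking out of your application and into the server.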
Tying in with your other thread, it often helps to make your custom 403 page very small in these cases: include only enough header code to make the page valid, and then provide a very short error message containing a text link to another (secondary) 403 info page with more details.
Jim
9 Hours:
[img155.imageshack.us...]
5 Days:
[img233.imageshack.us...]
Also, set the nokeepalive environment variable so the connection is closed as soon as your custom 403 page is served, and the server threads are released immediately:
# Disconnect client after 403 response
<FilesMatch "^403\.html$">
    SetEnv nokeepalive
</FilesMatch>
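For context, the custom page itself is wired up with ErrorDocument; a minimal example, assuming the error pages live under a hypothetical /errors/ folder:

```apache
# Serve the small custom page for every 403 response
ErrorDocument 403 /errors/403.html
```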
The home page gives you your IP, and I have people set up scripts like this a lot.
Unless this is central to the existence of your site, you may want to consider taking it off. Alternatively, break it by giving out 127.0.0.1 to discourage this kind of use.
The attraction may be that it lets people on dynamic IPs update their dynamic DNS records by first reading their IP and then updating it at their DNS provider. If someone wrote a script to do this every few minutes and distributed it, then lots of home boxes could be using your site as a means of reading their IP.
Returning 127.0.0.1 does not harm their system, but it effectively black holes access from outside until they find another alternative to determining their own ip.
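The 127.0.0.1 idea could be sketched at the server level by rewriting the scripted clients to a static file containing that address. A hedged sketch for a per-directory (.htaccess) context; the script and file names here are hypothetical:

```apache
# Hypothetical names: send Java/1.x clients a static file whose
# entire contents are the string "127.0.0.1"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^Java/1\.
RewriteRule ^ip\.php$ /static-ip.txt [L]
```

Serving a static file is also far cheaper than running the script, which matters at 100-200 requests per second.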
Now, you say that it is a very limited number of user agents. Those user agents "could" be bots that need to figure out their own external IP from behind a NAT as part of their phone-home process.
[edited by: jdMorgan at 12:41 am (utc) on Nov. 6, 2007]
[edit reason] Obscured site identity per TOS. [/edit]
But if I'm blocking someone from my site, doesn't it make sense that the request for the custom 403 page itself will also be blocked with a 403?
You should exclude both your-403-page.html and robots.txt from ever returning a 403 response. Let anybody, anywhere fetch those files. Otherwise, you open the door to an even simpler kind of DoS attack, since each request causes your server to loop, in effect 403ing/DoSing itself.
Jim
Example:
Your 403 error text is located in a folder with other error messages
'/errors/403.html'
but is used by the server on a URL like this one:
http://www.example.com/private/message.php?msgid=145668
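The exclusion Jim describes might look like this in Apache 2.2 syntax, using the error-page path from the example above:

```apache
# Always allow the 403 page and robots.txt, even for otherwise-denied clients,
# so the server never 403s its own error response
<FilesMatch "^(403\.html|robots\.txt)$">
    Order Deny,Allow
    Allow from all
</FilesMatch>
```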
Since this started, I've been getting lots of errors every time I load a page that requires PHP to run an external program (PHP running uptime, for example). From what I can tell, though, it should be working: I set my keepalive timeout to 2 seconds, with a max of 450 clients, and these 403s I'm sending out, about 200 per second, shouldn't be running PHP or any other extra process. So I'm not sure what is causing this, but it's definitely related to the DoS somehow.
Also, when I restart Apache, it will run for a little while, then shortly goes back to not being able to fork any processes. I'm not getting any error numbers, and there's nothing at all in the Apache error log. I don't see what could be causing this: there are usually 100-200 requests per second, and I don't see how they could clog up the server like this. Also, my OS has a limit of 1000 processes per user, so 'www' shouldn't be getting anywhere close to that.
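One way to see what those stuck processes are actually doing is mod_status. A diagnostic sketch, assuming mod_status is loaded and you only want it reachable from the server itself:

```apache
# Show per-worker activity (request, client, state) at /server-status,
# restricted to localhost so the attackers can't see it
ExtendedStatus On
<Location /server-status>
    SetHandler server-status
    Order Deny,Allow
    Deny from all
    Allow from 127.0.0.1
</Location>
```

If the scoreboard shows hundreds of workers stuck in "W" (sending reply) or "K" (keepalive), that points at connections not being released rather than a fork limit.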