Scanbot revenge script

Investigating an idea for a script retaliator


SpliFF

11:09 pm on Mar 12, 2006 (gmt 0)

10+ Year Member



I have a custom error-tracking system on my site that emails me when invalid links are accessed (404s) so I can check whether the error is caused by the site itself (the site is dynamic and uses SES URLs, so it has millions of internal links).

As time passes I'm receiving a growing volume of error reports generated by hacking/scraping scripts, such as accesses to /sumthin and various FrontPage DLLs and EXEs. It's really starting to get annoying, since I run a hardened Linux system and these types of accesses are never going to cause security issues.

I found a thread on this board (which is now closed) dealing with using mod_rewrite and .htaccess to trap these bots and it gave me an idea which I wanted to share and discuss.

The process would be:
1.) Identify bots based on their signature (UA or request URL) in httpd.conf (see the rewrite sketch below)
2.) Redirect them to a CGI along these lines:


#!/usr/bin/perl
# tarpit.cgi -- drip random bytes at the client, one per second
$| = 1;                                  # autoflush, so each byte leaves immediately
print "Content-Type: text/html\n\n";     # Apache supplies the 200 (Document Found) status
print 'You should not be here. <a href="/">HOME</a><br><br>';
for my $i (1 .. 1_000_000) {
    print chr(32 + int(rand(64)));       # one random printable character (ASCII 32-95)
    sleep 1;
}
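
For step 1, the trap in httpd.conf might look something like this (untested; the URL patterns, UA strings and the /cgi-bin/tarpit.cgi path are placeholders, not a proven rule set):

RewriteEngine On
# anything only a scanner would ask for gets the tarpit
RewriteCond %{REQUEST_URI} ^/sumthin [OR]
RewriteCond %{REQUEST_URI} \.(dll|exe)$ [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^(libwww-perl|Wget) [NC]
RewriteRule .* /cgi-bin/tarpit.cgi [L]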

In theory this would have the effect of bogging down hacking/ripping scripts for hours or until they explicitly drop the connection. If I'm really lucky they'll consume all their memory/threads and crash. I know a well-written script would get around these issues but my understanding is that many of these things are naive VB scripts designed more for speed and compactness (since they are often worms) than robustness.

This could backfire if it used up too many server threads, but it could be modified to cap the number of running instances at a value below Apache's MaxClients setting (a sketch follows).
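
For what it's worth, the cap could live in the CGI itself. A crude sketch, assuming a /tmp/tarpit-slots directory and an arbitrary limit of 20 (stale slot files left by killed processes would need sweeping up separately):

# at the top of tarpit.cgi: refuse to run if too many tarpits are already live
my $slots = "/tmp/tarpit-slots";
mkdir $slots unless -d $slots;
opendir(my $dh, $slots) or exit;
my $running = grep { !/^\./ } readdir $dh;   # one slot file per live tarpit
closedir $dh;
exit if $running >= 20;                      # keep well below Apache's MaxClients
open(my $slot, '>', "$slots/$$") or exit;    # claim a slot named after our PID
END { unlink "$slots/$$"; }                  # release it on exit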

Has this been tried before? Any advice, comments or objections?

Peb0

7:52 pm on Mar 14, 2006 (gmt 0)

10+ Year Member



I'm gonna jump on board and piggyback on this thread.

I was thinking exactly the same thing this last week. I've been trying to keep my site as clean as possible, then all of a sudden I'm finding mass groupings of 404 messages. Each time it happens, it seems to be for the same pages. It hasn't been enough for me to put anything into action yet, but the small amount of research I've done shows that almost all the files being requested have to do with proxies. They're looking for a vulnerability.

Given how fast the consecutive requests are made, I realize it's being done by an application or script of some sort, so in all likelihood the user doesn't actually see the page being requested (unless, perhaps, it exists). My attitude is that if anyone feels they have the right to hack at me, then whatever I can throw their way is fair game.

If anyone has any other suggestions, I'd like to hear them. Whether it's blocking their IP or freezing their machine with some kind of overrun, what else can we do?

Peb0

ronburk

9:00 pm on Mar 15, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, so you slow them down and they have to start up another copy of the scanbot.

Your time invested: hours
Their time invested: 5 seconds

You would probably spend less of your time and more of theirs if you simply had Apache update your iptables on the fly to start dropping packets sent from that IP address (and hope it's not an IP address being shared among a few hundred AOL users).
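
A rough sketch of the idea, assuming a stock access_log location and a made-up three-strikes rule (it has to run as root, and the list form of system() keeps the IP away from a shell):

#!/usr/bin/perl
# watch the access log; after three scanner-bait requests, drop the IP
my %hits;
open(my $log, '-|', 'tail -F /var/log/apache/access_log') or die $!;
while (<$log>) {
    next unless m{"(?:GET|POST|HEAD) (?:/sumthin|/_vti_bin/|/scripts/)};
    my ($ip) = /^(\S+)/ or next;             # first field is the client IP
    system('iptables', '-A', 'INPUT', '-s', $ip, '-j', 'DROP')
        if ++$hits{$ip} == 3;                # block on the third strike only
}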

If you want to have a chance of doing any good, you could run a copy of snort configured to log the most common attacks you're getting, then use the attacking IP address to locate an admin to email the log to.

Sometimes this is a waste of time, but sometimes you find a real admin who will take action. I've gotten a fair number of zombies fixed or blocked at the other ISP's firewall this way.
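
Digging up the contact is usually just a whois away (192.0.2.1 standing in for the attacker's address):

whois 192.0.2.1 | grep -i abuse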

If you really are determined to implement your own form of tarpitting, you might want to start by reviewing more advanced techniques others have used. Google for "tcp tarpit labrea" for example.

SpliFF

1:32 am on Mar 24, 2006 (gmt 0)

10+ Year Member



Perhaps.

LaBrea works on unused IPs; in this case the IP concerned is in use by the server.

I'd need to do more research, but my understanding is that iptables-based tarpits don't work here because the attacker can (and usually does) put a low timeout on their connect(). I'm guessing many attackers use a per-packet or per-socket timeout, but how many would count the total time spent reading data? Not many, I suspect, though that could always change. Counting total read time would also bloat their code a bit and create more potential for error and detection.
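
For illustration, this is roughly the total-deadline read a smarter bot would need (a sketch, not any particular bot's code; $sock is assumed to be an already-connected socket):

# give up after 30 seconds total, however slowly the bytes trickle in
my $body = '';
eval {
    local $SIG{ALRM} = sub { die "deadline\n" };
    alarm 30;                                # budget for the whole response
    my $buf;
    $body .= $buf while sysread($sock, $buf, 1024);
    alarm 0;                                 # made it: cancel the deadline
};
warn "slow drip detected, aborting\n" if $@ eq "deadline\n";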

I'm basically trying to slow the information stream rather than abort it entirely, and I want to do it on a per-request basis (so I don't knock legitimate users and proxies off the site). Whatever I do has to happen after Apache receives the request, so it can check the request for dodgy patterns.

Whether this is time well spent is a matter of opinion. I have to catch these requests anyway (as I said, to stop the error emails), so this only really concerns what action to take once an attack is spotted. My five hours of extra work may stop millions of scans per year that would otherwise have occurred had the worms/bots not been busy waiting for replies from my server. If my program works I could release it; others would then be saved even those five hours, cutting the number of worldwide scans by millions more.

ronburk

12:52 am on Mar 25, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Google for

afraid tcp tarpit iptables

The goal is to hose the connection beneath the level of the sockets API's ability to do anything about it.
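
With the netfilter TARPIT target compiled in (it's a patch-o-matic extra, not in stock kernels), it's one rule per offender (192.0.2.1 as a stand-in):

iptables -A INPUT -s 192.0.2.1 -p tcp --dport 80 -j TARPIT

TARPIT accepts the connection, pins the TCP window to zero, and ignores the peer's attempts to close it, so the scanner's kernel holds the socket for the full TCP retry cycle regardless of any application-level timeout.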

Key_Master

1:09 am on Mar 25, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



SpliFF, it's not a bad idea, but it's very easy to program a bot to time out an HTTP request. Based on what I have observed, most bots seem to time out in 5 seconds or less.

You'll only slow your own server down.