Forum Moderators: coopster


My site was hacked.

         

andrewsmd

2:47 pm on Nov 24, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I have a voting site that was hacked by someone. I have a captcha script on it, and users have to enter the randomly generated numbers. Luckily I run some checks, so I caught the bad votes and didn't insert them. My question is: how can someone insert automated clicks? What I mean is, I have radio buttons and a submit button. Of course they can see them if they view my source, but how would they automatically go to that page, select a radio button, insert the numbers into my text box, and submit the vote?

cameraman

3:35 pm on Nov 24, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



As I understand it, there are scripts out there that can 'read' a lot of the image-based captchas - I have an image captcha on my forum software and it doesn't seem to slow the jerks down much. In another area of the same site I came up with a dozen silly questions like 'which is hotter, fire or ice' and select one randomly - so far that's been pretty effective.

As for 'selecting' the radio button, the form field just needs a value; it's the browser that turns a mouse click into a value for the post. With your new curl knowledge you can post to your form. Since you probably don't have one of those captcha reading scripts you yourself wouldn't be able to get by that, but that's exactly how you'd go about automating the process: curl or sockets.
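
A minimal sketch of what that automated post looks like, in Python rather than raw curl. The URL and field names (`vote.php`, `candidate`, `captcha`) are made up for illustration; a real attacker would read the actual names out of your form's HTML.

```python
from urllib.parse import urlencode
import urllib.request

# A bot "votes" without rendering the page or clicking anything:
# it just sends the same POST body a browser would have built.
# Field names and URL here are hypothetical examples.
body = urlencode({"candidate": "3", "captcha": "12345"}).encode()
req = urllib.request.Request(
    "http://example.com/vote.php",
    data=body,
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    method="POST",
)
# urllib.request.urlopen(req)  # actually sending it would cast the "vote"
```

The same thing as a curl one-liner would be `curl -d "candidate=3&captcha=12345" http://example.com/vote.php`.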

andrewsmd

3:52 pm on Nov 24, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



First of all, I know that nothing is hack proof. I just put the captcha script in to make it that much more difficult to hack. I did catch the bad votes because I store votes in a temp table and run a check on them after every 500 are inserted. That is why I was curious as to how they submitted the form automatically. I marked them as bad because I had over 200 votes from the same IP in an hour. I am in the process of writing a random captcha script that asks the user to do a simple math equation like 5 TimEs nIne or sIx + 7 to make it even harder. If it was done with curl or sockets, then I will try to put in something to block that. Is there any way in PHP to say "don't give the user that kind of requested information," or would that effectively stop them from being able to vote at all? Thanks,
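
Something like that mixed-case math question could be generated as below (a rough sketch in Python for brevity; the operator words and digit ranges are illustrative, not the real script). Note that a bot that lowercases the string and parses the expression defeats the capitalization trick trivially.

```python
import random

# Sketch of a mixed-case arithmetic captcha, e.g. "5 TimEs 9".
# Returns the scrambled question text and the expected numeric answer.
def math_captcha():
    a, b = random.randint(2, 9), random.randint(2, 9)
    word, answer = random.choice([("times", a * b), ("plus", a + b)])
    text = f"{a} {word} {b}"
    # Randomize the capitalization of each character
    scrambled = "".join(c.upper() if random.random() < 0.5 else c.lower()
                        for c in text)
    return scrambled, answer
```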

cameraman

4:03 pm on Nov 24, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



LOL doing something as complicated as math might weed out some of your legitimate visitors...

Actually, you didn't get hacked, you got attacked. It would have been a hack if you didn't have the safeguards you've designed to keep illegitimate votes out.

I've never used sockets. I believe curl does the hard work and by the end uses sockets to do its thing. There's no way [that I know of] that you can tell whether your visitor is coming from a browser or a curl transaction.

I don't know what you mean by "that kind of requested information" - the automated script received exactly what a browser would receive, and used that info to post back to your server. If you didn't send enough information for an automated script to post, it wouldn't be enough information for a browser to post, either.

mrscruff

4:16 pm on Nov 24, 2008 (gmt 0)

10+ Year Member



All the script has to do is collect the input options from your form; it then sends a request to your server with the POST data included. It does not need to select or click anything.

You can include a text input that you hide (with CSS: display: none). This would be filled out by bot scripts, and if that field is populated when you receive the request, you can ignore the submission.
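
The server-side half of that honeypot is a one-line check. A sketch in Python (the same check is a single `if` in PHP); the field name `website` is an arbitrary choice, anything tempting to a bot works:

```python
# Honeypot check: the "website" input is hidden with CSS (display: none),
# so a human leaves it empty while a naive bot fills it in.
# The field name is a hypothetical example.
def looks_like_bot(form: dict) -> bool:
    return form.get("website", "") != ""
```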

andrewsmd

4:40 pm on Nov 24, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I'm not too worried about losing some legitimate votes. I would rather miss 10% of my real votes and stop all of the illegitimate ones. Plus, if someone isn't smart enough to do simple math, I don't think I really care about their opinion anyway.

enigma1

9:45 am on Nov 30, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



You could stop the spam votes without using captchas and be more efficient. People always prefer the minimum number of clicks to submit a form. You could do as mrscruff said. You could even randomize the CSS so that with every page load the form fields are different, yet the form layout stays the same.
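
One way to get per-visitor field names (an assumption on my part -- the post doesn't specify a scheme) is to derive the name from the session id, so a bot can't hard-code which field to fill in:

```python
import hashlib

# Derive the vote field's name from the session id. The "f_" prefix and the
# 8-character length are arbitrary choices for illustration. The server
# recomputes the same name on submission to find the value.
def field_name(session_id: str) -> str:
    return "f_" + hashlib.sha256(session_id.encode()).hexdigest()[:8]
```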

SteveWh

7:53 pm on Nov 30, 2008 (gmt 0)

10+ Year Member



5 TimEs nIne or sIx + 7

Mixing up the capital letters like that makes it harder for your legitimate readers and is no impediment at all to robots (which can convert to all-upper or all-lower more easily than a person can, and they won't be annoyed by it, either).

As far as the automated form submission goes, the point is that when a human user fills out the form, they do it all in their browser. Nothing goes back to your server until they click Submit. At that point, their browser packages up the info from the filled-out form and sends it back to your website as a single package of data with a GET or POST request.

All a robot has to do is send that package. They don't even have to get the web page where your form is.

One thing you could do is use .htaccess to block access to your form-handler script unless the referer is the page where your form is. However, many of your legitimate visitors might have referer info turned off, so you'd be blocking them too. Also, I'm sure plenty of robots know that trick and will just supply the referer they know you want.
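
A sketch of that referer check with mod_rewrite, assuming (hypothetically) the form lives at /vote.html and posts to /vote.php on example.com. As noted, robots can fake the Referer header, so treat this as a speed bump, not a wall:

```apache
# .htaccess sketch: forbid POSTs to vote.php unless the referer is vote.html.
# Paths and domain are assumptions for illustration.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/vote\.html [NC]
RewriteRule ^vote\.php$ - [F]
```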

[edited by: SteveWh at 7:55 pm (utc) on Nov. 30, 2008]

enigma1

7:45 pm on Dec 1, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



All a robot has to do is send that package. They don't even have to get the web page where your form is.

Steve, bots need to do more than that, and it depends on the form implementation. For instance, if we have a hidden field in the form that binds in some way to a session/cookie identifier, the bot would have to know the correct value each time the page loads, subject to the session identifier.
Example:
<input type="hidden" name="some_name" value="123">
where 123 is the last 3 digits of the session id, rotated 3 bits left.
While with a regular browser all of this is transparent, a bot will naively populate the some_name hidden field with 123 every time.

The server in turn detects the mismatch and could simply pretend it "accepted" the form submission. That's just an example that would not require a captcha.
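
That scheme can be sketched as follows (in Python for brevity; the 10-bit word width is an assumption, since the post doesn't specify one):

```python
# Take the last 3 digits of the session id and rotate that value 3 bits left
# within a fixed-width word. The server renders the result into the hidden
# field and recomputes it on submission; a bot replaying a stale value is
# silently flagged.
def hidden_value(session_id: str, width: int = 10) -> int:
    digits = "".join(c for c in session_id if c.isdigit()) or "0"
    n = int(digits[-3:])
    mask = (1 << width) - 1
    return ((n << 3) | (n >> (width - 3))) & mask
```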

Many bots rely on some common code to automatically process forms. There are many popular web packages, so a bot can be pre-programmed to process given forms, as these are well known. Once you customize a form submission, though, it becomes very hard for someone to figure out the relationship of the various form elements with other things like a session cookie.

andrewsmd

8:25 pm on Dec 1, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Just an FYI. I have implemented something like what enigma1 said using the session ID. Also, I store all of my form submissions in a temporary table that I run checks on at the end of each day. I run all kinds of checks, like duplicate session ids, entries from the same IP too often or too fast, and other things of that nature. I am pretty sure I have locked down my form securely enough, because the bot that was running on it quit after I uploaded this new version. I still kept my captcha script because that is just one more layer of security.

enigma1

11:06 am on Dec 2, 2008 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Andrew, most of these methods will work; they just make it harder for spammers to figure out what is going on. Personally, though, I am against captchas, because people may leave your site as forms become more complex for them. And one way spammers currently bypass captchas doesn't involve OCR at all: they pull the form elements into their own site and leave it to their own visitors to do the bypass.

For instance we have:
1. good.example.com/form1.html -> form to bypass
2. evil.example.com/form2.html -> form presented on the bad guy site

When a client requests evil.example.com/form2.html, the evil server opens a connection to good.example.com/form1.html, pulls in the form elements, and includes them in its form2.html. That includes your captcha (image and input). Now a visitor of the evil.example.com/form2.html page completes the captcha/riddle thinking it belongs to evil.example.com. Upon submission, evil.example.com takes the data and submits it to your site, bypassing your captcha, and can then submit a review, comment, create an account, or place an order and spam your site.

This is just a simple example. They may use intermediate sites, or hijack browsers and take advantage of active content, so you cannot trace where it came from. And of course it is very hard to defend against.

So my opinion is: as long as you keep your form processing custom and pretend that all form submissions are accepted, you make it harder for them to figure out what is going on.