|Weird access attempt logged by Apache|
| 6:09 am on Oct 22, 2009 (gmt 0)|
Does anyone know what this means in my access logs?
88.80.7.nnn - - [21/Oct/2009:21:31:35 -0700] "GET http://spammy-chaos.c0m/pp/set-cookie.php HTTP/1.1" 301 247
220.127.116.11 - - [21/Oct/2009:22:04:42 -0700] "GET /buy.php?listingid=76465 HTTP/1.1" 404 4794
The second one I know is normal. The first one: does that mean my system is trying to fetch that URL and get the cookie? Is that a hacker?
Much appreciated. I hope it's not a security breach.
[edited by: jdMorgan at 1:21 pm (utc) on Oct. 22, 2009]
[edit reason] obscured IP and domain [/edit]
| 6:11 am on Oct 22, 2009 (gmt 0)|
I read the HTTP status code definitions from the W3C, but I don't really understand the 301?
| 1:36 pm on Oct 22, 2009 (gmt 0)|
Someone or something from a colocation center in Sweden requested that "set-cookie.php" page from your server using a full URL in the request line. This is a rarely-seen but valid format. However, the domain in the client's request line was not your own. So it may have been some sort of attempt to use your server as a proxy.
However, due to either the URL not being correct for your site (e.g. the page doesn't exist or you have some code that checks URLs to see if they resolve to existing files), or because the hostname sent in the Host header (which does not appear in your logs) was wrong, your server responded with a 301 redirect.
Unless there is a second request from the 88.80.7.nnn IP address, basically, nothing happened.
The second request was from Googlebot, so I'm not sure why you included it here.
It looks like your server does not log user-agents. If you have an interest in watching your logs so that you can take action to improve access control, I suggest that you talk to your host about fixing the log format to log at least the 'standard' entries. Trying to determine if a request was legitimate or not without being able to see the requesting user-agent is very difficult. If it were me and they said they won't support standard logging, I'd be looking for a new host...
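For reference, a minimal sketch of what the 'standard' combined logging looks like in httpd.conf; the log path here is an assumption, so adjust it to wherever your host writes logs:

```apache
# Hypothetical httpd.conf fragment: the "combined" format adds the
# Referer and User-Agent fields that are missing from the entries
# quoted above. Log file path is a placeholder.
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog /var/log/apache2/access.log combined
```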
| 5:12 pm on Oct 22, 2009 (gmt 0)|
The second one I'm used to. I included it as an example of how my logs show up. I'm self-hosted. In my logs it does show up more than just this one time.
That's exactly what I was afraid of. This IP is somehow making my server request a page. I'm not sure, but can't that mean that when my server requests the page, there could be cookies placed on my machine which will get passed to my users?
| 5:41 pm on Oct 22, 2009 (gmt 0)|
"Machines" -- that is, servers can't accept cookies. Only clients (e.g. browsers accept cookies).
Further, servers don't make requests; Their job is to server, not demand things. Client/server, customer/waiter, patron/shopkeeper... Servers just sit there and do what clients ask them to do.
This client asked your server to pass a request to another server. Due to some code on your server and the fact that it's likely not set up as a proxy, all your server did was to issue a redirect response. As I said, unless that client came back again afterward and tried some other trick, you can safely ignore this. In fact, if this is the worst-looking eveidence of abuse in your log file, you've got nothing at all to worry about...
The exploit requests you need to be wary of are the ones where your server returns a 200-OK response. 400s, 403s, 301s, 302s, 303s, 304s, etc. are cause for investigation, but not alarm.
| 5:49 pm on Oct 22, 2009 (gmt 0)|
Thank you again, Jim. I guess sleepless nights will do that to me. I know some of this stuff as far as servers and cookies go; I guess it just didn't click right away. I forgot that even when using a curl function, cookies are set.
Thank you again, and I will definitely look for reattempts from this IP...
I guess I just wonder: how is it requesting that address from my machine?
| 10:27 pm on Oct 22, 2009 (gmt 0)|
All the client has to do is send a request line like this (followed by a blank line):
GET http://spammy-chaos.c0m/pp/set-cookie.php HTTP/1.1
You could do it yourself with a simple terminal emulator program.
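As a sketch, here is what that raw request might look like typed out by hand; the hostname in the Host header is a placeholder, not your actual site:

```shell
# Build the raw HTTP request by hand. Piping this to port 80 of a
# server, e.g. with netcat (`... | nc your-site.example 80`),
# reproduces roughly what this client did. your-site.example is an
# assumed placeholder hostname.
printf 'GET http://spammy-chaos.c0m/pp/set-cookie.php HTTP/1.1\r\nHost: your-site.example\r\nConnection: close\r\n\r\n'
```

Any program that can open a TCP connection and write a few lines of text can send such a request, which is why the absolute-URL form shows up in logs from stray scripts and scrapers.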
This could very well be a simple coding error, although it might be a coding error in a site-scraper script.
| 9:59 am on Oct 23, 2009 (gmt 0)|
That IP is at it again:
88.80.7.#*$! - - [23/Oct/2009:02:08:52 -0700] "GET [spam-#*$!#*$!.com...] HTTP/1.1" 404 18261
How does the server response code keep changing? I thought it would be the same...
Well, I know 404 is Not Found. Is there a way to verify that an IP block is effective?
# Begin IP blocking #
Deny from 88.80.7.#*$!
Allow from all
# End IP blocking #
When I add the above code to my config and try to restart the server, it fails. Does anyone have any suggestions? I know it worked before with my host, iPower, but that was in an .htaccess file. Could it be a placement issue?
| 3:26 pm on Oct 23, 2009 (gmt 0)|
The server response will be the same, but only if the client request is the same. As I noted in my first post, the client's first request triggered a 301 -- possibly because the requested hostname was incorrect. Unfortunately, the requested hostname does not appear in standard access log files.
In the second case, the hostname was likely correct, but since the page does not exist, your server returned a 404-Not Found.
So, it looks to me like your server is working fine. And as previously stated, this 'exploit' is really nothing to worry about. If you feel a need to 'slap' them, you could always add a rule to detect these 'set-cookie.php' requests and return a 403-Forbidden, but that's a waste of time -- yours and the server's...
Besides, if you do that, this client may 'think' that that file *does* exist on your server and that you're trying to protect it, and so come back and attack your server even more vigorously, trying to "break down your defenses."
Unless you're getting hundreds of these set-cookie.php requests per day, I'd ignore them. And once they reach the thousands-per-day level, the right approach is to block them at the firewall, so really, there's not much worth bothering about at the server level.
The only way to know if an IP block is effective is to watch your log file. But once your site gets popular and you start seeing hundreds/thousands/tens-of-thousands of 'bogus' requests every day, that will likely get rather tiresome...
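One way to spot-check from the command line is to grep the access log for the blocked prefix and look at the status codes Apache returned; a working Deny should show 403s rather than 200s. A sketch, using a throwaway sample log since the real log path varies by host:

```shell
# Create a tiny sample log for demonstration; in practice, point
# grep/awk at your real access log (the path is an assumption).
cat > /tmp/sample_access.log <<'EOF'
88.80.7.1 - - [23/Oct/2009:02:08:52 -0700] "GET /pp/set-cookie.php HTTP/1.1" 403 214
66.249.65.1 - - [23/Oct/2009:02:09:00 -0700] "GET /buy.php HTTP/1.1" 200 4794
EOF
# Print the status codes ($9 in the common log format) for
# requests from the blocked 88.80.7.* range.
grep '^88\.80\.7\.' /tmp/sample_access.log | awk '{print $9}'
```

If that prints anything other than 403 for the blocked range after you restart, the block isn't taking effect.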
If your code causes the server to fail to restart, check your error log file. You may not have mod_access loaded, or you may not have the proper AllowOverride and/or Options settings to allow you to use it in the context you're trying to use it in.
Also, I suggest that if you're just starting to build an IP-blocking list, you should use "Order Deny,Allow", omit the "Allow from All", and add provisions (likely using mod_setenvif) so that even denied IPs can fetch robots.txt and your custom 403 error page. If you don't make those provisions, you're setting yourself up for a 'self-inflicted DOS attack': many robots take any failure to fetch robots.txt as carte blanche to spider the entire site, and you don't want to return a 403 when the custom 403 error page itself is fetched, because that creates an infinite loop.
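A sketch of what those provisions might look like in httpd.conf; the document root and the error-page path (/errordocs/403.html) are assumptions for illustration, and the IP prefix is the obscured one from this thread:

```apache
# Hypothetical fragment: deny one IP range, but still let everyone
# (including denied IPs) fetch robots.txt and the custom 403 page,
# avoiding the robots.txt problem and the error-page loop
# described above.
SetEnvIf Request_URI "^/robots\.txt$" allow_anyway
SetEnvIf Request_URI "^/errordocs/403\.html$" allow_anyway
ErrorDocument 403 /errordocs/403.html
<Directory /var/www/html>
    Order Deny,Allow
    Deny from 88.80.7.
    Allow from env=allow_anyway
</Directory>
```

With "Order Deny,Allow", the Deny list is evaluated first and the Allow list can then punch the two narrow holes, which is exactly the opposite of a blanket "Allow from all" that would override the Deny entirely.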