

Anybody have a clue

     

wilderness

4:41 pm on Jan 24, 2014 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Why are these IPs getting through on their second (or third) requests?

I have the following in place to deny blank UAs, and in addition the IP ranges are denied. (Note: other blank UAs are being denied as expected; these IPs are some sort of exception.)

This is the first time I've noticed this pattern of the same browser and UA string in connection with this issue.

RewriteCond %{HTTP_USER_AGENT} ^-?$
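
A RewriteCond does nothing on its own; it only takes effect as part of a RewriteRule block. For reference, the full rule pair is presumably something like the sketch below (only the RewriteCond is from the post; the rest is assumed):

RewriteEngine On
# Match a User-Agent that is either empty or a literal "-"
# (the -? hedges between the two), then forbid the request.
RewriteCond %{HTTP_USER_AGENT} ^-?$
RewriteRule .* - [F]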


208.68.38.202 - - [24/Jan/2014:09:04:51 -0700] "GET / HTTP/1.1" 403 831 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:21.0) Gecko/20100101 Firefox/21.0"
208.68.38.202 - - [24/Jan/2014:09:05:14 -0700] "GET / HTTP/1.1" 403 831 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:21.0) Gecko/20100101 Firefox/21.0"
208.68.38.202 - - [24/Jan/2014:09:05:46 -0700] "GET / HTTP/1.1" 200 9420 "-" "-"

192.81.210.204 - - [31/Dec/2013:22:07:21 -0700] "GET / HTTP/1.1" 403 644 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:21.0) Gecko/20100101 Firefox/21.0"
192.81.210.204 - - [31/Dec/2013:22:07:27 -0700] "GET / HTTP/1.1" 403 644 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:21.0) Gecko/20100101 Firefox/21.0"
192.81.210.204 - - [31/Dec/2013:22:07:40 -0700] "GET / HTTP/1.1" 200 9420 "-" "-"

198.211.107.216 - - [02/Jan/2014:06:40:23 -0700] "GET / HTTP/1.1" 403 644 "-" "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:21.0) Gecko/20100101 Firefox/21.0"
198.211.107.216 - - [02/Jan/2014:06:40:29 -0700] "GET / HTTP/1.1" 200 9420 "-" "-"

Angonasec

2:19 pm on Jan 26, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Double check your Allows and any holes you've poked.
Otherwise... worrying.

dstiles

7:33 pm on Jan 26, 2014 (gmt 0)

WebmasterWorld Senior Member dstiles is a WebmasterWorld Top Contributor of All Time 5+ Year Member



If it's an empty UA there will be nothing at all. The "-" is added by the logger to simplify human reading of the logs. I think ^$ might be better?

Doesn't the IP pre-empt anyway? In mine the block would be on the IP and the UA checked only if the IP were not blocked. I wonder if you have things the other way around, so that finding a "good" UA prevents the IP from being checked. Just a thought. :)

wilderness

9:03 pm on Jan 26, 2014 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Many thanks for the replies.

Double check your Allows


I have a mere two allows in place; they relate to robots.txt, the 403 page, and two other files that offer limited access to visitors who are otherwise denied.


I think ^$ might be better


This was my first inclination and I tried it; however, there was no change in the failed deny.

Doesn't the IP pre-empt anyway? In mine the block would be on the IP and the UA checked only if the IP were not blocked.


This presents a dilemma.

I have some IPs blocked with "Deny from" and some IPs blocked with mod_rewrite (I'm aware of the caution against mixing methods, but I'm not aware of the order in which the rules are processed).

I'm inclined to believe, however, that if rule order were the cause, then ALL blank UAs would get a 200. That is not the case; only these particular IPs do (at least thus far).

lucy24

10:22 pm on Jan 26, 2014 (gmt 0)

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month



I think ^$ might be better?

He's hedging his bets by including the -? so it goes either way. As I remember it, "" means the UA header is blank, while "-" means it wasn't sent at all. Same for other fields like referer.

At one time I tried the same RewriteRule but for some reason it didn't work-- that is, it had no visible effect. Rather than rack my brains figuring out why, I changed to

BrowserMatch ^-?$ keep_out

as part of my SetEnvIf + authz package. This works a treat.
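
For anyone following along, the full pattern is presumably something along these lines (Apache 2.2-style authz; the variable name keep_out comes from the post above, the pairing with Deny from env= is my assumption):

# Flag an empty or literal "-" User-Agent with an environment variable...
BrowserMatch ^-?$ keep_out

# ...then have mod_authz_host act on that variable.
Order Allow,Deny
Allow from all
Deny from env=keep_out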

I have some IPs blocked with "Deny from" and some IPs blocked with mod_rewrite (I'm aware of the caution against mixing methods, but I'm not aware of the order in which the rules are processed).

In htaccess, mod_rewrite is processed early while mod_authz-thingummy is processed very late. (So late that for a long time I thought of "Deny from..." as a core directive.)

Each mod issues its own 403s, and they're final. One mod can't override another. If one mod issues a redirect and another one issues a 403, all the visitor ever sees is the 403.
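
Put another way, a mixed .htaccess along the lines of this illustration (a hedged sketch, not wilderness's actual file) ends in a 403 whichever mechanism matches, and neither module can turn the other's 403 back into a 200:

# Blocked by mod_authz_host (hypothetical range for illustration)...
Order Allow,Deny
Allow from all
Deny from 208.68.38.0/24

# ...or blocked by mod_rewrite; either way the response is a 403.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^-?$
RewriteRule .* - [F]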

I have a mere two allows in place

Is your overall site Deny,Allow as opposed to Allow,Deny? Or did you just mean that you have FilesMatch envelopes for certain files?

If you're issuing 403s in mod_rewrite, you need to make a separate exclusion for your error documents. But that wouldn't have any effect on the reported 200.
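
That exclusion typically looks something like the sketch below; the /errordocs/ path is made up for illustration, so substitute whatever your ErrorDocument actually points at:

# Leave the custom error page reachable, otherwise blocked
# visitors only ever see the bare server-generated 403.
RewriteCond %{REQUEST_URI} !^/errordocs/
RewriteCond %{HTTP_USER_AGENT} ^-?$
RewriteRule .* - [F]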

Edit after looking up the IPs: All three of those nasties are Digital Ocean. Why not just block the IP? They must be fairly offensive, because I've got all three blocked already.

wilderness

11:51 pm on Jan 26, 2014 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



Is your overall site Deny,Allow as opposed to Allow,Deny? Or did you just mean that you have FilesMatch envelopes for certain files?


Deny,Allow

Yes, I was referring to file envelopes.
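
For readers unfamiliar with the "file envelope" idiom: it is a per-file section that overrides the directory-wide access rules, roughly like this hypothetical example (the real filenames and directives aren't shown in the thread):

# Let everyone fetch robots.txt and the custom 403 page,
# even visitors whose IPs are denied everywhere else.
<FilesMatch "^(robots\.txt|403\.html)$">
    Order Allow,Deny
    Allow from all
</FilesMatch>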



Why not just block the IP? They must be fairly offensive, because I've got all three blocked already.


and in addition the IP ranges are denied

lucy24

12:05 am on Jan 27, 2014 (gmt 0)

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month



and in addition the IP ranges are denied

Oops, missed that. But try the BrowserMatch approach and see if it works any better.

In htaccess, a double negative does not make a positive. (403 + 403 != 200) I know you know that. I'm just saying.

wilderness

1:32 am on Jan 27, 2014 (gmt 0)

WebmasterWorld Senior Member wilderness is a WebmasterWorld Top Contributor of All Time 10+ Year Member Top Contributors Of The Month



In htaccess, a double negative does not make a positive. (403 + 403 != 200) I know you know that.


see what happens when you a-s-s-u-m-e ;)

Perhaps that's the problem?
The IP is catching a 403 and then the blank UA is catching a 403 which becomes a 200.

If so, I don't recall seeing a similar occurrence in more than a decade.

lucy24

2:19 am on Jan 27, 2014 (gmt 0)

WebmasterWorld Senior Member lucy24 is a WebmasterWorld Top Contributor of All Time Top Contributors Of The Month



Well, I did say it doesn't happen that way. Imagine how dreadful it would be if you had to issue an odd number of lockouts or it wouldn't work :)

Angonasec

3:57 am on Jan 27, 2014 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



(403 + 403 != 200)

In my very limited experience, these kinds of "bugs" eventually turn out to be conflicting directives or processing-order confusion. If it's not the former, experiment with the latter, and you'll crack it.

Otherwise Apache is... broken! :)
 
