Forum Moderators: phranque


Banned IP Address

Still got a 200!

         

erlandc

4:55 am on Nov 30, 2007 (gmt 0)

10+ Year Member



hmm?

puzzled, here's the culprit - http://tags2dir.com/directory

cannot find any info on the above

what gives?

thanks

e

[edited by: jdMorgan at 10:02 pm (utc) on Nov. 30, 2007]
[edit reason] de-linked [/edit]

phranque

5:37 am on Nov 30, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



what exactly is the problem?

erlandc

6:05 am on Nov 30, 2007 (gmt 0)

10+ Year Member



hi phranque,

as i said, i banned the ip address of the
aforementioned culprit in my .htaccess file yesterday.

i checked my log files today and it got a 200 response from my site; normally a banned ip would get a 403 page.

i'm really puzzled on how this happened

thx

e

phranque

6:12 am on Nov 30, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



how did you ban the ip address?
perhaps you should also or instead ban by referrer.

erlandc

7:12 am on Nov 30, 2007 (gmt 0)

10+ Year Member



my log file

99.239.116.97 - - [29/Nov/2007:00:14:09 -0800] "GET / HTTP/1.0" 200 15606 "http://tags2dir.com/directory/" "tags2dir.com/0.8 (+http://tags2dir.com/directory/)"

denied like this...

<Files *>
order allow,deny
000.000.000.0
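For comparison, a complete deny-by-IP block would look something like the sketch below (the address is taken from the log line above; the `allow`/`deny` directives and the closing `</Files>` tag are required for the block to take effect, and the fragment as posted is incomplete):

```apache
# classic mod_access (Apache 1.3) / mod_authz_host (Apache 2.x) pattern:
# evaluate allow rules first, then deny rules
<Files *>
order allow,deny
# everyone is allowed...
allow from all
# ...except this address
deny from 99.239.116.97
</Files>
```

Without a `deny from` line, the block denies nothing specific, which would be consistent with the 200 in the log.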

Referrer?

RewriteCond %{HTTP_USER_AGENT} ^badbot [NC,OR]

How would I do that?

Not an expert.

Thx

e

erlandc

7:15 am on Nov 30, 2007 (gmt 0)

10+ Year Member



...question, how did that happen? how did that site
do that? most ips i ban always get a 403 page.

i'm really, really, puzzled

thx

e

phranque

8:19 am on Nov 30, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^http://tags2dir\.com [NC]
RewriteRule !^custom_403_page\.html$ - [F]

erlandc

9:11 am on Nov 30, 2007 (gmt 0)

10+ Year Member



thanks phranque,

not exactly sure where to place what you sent

Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^http://tags2dir\.com [NC]
RewriteRule !^custom_403_page\.html$ - [F]

my .htaccess file ...

RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ^holmes [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^genieBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Nutch [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Grafula [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^grub\-client [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^grub-client [OR]
RewriteCond %{HTTP_USER_AGENT} ^grub-client-2.6.0 [OR]
RewriteCond %{HTTP_USER_AGENT} ^hget [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^OmniExplorer [OR]
RewriteCond %{HTTP_USER_AGENT} ^panscient.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^StackRambler [OR]
RewriteCond %{HTTP_USER_AGENT} ^asterias [OR]
RewriteCond %{HTTP_USER_AGENT} ^ripe.net/whois [OR]
RewriteCond %{HTTP_USER_AGENT} ^buytaert.net [OR]
RewriteCond %{HTTP_USER_AGENT} ^megabonk.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^pingdom.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^rr.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^comcast.net [OR]
RewriteCond %{HTTP_USER_AGENT} ^test.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^apnic.net [OR]
RewriteCond %{HTTP_USER_AGENT} ^#*$! [OR]
RewriteCond %{HTTP_USER_AGENT} ^rackforce.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^SBIder [OR]
RewriteCond %{HTTP_USER_AGENT} ^datacha0s [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*NEWT [OR]
RewriteCond %{HTTP_USER_AGENT} ^MSFrontPage [OR]
RewriteCond %{HTTP_USER_AGENT} ^[Ww]eb[Bb]andit [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Indy [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus.*Webster [OR]
RewriteCond %{HTTP_USER_AGENT} ^Microsoft.URL [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
RewriteCond %{HTTP_USER_AGENT} ^sitecheck.internetseer.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^InternetSeer.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^Ping [OR]
RewriteCond %{HTTP_USER_AGENT} ^Link [OR]
RewriteCond %{HTTP_USER_AGENT} ^ia_archiver [OR]
RewriteCond %{HTTP_USER_AGENT} ^DIIbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^psbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailCollector
RewriteRule ^.* - [F]
RewriteCond %{HTTP_REFERER} ^http://www.iaea.org$
RewriteRule !^http://[^/.]\.your-site.com.* - [F]
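For reference, phranque's referrer-based block would simply be appended after the existing rules, something like this (a sketch; `custom_403_page.html` is phranque's placeholder name from the earlier post):

```apache
# block any request whose Referer header comes from tags2dir.com,
# while still allowing the custom 403 page itself to be served
RewriteCond %{HTTP_REFERER} ^http://tags2dir\.com [NC]
RewriteRule !^custom_403_page\.html$ - [F]
```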

thanks again,

e

wilderness

10:38 am on Nov 30, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You have plenty of errors in your lines.
Many will not function at all, while others are simply syntax errors.
EX:
The use of ^ is defined as "begins with", yet in many of your lines it is applied to a UA string that does NOT commonly appear at the beginning.

When denying IP ranges, it's a learned practice not to deny too narrowly, i.e. a single Class D address. Rather, take the IP range at the very minimum up to a single Class C range.

The periods in Dot.com require escaping.
dot\.com
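To illustrate the escaping point (an unescaped dot matches any character, which makes the pattern looser than intended):

```apache
# unescaped: "." matches ANY character, so this would also match "pingdomXcom"
RewriteCond %{HTTP_USER_AGENT} pingdom.com [NC]
# escaped: "\." matches only a literal period
RewriteCond %{HTTP_USER_AGENT} pingdom\.com [NC]
```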

The example lines supplied to you should go AFTER your line:
RewriteRule !^http://[^/.]\.your-site.com.* - [F]

[edited by wilderness]

BTW, it would also prove beneficial to your own sanity to keep extensive lists (such as you provided) organized alphabetically.

erlandc

6:06 pm on Nov 30, 2007 (gmt 0)

10+ Year Member



wilderness,

thanks, i got some of what you said, but not
all; as i said to phranque, i'm no expert

i'll try to figure it out and put them in order

however, i'm still looking for an answer to
my original question. i banned an ip address
with the order, deny etc. lines in the file, and usually
that gets a 403 page, but it came back and got a 200 page.

how did they do that?

thx again

e

wilderness

6:11 pm on Nov 30, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



either your denial failed due to a syntax error (one of your many),

or they simply returned on a different Class D, which you failed to recognize.

When denying IP ranges, it's a learned practice not to deny too narrowly, i.e. a single Class D address. Rather, take the IP range at the very minimum up to a single Class C range.
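In concrete terms, denying up to the Class C means covering the whole /24 rather than the single address. A sketch, using the address from the log posted earlier:

```apache
order allow,deny
allow from all
# single Class D address -- trivial for the visitor to dodge:
# deny from 99.239.116.97
# the whole Class C (/24) range instead:
deny from 99.239.116.0/24
# equivalent partial-address form also accepted by Apache:
# deny from 99.239.116.
```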

the file example you provided does not include any denials for the tags directory referrer?
(Please note: I'm NOT requesting you submit your entire htaccess to the forum, rather that you review your lines.)

The example that phranque supplied will function, however even that is overkill IMO (i.e., KISS).

The following would prove sufficient:

RewriteEngine On
RewriteCond %{HTTP_REFERER} tags2dir [NC]
RewriteRule !^custom_403_page\.html$ - [F]

(Please note: RewriteEngine only needs to be turned on once, unless a line exists which turns it off.)

erlandc

7:38 pm on Nov 30, 2007 (gmt 0)

10+ Year Member



thanks wilderness,

oh boy, looks like i'll be spending the week-end
trying to figure out what you said and doing research

i actually got that .htaccess file from this forum
titled "A Close to perfect .htaccess ban list"

All i did was tweak it.

oh well, back to the basics

thanks again

e

phranque

7:41 pm on Nov 30, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



All i did was tweak it.

the .htaccess syntax is unforgiving.
a one character tweak can have large ramifications.

wilderness

8:13 pm on Nov 30, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



i actually got that .htaccess file from this forum
titled "A Close to perfect .htaccess ban list"

Those threads are a superb read for learning. However, many participants who did not have extensive experience with htaccess corrupted the examples through constant copying and pasting of complete files that contained syntax errors, overkill, and even unnecessary repetition. (One example is the begins-with "web" pattern: many people use ten or more lines of "web" followed by other terms, when all of those lines could be replaced with a single line.)

In summary, the "A Close to perfect .htaccess ban list" threads REQUIRE reading in their entirety, eliminating the crap that beginners submitted either in their questions for assistance or in their zealous copying and pasting, and giving credence to the participants who supplied valid corrections.

A best practice is to use "A Close to perfect .htaccess ban list" as a starting point for your learning, then expand your searches of the WebmasterWorld forums (and Google as well) to ensure the examples you're attempting to use are valid.

erlandc

8:22 pm on Nov 30, 2007 (gmt 0)

10+ Year Member



Well thanks,

I'll do some reading and research.

I do check my logfiles daily and what
I have seems to be working, and perhaps
it isn't.

Back to the grind.

Thanks a lot.

I'll be back here soon to see if I've
learned more than what I know now.

e

wilderness

8:37 pm on Nov 30, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I do check my logfiles daily and what
I have seems to be working, and perhaps
it isn't.

I'm not going to check each and every one of your lines!

Two examples that are invalid:

RewriteCond %{HTTP_USER_AGENT} ^pingdom.com [OR]

UA from my logs on Nov 23:
"mywebsite.com" "Pingdom GIGRIB (http://www.pingdom.com)"

As a result the proper use would be either "contains pingdom" or "ends with pingdom\.com".

The UA does NOT begin with pingdom, so the begins-with anchor (^) never matches.

RewriteCond %{HTTP_USER_AGENT} ^Nutch [NC,OR]

There are far too many crawls in which the Nutch UAs do NOT begin with Nutch, so the (^) anchor fails; the correct use would be contains:

RewriteCond %{HTTP_USER_AGENT} Nutch [NC,OR]


RewriteCond %{HTTP_USER_AGENT} ^grub\-client [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^grub-client [OR]
RewriteCond %{HTTP_USER_AGENT} ^grub-client-2.6.0 [OR]

These three lines may simply be replaced with
RewriteCond %{HTTP_USER_AGENT} grub [OR]

erlandc

10:35 pm on Nov 30, 2007 (gmt 0)

10+ Year Member



thanks wilderness & phranque,

much appreciated - looks like it's
going to be a busy week-end (it's cold
outside anyway!)

will apply what you both have helped me
with and will expand my knowledge with
your guidance

will work hard to apply your techniques

thank goodness for take-out

e

phranque

11:15 pm on Nov 30, 2007 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



These three lines may simply be replaced with
RewriteCond %{HTTP_USER_AGENT} grub [OR]

that might do more than you want.
for example, it would also exclude the iusebigrubbers browser.
in other words, your regexp should be as precise as it can be.
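On Apache 2.x, where mod_rewrite patterns are PCRE, one way to keep the short pattern while avoiding matches inside longer words would be a word boundary (a sketch only; \b is not available in Apache 1.3's regex engine):

```apache
# \b anchors at a word boundary: "grub" and "grub-client" match,
# but "iusebigrubbers" does not, since its "grub" sits inside a longer word
RewriteCond %{HTTP_USER_AGENT} \bgrub\b [NC]
```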

wilderness

12:49 am on Dec 1, 2007 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



that might do more than you want.
for example, it would also exclude the iusebigrubbers browser.
in other words, your regexp should be as precise as it can be.

For your amusement ;)

RewriteCond %{HTTP_USER_AGENT} client [OR]

erlandc

1:40 am on Dec 1, 2007 (gmt 0)

10+ Year Member



thanks guys,

gotta busy week-end ahead of me!

will test and follow your advice
and tips

will let you know how it goes

e