Forum Moderators: phranque

Bots are killing my site, need help

propaganda

3:38 pm on Aug 12, 2009 (gmt 0)

10+ Year Member



I am trying to block some bad bots on my e107-powered site and am not having much success. There are constantly 30-60 "guests" on at all times eating up bandwidth. When I check my logs, the offender is this agent:

Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727; InfoPath.1)

In my robots.txt I put:

Disallow:/
User-agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727; InfoPath.1)

Which doesn't seem to stop it. Is there any other way to block this agent?

wilderness

4:32 pm on Aug 12, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Robots.txt does NOT block/deny anybody.

Rather, it asks every visitor (person or machine) to voluntarily comply with its rules.
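For the record, a syntactically valid robots.txt puts the User-agent line before the Disallow line, and compliant crawlers match on a short agent token rather than a full UA string; a minimal sketch (which a misbehaving bot will simply ignore):

```
# robots.txt - advisory only; well-behaved crawlers match short tokens
User-agent: *
Disallow: /
```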

You'll need to make an adjustment in the ".htaccess" file. (Please note that a syntax error in this file could result in a "500 server error" and prevent your website(s) from functioning at all.)

A Google search on htaccess+deny [google.com] will provide you with some insights.

However, my suggestion is that you explore "multiple conditions" and restrict the visitor by BOTH an IP range and either the complete UA you provided or a portion of it.
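A sketch of such "multiple conditions" using mod_rewrite, assuming mod_rewrite is available on the server; the IP range and UA fragment below are illustrative assumptions, not confirmed values for this bot:

```apache
# .htaccess sketch - deny only when BOTH conditions match
RewriteEngine On
# Condition 1: request comes from the suspect IP range (example range)
RewriteCond %{REMOTE_ADDR} ^142\.205\.
# Condition 2: UA contains a distinctive fragment of the reported string
RewriteCond %{HTTP_USER_AGENT} "MSIE\ 6\.0.*InfoPath\.1"
# Both conditions true: return 403 Forbidden
RewriteRule .* - [F]
```

Requiring both conditions avoids blocking legitimate IE6 users who happen to send the same UA string from other networks.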

wilderness

4:39 pm on Aug 12, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Some OLD Tutorial links [webmasterworld.com] that may assist your understanding.

propaganda

4:53 pm on Aug 12, 2009 (gmt 0)

10+ Year Member



What I added to my .htaccess is the IP of the culprit:

SetEnvIf User-Agent ^Java keep_out
SetEnvIf User-Agent ^Web keep_out
SetEnvIf User-Agent Library$ keep_out
order allow,deny
deny from 142.205.213.254
allow from all
deny from env=keep_out

Doesn't seem to be helping, though; judging by the logs, requests from that IP are still getting through...

wilderness

5:19 pm on Aug 12, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Denying access to a visitor will NOT stop the requests (and the resulting denials) from appearing in your visitor logs.
Is the number immediately following HTTP/1.1" a 200 or a 403?
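One quick way to check is to pull the status-code field out of the access log; in the combined log format it is the field right after the quoted request string. A sketch using a fabricated sample log line (the path and log contents are examples only, not the poster's actual log):

```shell
# Build a small sample log in the combined format for demonstration.
cat > /tmp/sample_access.log <<'EOF'
142.205.213.254 - - [12/Aug/2009:15:38:00 +0000] "GET / HTTP/1.1" 403 199 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727; InfoPath.1)"
142.205.213.254 - - [12/Aug/2009:15:38:05 +0000] "GET /index.php HTTP/1.1" 200 5120 "-" "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1; .NET CLR 2.0.50727; InfoPath.1)"
EOF

# Split each line on double quotes: the 3rd quote-delimited field starts
# with the status code and response size. Print just the status code.
awk -F'"' '{ split($3, f, " "); print f[1] }' /tmp/sample_access.log
```

A run of 403s means the deny is working and you are only seeing refused requests; 200s mean the bot is still being served content.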

I suggest the following changes (not sure why you'd need a Toronto bank with access to your site):

SetEnvIf User-Agent Java keep_out
SetEnvIf User-Agent ^Web keep_out
SetEnvIf User-Agent Library$ keep_out
order allow,deny
deny from 142.205.0.0/16
allow from all
deny from env=keep_out

In addition:

e107 powered site and am not having much success. There are constantly 30-60 guests

I've no clue what e107 is!
Are your guests managed by the same htaccess file that contains the lines you've included?
EX:
If you're adding these lines to your root htaccess while your guests are managed by a sub-directory (possibly WordPress or some other program) that contains its own directory htaccess (overriding your root settings), then you'll need to be positive that you're adding the lines to the correct file and/or location.
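To illustrate that override, a hedged sketch (the paths and directives are assumptions for the example, not taken from the poster's server):

```apache
# /home/site/public_html/.htaccess  (root - your deny rules live here)
SetEnvIf User-Agent Library$ keep_out
Order allow,deny
Allow from all
Deny from env=keep_out

# /home/site/public_html/forum/.htaccess  (installed by the forum/CMS app)
# If this deeper file sets its own Order/Allow directives, they can
# replace the root access rules for everything under /forum/, so the
# bot may be blocked on the front page yet still reach the forum.
Order allow,deny
Allow from all
```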

tangor

6:17 pm on Aug 12, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I would suggest "order deny,allow" instead of "order allow,deny" to ensure those UAs are processed FIRST, before the allow...
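For context, a hedged summary of how Apache 2.2 evaluates these directives (worth verifying against the mod_access documentation before switching): the Order directive decides which type of directive wins when both match, and the physical order of Allow/Deny lines in the file does not matter.

```apache
# Order allow,deny : Allow directives evaluated first, then Deny.
#                    If both match a request, the Deny wins.
#                    Requests matching neither are denied by default.
#
# Order deny,allow : Deny directives evaluated first, then Allow.
#                    If both match a request, the Allow wins.
#                    Requests matching neither are allowed by default.
#
# Caution: under "Order deny,allow", a blanket "Allow from all"
# matches everything, so it can override the Deny lines entirely.
```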