Mr.Carlito

idiotgirl

7:51 am on Jul 6, 2008 (gmt 0)




Here's one I haven't seen before:
64.237.57.*** - - [05/Jul/2008:20:28:36 -0400] "GET / HTTP/1.1" 200 7643 "-" "Mozilla/5.0 (MrCarlito-0.1 http://www.mrcarlito.com/spider.html)"

Didn't check robots.txt. The reference page says:

MrCarlito-0.1 is an experimental spider that collects header & link information from web pages. The spider is written in PERL (Practical Extraction and Report Language), and uses the LWP::UserAgent Class. Currently this spider does not delve into websites, it simply obtains the headers & hostnames contained in your web page index.

IMHO, it would be more polite if Mr.Carlito bothered to check robots.txt to see if he's welcome. I guess that's not Carlito's Way.
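For contrast, here's what a polite spider does before fetching anything. A minimal sketch in Python (the spider in question is Perl/LWP, but the idea is identical); the robots.txt rules and URL are made up for illustration:

```python
from urllib import robotparser

# Hypothetical robots.txt content; a real crawler would fetch
# http://example.com/robots.txt instead of hard-coding lines.
rules = [
    "User-agent: MrCarlito-0.1",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A well-behaved spider asks before every request:
if rp.can_fetch("MrCarlito-0.1", "http://example.com/"):
    print("allowed to crawl")
else:
    print("disallowed - skip this site")  # this branch runs here
```

In Perl the equivalent courtesy is built in: using LWP::RobotUA instead of LWP::UserAgent makes the library fetch and obey robots.txt automatically.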

[edited by: incrediBILL at 8:12 pm (utc) on July 6, 2008]
[edit reason] fixed formatting and link [/edit]

wilderness

8:43 pm on Jul 6, 2008 (gmt 0)




64.237.57.zzz - - [28/Oct/2007:17:36:18 -0500] "GET / HTTP/1.1" 301 313 "-" "Mozilla/5.0 (MrCarlito-0.1 [mrcarlito.com...]

Added the backbone's 64.237.32-63 Class C range as a result.
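A range block like that is usually a one-liner in Apache. A sketch, assuming Apache 2.2-era mod_authz_host syntax in .htaccess; the /19 covers 64.237.32.0 through 64.237.63.255:

```apache
# Deny the whole 64.237.32.0 - 64.237.63.255 backbone range
Order Allow,Deny
Allow from all
Deny from 64.237.32.0/19
```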

Megaclinium

4:13 am on Jul 7, 2008 (gmt 0)




Is there a maximum number of addresses or ranges beyond which adding more to your IP deny list slows down the server?

wilderness

1:36 pm on Jul 7, 2008 (gmt 0)




I've not heard of anybody hitting limit walls when utilizing "simple" IP or UA denies.

My own file runs some 1,700 lines, having been condensed multiple times.

What will slow requests down are processor-intensive rules.
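To illustrate the distinction: a flat address deny is a cheap lookup, while regex-heavy mod_rewrite conditions run pattern matches against headers on every single request. A sketch of the two styles in .htaccess; the patterns are illustrative, not a recommended blocklist:

```apache
# Cheap: simple address denies scale to long lists without trouble.
Order Allow,Deny
Allow from all
Deny from 64.237.32.0/19

# Expensive: RewriteCond regexes are evaluated per request,
# and long chains of them are what slows a server down.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (spider|crawl|harvest|extract) [NC]
RewriteRule .* - [F]
```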

Don