[webmasterworld.com...]
Just the other day in another thread I more-or-less complimented their bot for being compliant in following robots.txt.
And NOW I'm forced to eat my hat!
However, Szukacz could technically claim to be compliant.
I'll let you be the judge.
I've added their range to my denies.
Szukacz has taken to sending blank referer and UA fields, and as a result was denied access to my robots.txt. It then traveled into a disallowed folder.
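For anyone wondering how a blank UA ends up with a 403 on robots.txt, here's a minimal sketch of the kind of .htaccess rule that would do it (the directives are standard Apache; my actual rules may differ):

```apache
# Flag requests that send an empty User-Agent header,
# then deny anything carrying that flag.
SetEnvIf User-Agent "^$" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot
```

With a rule like that in place, the bot never gets to read robots.txt at all, which is exactly the hole Szukacz then walked through.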
193.218.115.6 - - [09/Jul/2003:01:35:21 -0700] "GET /robots.txt HTTP/1.1" 403 - "-" "-"
193.218.115.6 - - [09/Jul/2003:01:35:25 -0700] "GET /myfile.pdf HTTP/1.1" 200 33484 "-" "Szukacz/1.5 (robot; www.szukacz.pl/jakdzialarobot.html; info@szukacz.pl)"
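For the record, here's a sketch of the range deny I mean. Note the partial-octet match is an assumption on my part; only 193.218.115.6 actually shows in my logs, so check the allocation before copying this:

```apache
# Deny the whole 193.218.115.x range (assumed range;
# only .6 has appeared in my logs so far).
Order Allow,Deny
Allow from all
Deny from 193.218.115.
```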