Forum Moderators: open
Mozilla/4.0 (compatible; MSIE 5.0; Windows 95) TrueRobot; 1.5
Anybody have any idea whether, when writing the User-Agent RewriteCond line, I'm required to use the ENTIRE line,
or will JUST a portion work?
Would the Rewrite be
Mozilla/4.0 (compatible; MSIE 5.0; Windows 95) TrueRobot; 1.5
or
TrueRobot; 1.5
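For what it's worth, a sketch of how I understand the matching to work (assuming Apache mod_rewrite; illustrative, not tested against this particular bot): the pattern is a regular expression, so a fragment is enough.

```apache
# RewriteCond patterns are regular expressions, not literal strings.
# Unanchored, a fragment matches anywhere in the User-Agent:
RewriteCond %{HTTP_USER_AGENT} TrueRobot [NC]
# "-" means no substitution; [F] returns 403 Forbidden
RewriteRule .* - [F]

# The ENTIRE line would work too, but the dots and parentheses would
# have to be escaped, and the spaces quoted or escaped, e.g.:
#   RewriteCond %{HTTP_USER_AGENT} "^Mozilla/4\.0 \(compatible; MSIE 5\.0; Windows 95\) TrueRobot; 1\.5$"
```

The short form is also more robust, since bots often tweak version numbers between visits.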
RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ^ASPSeek.* [NC]
RewriteCond %{HTTP_USER_AGENT} ^Bot(.*)mailto:craftbot@yahoo.com [NC]
RewriteCond %{HTTP_USER_AGENT} ^Cartographer.* [NC]
RewriteCond %{HTTP_USER_AGENT} ^Crescent.* [NC]
RewriteCond %{HTTP_USER_AGENT} ^CherryPicker.* [NC]
RewriteCond %{HTTP_USER_AGENT} ^Zeus.*Webster.*
RewriteRule ^.*$ [F]
end of quote
I tested this over the slow bot weekend, with some 75 bots listed, and it didn't stop
squat :-(
Today I removed it and went back to my previous 300+ IP's.
It was my hope to save some of the time required to keep this file current.
I tried every configuration imaginable in defining the UA of this TrueRobot and nothing worked. Today his IP deny
worked. :-)
Perhaps I missed something in the RewriteCond?
Thanks in advance
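A likely reason the quoted block stopped nothing (assuming Apache mod_rewrite): without an [OR] flag, consecutive RewriteCond lines are ANDed together, so a single bot would have to match every pattern at once; and RewriteRule requires a substitution argument (use "-") before the [F] flag. A minimal sketch of the corrected form:

```apache
RewriteEngine on
# [OR] chains the conditions; the last one carries no [OR]
RewriteCond %{HTTP_USER_AGENT} ^ASPSeek [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^CherryPicker [NC]
# "-" means no substitution; [F] returns 403 Forbidden
RewriteRule .* - [F]
```

Note that multiple flags go inside one bracket, comma-separated ([NC,OR]), not as separate brackets.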
RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*NEWT [OR]
RewriteCond %{HTTP_USER_AGENT} ^MSFrontPage [OR]
RewriteCond %{HTTP_USER_AGENT} ^[Ww]eb[Bb]andit [OR]
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Indy [OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus.*Webster [OR]
RewriteCond %{HTTP_USER_AGENT} ^Microsoft.URL [OR]
RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
RewriteCond %{HTTP_USER_AGENT} ^sitecheck.internetseer.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^InternetSeer.com [OR]
RewriteCond %{HTTP_USER_AGENT} ^Ping [OR]
RewriteCond %{HTTP_USER_AGENT} ^Link [OR]
RewriteCond %{HTTP_USER_AGENT} ^ia_archiver [OR]
RewriteCond %{HTTP_USER_AGENT} ^DIIbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^psbot [OR]
RewriteCond %{HTTP_USER_AGENT} ^EmailCollector
RewriteRule ^.* - [F]
[aspseek.org...]
Hey littleman,
Before Friday I had my htaccess files very organized with an established routine.
Since I had seen so many references to the use of the previously defined RewriteConds, I assumed they would work. In the process I had to shorten some files and split them up. It was a bad decision.
Currently things are somewhat disorganized, which two new bots seem to have taken advantage of (last night).
I cannot tell you why I have the ASPSeek added.
I did look at their aforementioned page, and it appears to me that anybody can become an ASPSeek bot,
without defining intent and use?
My deny's in this direction are not limited to ASP.
Any bot which does not have a URL defined which can provide an insight into intent and use is denied.
Even "libwww-perl."
It is something entirely specific to my site.
RewriteEngine on
RewriteBase /
RewriteCond %{HTTP_USER_AGENT} ^ASPSeek.* [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Bot(.*)mailto:craftbot@yahoo.com [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Cartographer.* [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Crescent.* [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^webbandit.* [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WebEMailExtrac.* [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Webster(.*)Pro.* [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^WGET.* [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^Zeus.*Webster.*
RewriteRule ^.*$ - [F]
Many thanks.
This will have to wait for the weekend.
Yesterday I was successful in adding about 30 rewrites for deleted and moved files. I learned with those that a solitary character in the code can prevent site function.
Or even an extra space, or the lack of one.
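For the moved- and deleted-file rewrites mentioned above, the kind of rule involved looks roughly like this (file names are hypothetical); as noted, a single missing backslash or stray space is enough to break it:

```apache
RewriteEngine on
# Moved file: permanent redirect to the new location
RewriteRule ^old-page\.html$ /new-page.html [R=301,L]
# Deleted file: answer 410 Gone instead of 404
RewriteRule ^dead-page\.html$ - [G]
```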