So every time a session is started you can run server-side code. I grab the user agent from the server variables, and if it matches one of the bad spiders I do a Response.Redirect away from the site.
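A minimal sketch of what that can look like in global.asa — the user-agent fragments, and the redirect target, are placeholders for illustration, not my actual list:

<SCRIPT LANGUAGE="VBScript" RUNAT="Server">
Sub Session_OnStart
    Dim sUA, aBad, i
    ' Grab the visitor's user agent from the server variables
    sUA = LCase(Request.ServerVariables("HTTP_USER_AGENT"))
    ' Placeholder list of user-agent fragments to trap
    aBad = Array("webstripper", "httrack", "emailsiphon")
    For i = 0 To UBound(aBad)
        If InStr(sUA, aBad(i)) > 0 Then
            ' Send the unwanted bot to a harmless page (placeholder URL)
            Response.Redirect "/goaway.htm"
            Exit Sub
        End If
    Next
End Sub
</SCRIPT>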
Here is a good read on global.asa:
[w3schools.com...]
One thing you might have a problem with is your method of detecting spiders. Most cloakers don't rely on user-agent detection because it is easily spoofed. A more reliable method is IP address detection, or a combination of IP address and user-agent detection.
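As a rough sketch only, combining the two checks might look like this in VBScript — the agent strings are made up, and the IP prefixes are documentation-only ranges, so a real list would have to come from your own logs:

Function IsBadBot()
    Dim sUA, sIP, aBadUA, aBadIP, i
    sUA = LCase(Request.ServerVariables("HTTP_USER_AGENT"))
    sIP = Request.ServerVariables("REMOTE_ADDR")
    ' Placeholder lists -- build real ones from your own log analysis
    aBadUA = Array("webstripper", "offline explorer")
    aBadIP = Array("192.0.2.", "198.51.100.")   ' documentation-only prefixes
    IsBadBot = False
    For i = 0 To UBound(aBadUA)
        If InStr(sUA, aBadUA(i)) > 0 Then IsBadBot = True
    Next
    For i = 0 To UBound(aBadIP)
        ' Crude prefix match; a real filter would handle full address ranges
        If Left(sIP, Len(aBadIP(i))) = aBadIP(i) Then IsBadBot = True
    Next
End Function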
We take a three-pronged approach: first we ask the unwanted bots to go away in robots.txt (a sample appears below).
Next we test the user agent in the Session_OnStart event and redirect malicious user agents to an explanatory page that has no links on it at all.
If all else fails and we can't rely on the user agent at all, we have an ISAPI filter that can block by IP address.
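For the first prong, the robots.txt entries are just polite requests along these lines — the bot names are examples rather than our complete list, and of course only well-behaved bots will honor them:

# Ask specific unwanted bots to stay out entirely
User-agent: WebStripper
Disallow: /

User-agent: HTTrack
Disallow: /

# Everyone else may crawl normally
User-agent: *
Disallow: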
To answer your question directly: we haven't had any real problems doing it this way. In fact we're very pleased with how it's working right now, largely because it's so easy to add new entries to browscap.ini.
BTW, to help with identifying which user agents are malicious, I maintain a browscap.ini file that's available for download from my personal site. It includes a special Parent section for what I call "Website Strippers". Just make sure the malicious user agent you're after is listed in that section, and one little test will trap it. The file is fully compatible with standard browscap.ini versions, and lots of people download it from me every day.
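As a rough illustration of how a Parent section like that can be used (the entry, the "stripper" property name, and the redirect URL here are my own made-up examples, not copied from that file): the entry inherits a custom flag from the parent section,

; hypothetical entry for one ripper, inheriting from the parent section
[WebStripper*]
parent=Website Strippers

; the parent section itself, with an assumed custom flag property
[Website Strippers]
stripper=True

and then the "one little test" in an ASP page goes through the Browser Capabilities component:

<%
Dim oBC
Set oBC = Server.CreateObject("MSWC.BrowserType")   ' Browser Capabilities component
' Custom browscap.ini properties come back as strings ("True", "False", or "default")
If LCase(CStr(oBC.stripper)) = "true" Then
    Response.Redirect "/goaway.htm"   ' placeholder, e.g. the no-links explanatory page
End If
%>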