
Anyone found problems with cloaking in your global.asa?


korkus2000

5:58 pm on Jun 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I want to cloak against some of these malicious spiders, and the best way I have found to do it on NT is to use my global.asa. Does anyone know of any problems I might have with this?

volatilegx

6:15 pm on Jun 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



korkus2000, I am not familiar with global.asa. I cloak on Unix/Linux based servers or Windows based servers running Apache. Could you describe your method?

korkus2000

6:26 pm on Jun 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sure. The global.asa has your application-level events: Application_OnStart and Application_OnEnd, plus Session_OnStart and Session_OnEnd. Every time an IIS document (asp, inc, etc.) gets hit by a new user, IIS fires your global.asa. The global.asa is just an optional file you create in your root to handle application and session events.

So every time a session is started, you have the ability to run server-side code. I grab the user agent server variable, and if it matches one of the bad spiders I do a Response.Redirect away.
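A rough sketch of the idea, if it helps anyone. The agent substrings and the redirect page below are just placeholders, not my real list:

```vbscript
<SCRIPT LANGUAGE="VBScript" RUNAT="Server">
Sub Session_OnStart
    Dim sUA
    sUA = LCase(Request.ServerVariables("HTTP_USER_AGENT"))
    ' Placeholder substrings -- substitute your own bad-spider list
    If InStr(sUA, "badbot") > 0 Or InStr(sUA, "emailsiphon") > 0 Then
        ' Send them somewhere harmless
        Response.Redirect "/gone.asp"
    End If
End Sub
</SCRIPT>
```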

Here is a good read on the global.asa:
[w3schools.com...]

volatilegx

10:29 pm on Jun 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Sounds almost like the Windows version of the Apache .htaccess file.

One thing you might have a problem with is your method of detecting spiders. Most cloakers don't rely on User Agent detection, because it is easily spoofed. A more reliable method is IP Address detection, or a combination of IP Address detection and User Agent detection.

volatilegx

10:29 pm on Jun 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks for the link, by the way. Very valuable.

korkus2000

11:05 pm on Jun 10, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Yeah, maybe I should use both IP and user agent.
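Something like this, maybe? REMOTE_ADDR for the IP side, with the user agent as a fallback. The IP prefix and agent substring below are made-up examples, not real spider data:

```vbscript
<SCRIPT LANGUAGE="VBScript" RUNAT="Server">
Sub Session_OnStart
    Dim sUA, sIP
    sUA = LCase(Request.ServerVariables("HTTP_USER_AGENT"))
    sIP = Request.ServerVariables("REMOTE_ADDR")
    ' Made-up example values -- substitute known spider
    ' IP ranges and user-agent substrings of your own
    If Left(sIP, 8) = "10.20.3." Or InStr(sUA, "badbot") > 0 Then
        Response.Redirect "/gone.asp"
    End If
End Sub
</SCRIPT>
```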

korkus2000

11:08 am on Jun 12, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



From a performance standpoint, does anyone know if the global.asa will cause problems if I run a script on every session? I know that's what it's for, but has anyone had problems using the global.asa for session-level events?

johnhamman

11:33 pm on Jun 15, 2002 (gmt 0)

10+ Year Member



Are you using cloaking in ASP.NET or just ASP?

Pushycat

12:02 am on Jun 16, 2002 (gmt 0)

10+ Year Member



korkus, we do that at work on one of the largest websites in its category.

We take a three pronged approach by first asking that unwanted bots go away in robots.txt.
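For that first step, it's just the standard exclusion file (the bot name here is only an example):

```
User-agent: BadBot
Disallow: /
```

Of course that only stops bots polite enough to read it, which is why we have the other two layers.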

Next we test the user agent in the Session_OnStart event and redirect malicious user agents to an explanatory page but with no links on it at all.

If all else fails and we can't rely on the user agent at all then we have an ISAPI filter that can block by IP Address.

To specifically answer your question, we haven't had any real problems doing it this way. In fact, we're very pleased with how it's working right now, because of how easy it is to add new entries to browscap.ini.

BTW, to help with identifying which user agents are malicious, I maintain a browscap.ini file that's available for download from my personal site. It includes a special Parent section for what I call "Website Strippers". Just make sure your malicious user agent is in that section and one little test will trap it. The browscap.ini file is completely compatible with standard versions, and lots of people download it from me every day.
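To give an idea of the shape (the section names, entry, and property below are illustrative, not copied from my actual file), the Parent trick looks like this in browscap.ini:

```ini
[Website Strippers]
; parent section carrying a custom flag property
stripper=True

[SiteSucker*]
; example child entry inheriting the flag
Parent=Website Strippers
browser=SiteSucker
```

Then one test in ASP via the Browser Capabilities component traps anything in that section:

```vbscript
Dim oBC
Set oBC = Server.CreateObject("MSWC.BrowserType")
If oBC.stripper = "True" Then Response.Redirect "/gone.asp"
```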

korkus2000

11:40 am on Jun 17, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I am doing it in both ASP and ASP.NET.

Thanks, Pushycat, that's what I was wondering. I have been doing it for a little while and wanted to know if someone had had a bad experience.