Forum Moderators: open
I'm adding a script to the Session_OnStart section of my global.asa file that will do two things:
Identifying which robots and which file requests I want to ban is the easy part. For example, I'm banning a number of e-mail harvesters, and I'm also banning requests for URL stems such as the following:
Can anyone suggest what action will use the least server resources, or what advantages particular responses or status codes might have? For banned bots, is there a status I could send that would discourage them from returning?
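To make the question concrete, here's a minimal sketch of the kind of Session_OnStart handler I have in mind, in VBScript for global.asa. The user-agent substrings are hypothetical placeholders, and the choice of 403 Forbidden (versus, say, 410 Gone as a "don't come back" hint) is exactly what I'm asking about:

```vbscript
<SCRIPT LANGUAGE="VBScript" RUNAT="Server">
Sub Session_OnStart
    ' Hypothetical list of banned user-agent substrings (e-mail harvesters, etc.)
    Dim badAgents, agent, i
    badAgents = Array("emailsiphon", "extractorpro", "cherrypicker")

    ' Request and Response are available inside Session_OnStart
    agent = LCase(Request.ServerVariables("HTTP_USER_AGENT"))

    For i = 0 To UBound(badAgents)
        If InStr(agent, badAgents(i)) > 0 Then
            ' 403 refuses the request; 410 Gone might discourage a return visit
            Response.Status = "403 Forbidden"
            Session.Abandon  ' drop the session immediately to free resources
            Response.End
        End If
    Next
End Sub
</SCRIPT>
```

Session.Abandon is there on the theory that tearing down the session right away saves memory, but whether that plus Response.End is actually the cheapest response is part of what I'd like to hear opinions on.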
Any thoughts or insights on these questions are appreciated. It's been a while since I've posted on WebmasterWorld, and it's good to be back!