Forum Moderators: open
What can be done on IIS-based web servers to stop users from running crawling software, which drastically degrades web server performance?
Thanks in advance
Dhaliwal
This system proved very effective, though we decided it was more worthwhile to let the scrapers in, because we benefit from the links they provide. (Sounds like a dumb reason, right?)
I believe WebmasterWorld uses a similar system. Some ISPs offer spider traps at the firewall level.
I have done this with ASP.NET 1.0 and ASP.NET 2.0. The sites I'm referring to are completely dynamic; static files are served through special handlers, so they are protected as well.
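The poster's actual handler code isn't shown, but the core idea behind this kind of crawler throttle is a per-IP sliding-window rate limit: count recent requests from each client, and refuse (or trap) clients that exceed a threshold. Here is a minimal sketch of that logic in Java; the class name `CrawlerThrottle` and the limits are hypothetical, and in an ASP.NET site the same check would sit inside an HttpModule or handler and return a 403/429 instead of `false`.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical sliding-window limiter: allow at most maxRequests
// per windowMillis per client IP.
public class CrawlerThrottle {
    private final int maxRequests;
    private final long windowMillis;
    private final Map<String, Deque<Long>> hits = new ConcurrentHashMap<>();

    public CrawlerThrottle(int maxRequests, long windowMillis) {
        this.maxRequests = maxRequests;
        this.windowMillis = windowMillis;
    }

    // Returns true if the request should be served,
    // false if the client is over the limit.
    public synchronized boolean allow(String clientIp, long nowMillis) {
        Deque<Long> times = hits.computeIfAbsent(clientIp, k -> new ArrayDeque<>());
        // Drop timestamps that have fallen outside the window.
        while (!times.isEmpty() && nowMillis - times.peekFirst() > windowMillis) {
            times.pollFirst();
        }
        if (times.size() >= maxRequests) {
            return false;   // over limit: caller would respond 403/429
        }
        times.addLast(nowMillis);
        return true;
    }

    public static void main(String[] args) {
        CrawlerThrottle t = new CrawlerThrottle(3, 1000);
        // Three rapid requests pass, the fourth is blocked.
        System.out.println(t.allow("10.0.0.1", 0));    // true
        System.out.println(t.allow("10.0.0.1", 10));   // true
        System.out.println(t.allow("10.0.0.1", 20));   // true
        System.out.println(t.allow("10.0.0.1", 30));   // false
        // After the window expires, the same IP is allowed again.
        System.out.println(t.allow("10.0.0.1", 2000)); // true
    }
}
```

Human visitors rarely exceed a few requests per second, while scrapers often do, so even a crude limit like this cuts server load noticeably; real deployments would also whitelist well-behaved search engine bots by user-agent and reverse DNS.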