Could you guys share your experience with automatic identification/blocking of web spiders/robots?
It's pretty easy to manually block UAs or IPs using .htaccess (see the sketch below for the kind of thing I mean), but what is the most resource-friendly and efficient way to block them on the fly, without human intervention?
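For reference, here is a minimal sketch of the manual approach I'm currently using. It assumes Apache 2.4 with mod_setenvif and mod_authz_core available; the "BadBot" User-Agent string and the IP address are just placeholders, not real offenders from my logs:

```apacheconf
# Flag any request whose User-Agent matches a known bad bot
# ("BadBot" is a placeholder pattern)
SetEnvIfNoCase User-Agent "BadBot" bad_bot

<RequireAll>
    # Allow everyone by default...
    Require all granted
    # ...except requests flagged above, and this one IP
    # (203.0.113.5 is a documentation address, used here as a placeholder)
    Require not env bad_bot
    Require not ip 203.0.113.5
</RequireAll>
```

This works fine, but every new bot means another manual edit, which is what I'd like to avoid.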
Thanks.