blend27 - 2:52 pm on Nov 8, 2012 (gmt 0)
I use Proactive methods.
If a scraper comes along from an unknown data-center IP range, I usually hunt down and block all the ranges assigned to that company via ARIN/RIPE lookups.
Most of the traffic from other RIRs I block auto-magically anyway on most of the sites, except when HUMAN traffic starts coming in from search engines or sites that I know link to mine.
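The range-blocking idea above can be sketched with Python's stdlib `ipaddress` module. The CIDR ranges here are documentation/test networks standing in for whatever a real ARIN/RIPE lookup would return; the list and function names are my own, not the poster's actual firewall code:

```python
import ipaddress

# Hypothetical blocklist: CIDR ranges you would collect from ARIN/RIPE
# whois data for a data-center operator. These are reserved TEST-NET
# ranges used purely as placeholders.
BLOCKED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # TEST-NET-3
    ipaddress.ip_network("198.51.100.0/24"),  # TEST-NET-2
]

def is_blocked(ip: str) -> bool:
    """Return True if the visitor IP falls inside any blocked CIDR range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in BLOCKED_RANGES)
```

Checking membership per request like this is O(number of ranges); a real firewall with thousands of ranges would want a radix/trie lookup instead.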
RIPE ranges are slightly harder to maintain due to companies constantly going in and out of business.
But in any event, the traffic from all those ranges is constantly monitored for human behavior, and when a range "GOES HUMAN" it is first put on PROBATION (home-cooked captchas/spider traps).
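A spider trap of the kind mentioned is usually a link that is hidden from humans (and disallowed in robots.txt), so only a misbehaving bot follows it. This is a minimal sketch of that idea under my own assumptions; the trap path, return values, and in-memory flag set are all illustrative, not the poster's implementation:

```python
# IPs that have tripped the trap; a real system would persist this.
flagged_ips = set()

# A path disallowed in robots.txt and hidden from human visitors
# (e.g. an invisible link). The path itself is a made-up example.
TRAP_PATH = "/do-not-follow/"

def handle_request(ip: str, path: str) -> str:
    """Decide how to answer a request: trap hit, probation challenge, or OK."""
    if path.startswith(TRAP_PATH):
        flagged_ips.add(ip)       # visitor ignored robots.txt: flag it
        return "403 Forbidden"
    if ip in flagged_ips:
        return "CAPTCHA"          # probation: serve a challenge first
    return "200 OK"
```

Serving a captcha rather than an outright ban on the first offense matches the "probation" idea: a human who stumbled in can still prove themselves.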
I started building my Software Firewall in early '04. It's quite an Automated Beast at this point, with hooks into several known Abuse APIs, and sites sharing banned-IP data via RESTful web services from within.
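The banned-IP sharing could look something like the sketch below: one site exports its bans as JSON for a REST endpoint, another merges the feed into its local blocklist. The field names and schema are assumptions of mine; the post doesn't describe the actual service:

```python
import json

def export_bans(bans: dict) -> str:
    """Serialize local bans (ip -> reason) as a JSON feed for peers."""
    return json.dumps(
        {"bans": [{"ip": ip, "reason": reason} for ip, reason in bans.items()]}
    )

def import_bans(payload: str, local: set) -> set:
    """Merge a peer site's banned IPs into the local blocklist."""
    data = json.loads(payload)
    return local | {entry["ip"] for entry in data["bans"]}
```

In practice you'd want the feed behind authentication and with timestamps, so peers can fetch only bans added since their last sync.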
I refactor the code every 6 months based on the notes I take.