enigma1 - 3:03 pm on Jan 5, 2012 (gmt 0)
I had to disable the slow scraper detection because it blocked visitors
Yep, in general doing permanent blocks may backfire, and what you experienced is not the worst part. You cannot reliably tell whether a request comes from a bot or a human.
The worst part is this: if your site generates revenue (selling online, etc.) and a competitor figures out you are using traps to ban IPs (that's not a remote possibility), he can place HTML elements on his own pages that point to your trap URLs, so that every one of his visitors triggers the trap and gets banned from your site; see the sketch below. There are also ways to redirect spiders into your trap once they know where it is located.
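For illustration, here is a minimal sketch of what such hostile markup could look like, assuming a trap that auto-bans any IP requesting a guessable URL like /bot-trap/ (the domain and path here are hypothetical):

    <!-- Hypothetical markup on the competitor's page. Every human
         visitor's browser silently fetches the victim's trap URL,
         so an auto-banning trap ends up blocking real customers. -->
    <img src="http://victim-site.example/bot-trap/" width="1" height="1" alt="">
    <iframe src="http://victim-site.example/bot-trap/" style="display:none"></iframe>

Because those requests come from the visitors' own IP addresses, the trap has no way to distinguish them from scrapers.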
And if you call that black hat SEO, take into account that what you're doing, serving different responses to suspected bots than to humans, is itself called cloaking.