brokaddr - 3:34 am on Nov 11, 2012 (gmt 0)
I have a problematic host that is consistently trying to scrape my site.
Which blocking method is the most effective in terms of bandwidth/server consumption if the pest is persistent?
I tried this ModSecurity rule, but it doesn't seem to work:

SecRule REQUEST_HEADERS:REMOTE_HOST "host-name-here" deny,status:403
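If I'm reading the ModSecurity reference right, REMOTE_HOST is one of its own variables rather than a request header, so matching against REQUEST_HEADERS:REMOTE_HOST would never fire because clients don't send a header by that name. A sketch of what I assume is the intended form (it also seems to need HostnameLookups On in Apache, otherwise REMOTE_HOST just holds the IP address):

SecRule REMOTE_HOST "host-name-here" "deny,status:403"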
The alternative I'm considering is SetEnvIf:

SetEnvIfNoCase Remote_Host "host-name-here" bad_bot
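On its own that only sets an environment variable, so I assume it still needs a matching deny to actually block, something like this (Apache 2.2 syntax; Remote_Host likewise depends on HostnameLookups On):

SetEnvIfNoCase Remote_Host "host-name-here" bad_bot
Order Allow,Deny
Allow from all
Deny from env=bad_bot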