Megaclinium - 1:47 am on Feb 17, 2011 (gmt 0)
I have my images set up to block scraping. In the control panel you simply list the file extensions that should only be served from your own web pages; I have .jpg set up this way.
So if the request comes through YOUR website (a user loading one of your pages), the image is served.
If an image harvester tries to fetch your .jpg directly, not via one of your web pages, the image isn't served; the harvester bot gets a 302 redirect instead.
It's simple and effective.
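If your host doesn't expose this in the control panel, the idea is easy to sketch yourself. Here's roughly what the check amounts to, as a minimal Python/WSGI sketch; this is not the control panel's actual mechanism, and MY_HOSTS, the extensions, and the redirect target are made-up placeholders. A real setup would normally do this in server config rather than application code, and since the Referer header is client-supplied, it deters casual scrapers rather than determined ones:

    # Sketch only: serve protected extensions when the Referer is one of
    # my own pages, otherwise send the 302 described above.
    from urllib.parse import urlparse

    MY_HOSTS = {"example.com", "www.example.com"}   # placeholder site names
    PROTECTED = (".jpg", ".jpeg")                   # extensions I listed

    def app(environ, start_response):
        path = environ.get("PATH_INFO", "")
        referer = environ.get("HTTP_REFERER", "")
        if path.lower().endswith(PROTECTED):
            host = urlparse(referer).hostname or ""
            if host not in MY_HOSTS:
                # Direct grab, not from one of my pages: 302 the bot away
                start_response("302 Found", [("Location", "https://example.com/")])
                return [b""]
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [b"(the image bytes would go here)"]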
Of course I also do what others are suggesting above and ban hosting ranges, but only once they actually start hitting my site.
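The ban itself happens at the server, but conceptually it's just a CIDR membership test. A rough Python sketch, with documentation-only example ranges standing in for the real hosting ranges:

    # Return 403 to anything inside a hosting range I've decided to ban.
    # The CIDRs here are placeholder documentation ranges, not real bans.
    import ipaddress

    BANNED_RANGES = [ipaddress.ip_network(c)
                     for c in ("203.0.113.0/24", "198.51.100.0/24")]

    def is_banned(client_ip: str) -> bool:
        addr = ipaddress.ip_address(client_ip)
        return any(addr in net for net in BANNED_RANGES)

    # e.g. inside a request handler:
    #   if is_banned(environ["REMOTE_ADDR"]):
    #       start_response("403 Forbidden", [])
    #       return [b"Forbidden"]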
My log-analysis program sorts the hits and filters anything from individual IPs, or ranges I've already classified as bots, into a separate file.
The result is that any NEW hits land in a file of their own and I see them immediately. Because of that I even catch widely separated IPs that are close together in time and, judging from the UA, apparently the same person trying from both their banned range and a non-banned one (they had scraped me from the banned range and I 403'd them).
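The filtering step is the useful part and it's simple to reproduce. A rough Python sketch of it (not my actual program; it skips the sorting, assumes a common/combined log format with the client IP as the first field, and the CIDRs are placeholders):

    # Split the raw log: hits from already-classified bot IPs/ranges go
    # to one file, everything else (the NEW hits to eyeball) to another.
    import ipaddress

    KNOWN_BOTS = [ipaddress.ip_network(c)
                  for c in ("203.0.113.0/24", "198.51.100.0/24")]

    def already_classified(line: str) -> bool:
        """True if the hit's source IP falls in a known bot IP/range."""
        ip = line.split(" ", 1)[0]
        try:
            addr = ipaddress.ip_address(ip)
        except ValueError:
            return False    # malformed line: treat as new, don't hide it
        return any(addr in net for net in KNOWN_BOTS)

    with open("access.log") as raw, \
         open("classified_bots.log", "w") as bots, \
         open("new_hits.log", "w") as fresh:
        for line in raw:
            (bots if already_classified(line) else fresh).write(line)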