All it ever does is ask for robots.txt and then the front page. Honestly, I don't think that's worth the trouble of blocking. Weigh the several seconds it would take to add a Deny from or BrowserMatch line (once) against the added nanoseconds of work for the server (on every request, forever) -- and for what?
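For anyone who does decide it's worth those few seconds: a minimal sketch of the BrowserMatch approach, assuming Apache 2.4+ and assuming the bot's User-Agent string contains "Findxbot" (the bot mentioned below) -- adjust the pattern for whatever you're actually seeing in your logs:

```apache
# Tag any request whose User-Agent contains "Findxbot" (case-insensitive)
BrowserMatchNoCase "Findxbot" bad_bot

# Refuse tagged requests (Apache 2.4 syntax; 2.2 would use
# "Order allow,deny" plus "Deny from env=bad_bot" instead)
<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```

The env-variable indirection is what makes this cheap to extend: additional BrowserMatchNoCase lines can tag other UAs with the same bad_bot variable without touching the Require block.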
YMMV, but if all they ever ask for is the front page, I don't particularly care whether they asked for robots.txt or not. That is, a previously unknown robotic visitor from a not-otherwise-blocked IP only warrants further action if (a) it asked for any interior page, or (b) it showed up with an ostentatiously fake* UA.
* Where "fake" includes giving a URL that leads to anything other than a currently valid informational page in some major European language.
Oh yes, and... At the time I posted the above, I'd forgotten that I'd had another visit from Findxbot just a couple of days earlier. This time, over the course of twenty minutes, they asked for robots.txt, one directory, and the other directory (this is my personal site, which only has two). The funny part is that when I first looked at the record, I only considered the last part of the timestamp and thought they had done all this in the course of ten seconds -- which would still have been perfectly acceptable for a three-request visit.