None of these programs work by magic, and all have to inspect your files.
Some simply fetch the homepage occasionally or subscribe to a pooled database (or maintain one). Others are more aggressive and demand - with varying degrees of success - that all your files be available for regular inspection.
Would it be any surprise if one or more of the Slurp robots turned out to be working for McAfee LinkScanner? Webmasters reported excessive crawling when that product launched.
You might start by banning specific Slurp instances to see what happens. That would be less risky than a blanket ban, and you might learn something from the results.
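Banning by user-agent can be done in robots.txt, which well-behaved crawlers honor voluntarily. A minimal sketch - the user-agent token shown is an assumption, so verify the exact strings against your own access logs before relying on them:

```
# robots.txt - a minimal sketch; the "Slurp" token below is an
# assumption, check your access logs for the exact string.

# Block one specific Slurp variant:
User-agent: Slurp
Disallow: /

# Everyone else may crawl normally:
User-agent: *
Disallow:
```

Note that robots.txt is purely advisory: a crawler that ignores it has to be blocked at the server level instead, for example by denying its IP range in the web server configuration.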
The idea is to allow all humans - even Yahoomans.