I think many people here use grep or similar tools to search the raw logfiles, which can be very effective, e.g. grep all visits from Googlebot into a separate file and look through it.
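For example, something along these lines (a minimal sketch; the log path /var/log/apache/access.log and the combined log format are assumptions, adjust for your server):

    # pull all Googlebot hits into a separate file for review
    grep -i 'googlebot' /var/log/apache/access.log > googlebot-visits.log

    # quick overview: which URLs it requested most often
    awk '{print $7}' googlebot-visits.log | sort | uniq -c | sort -rn | head

Keep in mind this matches anything claiming to be Googlebot; a reverse DNS lookup on the IPs separates the real crawler from imposters.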
I try to look at the error reports at least daily. You can often spot problems, bad bots etc. there already.
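One quick way to pull such a report from the access log (again assuming the combined format and the same hypothetical path):

    # summarise 401/403/404 responses: count, status, URL, client IP
    awk '$9 ~ /^40[134]$/ {print $9, $7, $1}' /var/log/apache/access.log | sort | uniq -c | sort -rn | head -20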
To support this, you can build your site to contain some bot-trap areas that generate errors if robots.txt is ignored, like a 401 (Unauthorized) or 403 (Forbidden).
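A minimal sketch of such a trap on Apache (the directory name /botcheck/ is made up, and this uses Apache 2.2-style access control):

In robots.txt:

    User-agent: *
    Disallow: /botcheck/

In /botcheck/.htaccess:

    # everything in this directory answers with 403 Forbidden
    Order deny,allow
    Deny from all

Then link to /botcheck/ somewhere humans won't click. Well-behaved crawlers never request it, so every 403 from that directory in your error report points at a bot that ignored robots.txt.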
Sorry, I can't offer a turnkey solution for your problem.
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]
[webmasterworld.com...]