Forum Moderators: open
Anything I want to block, I don't bother with robots.txt

I feel like I've said this before--many times--but the only thing better than a blocked request is a request that isn't made in the first place. Although rare, there are robots.txt-compliant robots that a given site might nevertheless not want. And even legitimate search engines may be disallowed from certain directories.
an example of a search engine or bot that does obey robots.txt that you block

One that comes to mind is PetalBot. I've never bothered to find out exactly what it is, other than that it's Chinese. I also currently Disallow Awario (both variants), simply because I don't see why it has to request the same file dozens of times every day. There are others.
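For anyone following along, a minimal robots.txt sketch of that kind of block looks like this. The user-agent tokens are assumptions on my part--always check the exact token the bot actually sends in its User-Agent header before relying on a rule:

```
# Block PetalBot entirely (token assumed; verify against your logs)
User-agent: PetalBot
Disallow: /

# Block both assumed Awario variants with one shared rule group
User-agent: AwarioSmartBot
User-agent: AwarioRssBot
Disallow: /

# Everyone else: full access, but keep a private directory off-limits
User-agent: *
Disallow: /private/
```

Note that compliant crawlers match on the most specific User-agent group that applies to them, so the `*` group does not also apply to the named bots.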