While Blekkobot and ScoutJet are honoring the deny in robots.txt and requesting nothing further, something from Blekko *is* still crawling and caching pages.
I put up the denies for those two bots of theirs when I read their public statement announcing that they would *not* support the nocache tag and that they would continue to post cached versions of all web pages in their SERPs.
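For reference, by "denies" I mean the standard robots.txt form. Assuming the user-agent tokens are `Blekkobot` and `ScoutJet` (check the agent strings in your own access logs before relying on these), the entries look like:

```
User-agent: Blekkobot
Disallow: /

User-agent: ScoutJet
Disallow: /
```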
After a month or so I checked, and they are displaying fresh cached copies of my web pages, so something else is fetching them. Does anyone know what it is and what IP range it's coming from?
I just added this block:
220.127.116.11 - 18.104.22.168
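For anyone wanting to do the same at the server level rather than in robots.txt, here's a sketch of an Apache .htaccess block. The range below is a placeholder (192.0.2.0/24 is reserved for documentation), not Blekko's actual range; substitute whatever range shows up in your logs:

```
# Placeholder CIDR -- replace with the actual range from your access logs
Order Allow,Deny
Allow from all
Deny from 192.0.2.0/24
```

Blocking by IP is the only thing that works once a crawler ignores robots.txt, though you may have to keep updating the range as they add hosts.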