The problem with screwing around with bad bots is that if the bad bots actually publish your foolishness, and the good bots then index it, the whole thing can come back to bite you in the butt.
A quick for-instance: I had the bright idea of tagging every link on a page with a code that IDs the source of the original page request. Fun for tracking humans behind a TOR proxy, rotating IP pools like AOL's, bots that crawl from multiple IPs, or simply seeing where the data lands. Unfortunately, if you put the code in the path, you have to block all those paths in robots.txt, which Google WMT will then bitch about being kept away from; and if you use a parameter instead, the SEs treat every tagged URL as a new page. A rough sketch of both approaches is below.
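Here's a minimal sketch of what I mean, assuming server-side link rewriting in Python. The names (`request_token`, `tag_link_param`, `tag_link_path`, the `/t/` prefix, the `src` parameter) are all hypothetical; any stable per-request token scheme would do.

```python
import hashlib
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def request_token(client_ip: str, user_agent: str) -> str:
    """Derive a short token identifying the original requester.
    (Hypothetical scheme; anything stable per request works.)"""
    return hashlib.sha1(f"{client_ip}|{user_agent}".encode()).hexdigest()[:8]

def tag_link_param(url: str, token: str, param: str = "src") -> str:
    """Append the token as a query parameter, e.g. /page?src=ab12cd34.
    Downside: search engines see each token value as a distinct URL."""
    parts = urlsplit(url)
    query = parse_qsl(parts.query)
    query.append((param, token))
    return urlunsplit(parts._replace(query=urlencode(query)))

def tag_link_path(url: str, token: str) -> str:
    """Embed the token in the path, e.g. /t/ab12cd34/page.
    Downside: the whole /t/ prefix has to be disallowed in robots.txt."""
    parts = urlsplit(url)
    return urlunsplit(parts._replace(path=f"/t/{token}{parts.path}"))
```

With the path flavor you'd pair it with a "Disallow: /t/" line in robots.txt and live with the WMT nagging; with the parameter flavor, a rel=canonical back to the clean URL is the usual way to keep the SEs from indexing every tagged variant as a separate page.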
Just plan carefully, is all I'm saying, or your little bit of fun could end in a trip to the burn unit.