First time in this corner of WW...
I'm going to be cloaking for various browsers but not for search engines. I feel that if my site sucks so badly that no one can find it, then it's my job to fix the entire site, not just how a search engine sees it.
I would not go so far as to, for example, send special items only to, say, Gecko, Opera, or KHTML (while we all know we HAVE to for IE right now).
I think any search engine that really cared about whether a page was cheating, like larryhatch said, would not care too much about sending something different to IE versus everyone else. By running two spiders you're only opening up a powder keg... what if I send IE-specific code and the second spider is looking for cloaked pages using an IE UA? I think it would be best for such a spider to use an abnormal Firefox UA instead, say "Firefox/0.7.4". That way most people would just assume it's some Firefox build (I don't recall versions that far back, as I think I started using FF around 0.8 myself). I've seen bots use Safari builds, but the list on the Mac site is incomplete.
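To make concrete the kind of per-browser branching I mean (as opposed to search-engine cloaking), here is a minimal sketch. The stylesheet filenames and the detection rule are hypothetical examples, not any real site's setup — note that a spider probing with an IE UA would simply land in the IE branch like any IE visitor:

```python
# Hypothetical sketch: pick a stylesheet based on a crude User-Agent sniff.
# Every non-IE browser gets the same content; only IE gets workarounds.

def stylesheet_for(user_agent: str) -> str:
    """Return a stylesheet name based on the User-Agent string."""
    if "MSIE" in user_agent:
        return "ie-workarounds.css"  # IE-specific fixes
    return "standard.css"            # Gecko, Opera, KHTML all get the same CSS

print(stylesheet_for("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))
print(stylesheet_for("Mozilla/5.0 (X11; Linux) Gecko/20040614 Firefox/0.8"))
```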
Ethically you have more to gain by not cloaking. If you can get your site to degrade nicely in, say, Opera, IE, and Netscape 4, then really there should be no reason to cloak.
Another PLUS for cloaking, which I will eventually post about here when version 2.7 of my personal site enters server-side development, will be how to serve NOTHING to a blank UA, as 99% of them tend to be listed in spam lists.
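The blank-UA idea above boils down to one check before serving anything. This is just a sketch of that check under my own assumptions (the `should_block` helper name is made up; a real site would do this in the server config or request handler):

```python
# Hypothetical sketch: refuse requests whose User-Agent header is
# missing or empty, since those tend to show up in spam lists.

def should_block(user_agent) -> bool:
    """Block when the User-Agent header is absent or blank."""
    return user_agent is None or user_agent.strip() == ""

print(should_block(None))          # missing header: block
print(should_block(""))            # blank header: block
print(should_block("Opera/7.23"))  # normal browser: serve
```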
It's a fine line, and I think as long as one is perceptive enough to walk fine lines within reason, and doesn't get all emotional about it, then do what you think is most ethical.