A site checks the user agent at the server level and shows Internet Explorer users something different from what the rest see.
So, no, it doesn't target robots as such since you'll see the "de-cloaked" site just by visiting in Opera.
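For what it's worth, the sort of server-level check being described only takes a few lines. A rough sketch of the idea, assuming a Python/WSGI setup, with the file names made up purely for illustration:

    def application(environ, start_response):
        # Decide which page to serve purely from the User-Agent header.
        ua = environ.get('HTTP_USER_AGENT', '')
        if 'MSIE' in ua:
            page = 'marketing_version.html'   # what Internet Explorer visitors get (placeholder name)
        else:
            page = 'seo_version.html'         # what every other browser, and every spider, gets (placeholder name)
        with open(page, 'rb') as f:
            body = f.read()
        start_response('200 OK', [('Content-Type', 'text/html')])
        return [body]

Point being: nothing in that logic singles out robots, which is why you see the other version just by switching browsers.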
It's not cloaking then, just enhancing the user experience by altering the code to suit the browser. At least, I think that's the argument.
But when the de-cloaked site is SEO-heavy and keyword-laden, and, more importantly, contains content that isn't reflected on the marketing-heavy Internet Explorer version of the site, then surely it is cloaking?
I agree. If the browser-specific content is night and day from the SE-specific content, it'll get busted eventually. Plus, using user-agent swap methods is too risky. Yahoo is coming in now using a Mozilla user agent as its identity, so that cloak would most likely get busted by Inktomi and Google.
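To spell out why that busts the cloak: if the engine does a second fetch announcing itself as a browser, it gets the visitor page instead of the spider page, and then it holds both versions to compare. A toy sketch of that, with the user-agent strings as placeholders only:

    # Same decision rule as the cloak described above, boiled down.
    def pick_page(ua):
        return 'marketing_version.html' if 'MSIE' in ua else 'seo_version.html'

    # Normal crawl with a bot-style user agent: the keyword-laden page.
    print(pick_page('ExampleBot/1.0 (+http://example.com/bot)'))           # seo_version.html

    # Stealth crawl presenting a browser-style user agent: the marketing page.
    print(pick_page('Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)'))  # marketing_version.html

    # The engine now has two different pages for the same URL to compare.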