It will be no more or less effective than any other User Agent sniffing method. HttpBrowserCapabilities relies on its filter definitions to detect browser capabilities, and the same applies to detecting search engines. The short answer: it will only be as good as those filter definitions. Add the usual misses caused by User Agent spoofing and we're pretty much back where we started :)
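To illustrate the point, here's a minimal sketch (in Python, since the idea isn't tied to .NET) of the filter-based approach HttpBrowserCapabilities takes: a list of patterns mapped to classifications. The patterns and labels below are illustrative assumptions, not a real or complete filter set; the last two lines show why spoofing puts us back where we started.

```python
import re

# Hypothetical filter definitions -- the same idea HttpBrowserCapabilities
# uses internally: patterns matched against the User-Agent header.
# These patterns are illustrative only, not an authoritative list.
UA_FILTERS = [
    (re.compile(r"Googlebot", re.I), "search_engine"),
    (re.compile(r"Slurp", re.I), "search_engine"),   # Yahoo! crawler
    (re.compile(r"MSIE \d+", re.I), "browser"),
    (re.compile(r"Gecko", re.I), "browser"),
]

def classify_user_agent(ua: str) -> str:
    """Return the first matching classification, or 'unknown'."""
    for pattern, label in UA_FILTERS:
        if pattern.search(ua):
            return label
    return "unknown"

# Detection is only as good as the filters...
print(classify_user_agent("Googlebot/2.1 (+http://www.google.com/bot.html)"))
# ...and a browser spoofing its User Agent is classified identically:
print(classify_user_agent("Mozilla/5.0 (compatible; Googlebot/2.1)"))
```

Both calls return "search_engine", because the classifier can only see what the client claims to be.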
As for effectiveness, that depends on the purpose of the cloak. If you are just sniffing in order to redirect specific browsers, then simple techniques can be effective.
If you're interested in serving highly optimized pages to search engines while delivering different pages to end users, then scripts need to be written, templates created for specific engines, rendering times evaluated, and pages carefully crafted so that a human review can't determine that cloaking is being used.
I'm not sure which techniques the SEs use as their de facto method of detecting cloaking, but competitors who want to report cloaked sites use a variety of methods. It pays to be familiar with all of them before cloaking a site.
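One detection method worth knowing, since it defeats User Agent checks entirely, is crawler verification by reverse/forward DNS: reverse-resolve the requesting IP, check the hostname belongs to the engine's domain, then forward-resolve that hostname and confirm it maps back to the same IP. This is a published technique for verifying Googlebot; the sketch below assumes Google's googlebot.com/google.com hostnames and is not an official implementation.

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Verify a claimed-Googlebot IP via a reverse/forward DNS round trip.

    A visitor that merely spoofs the Googlebot User Agent string will
    fail this check, because its IP won't reverse-resolve to a Google
    hostname that forward-resolves back to the same IP.
    """
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)   # reverse lookup
    except (socket.herror, socket.gaierror):
        return False
    # Assumed crawler domains, per Google's verification guidance:
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward lookup must map back to the original IP.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except (socket.herror, socket.gaierror):
        return False
```

A cloaking script keyed only on the User Agent string passes optimized pages to anyone who spoofs it, including reviewers; this kind of DNS check is one way both engines and competitors can tell the difference.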