Let's assume the industry concerned is not that competitive... so there won't be people trying to analyse high-ranking sites. What is the risk if user-agent-based cloaking is used in this case? Can the SEs easily find out that cloaking is being used?
Any SE can penetrate any cloak by simply surfing to a site from an "unknown IP" — in other words, masquerading as a typical surfer — and comparing the site's code to the code their spider retrieved. That's a resource-hungry task, and I haven't heard of any SE doing it routinely for a long time. I believe blatant "bad cloaking" (where a cloaked page ranks high for Disney.com but delivers the kiddies to a site designed for those over 21 years of age) still earns a ban pretty quickly at most SEs, due to user complaints.
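The spot-check described above boils down to: fetch the page twice, once as the spider saw it and once as a typical surfer would, and flag it if the two versions differ substantially. A minimal sketch of the comparison step, with made-up HTML and an arbitrary similarity threshold (real pages vary a little on every request, so only a large mismatch would be suspicious):

```python
import difflib

def looks_cloaked(spider_html: str, surfer_html: str,
                  threshold: float = 0.9) -> bool:
    """Flag the page if the spider's copy and the surfer's copy
    differ substantially. Dynamic pages differ slightly on every
    request, so only a large mismatch is treated as suspicious."""
    ratio = difflib.SequenceMatcher(None, spider_html, surfer_html).ratio()
    return ratio < threshold

# Canned example responses (no network needed):
spider_view = "<html><body>keyword keyword keyword</body></html>"
surfer_view = "<html><body><img src='splash.gif'></body></html>"
print(looks_cloaked(spider_view, surfer_view))  # True: versions diverge
```

In practice the hard part isn't the diff, it's doing the surfer-side fetch from an IP the site doesn't recognize — which is exactly why it's resource-hungry to run routinely.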
IMHO, if you're going to the effort of cloaking (it is an effort!), use IP cloaking and keep curious competitors out of your code. There is at least one free script available, probably more.
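The difference between the two approaches, roughly: UA cloaking trusts what the visitor claims to be, while IP cloaking keys off where the request comes from, which a competitor can't spoof from a browser. A minimal sketch of the IP side — the network ranges and filenames here are illustrative only, and maintaining an up-to-date spider IP list is most of the real work:

```python
import ipaddress

# Hypothetical spider ranges for illustration; real cloaking scripts
# ship with (and constantly update) lists like these.
SPIDER_NETWORKS = [
    ipaddress.ip_network("66.249.64.0/19"),
    ipaddress.ip_network("157.55.0.0/16"),
]

def page_for(client_ip: str) -> str:
    """Serve the optimized page to known spider IPs and the normal
    page to everyone else, so a curious competitor surfing in with
    a browser never sees the optimized code."""
    ip = ipaddress.ip_address(client_ip)
    if any(ip in net for net in SPIDER_NETWORKS):
        return "optimized.html"
    return "normal.html"

print(page_for("66.249.66.1"))   # optimized.html (inside a listed range)
print(page_for("203.0.113.9"))   # normal.html
```

Note this still can't beat the "unknown IP" spot check described earlier: a spider crawling from an address that isn't on the list gets the normal page, and the mismatch is visible.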
For the past year or so, I've had better results using plain old on-page optimization, and I've discontinued cloaking. The SEs have evolved and become better at identifying sites relevant to a search.
More personal opinion, in a non-competitive category, cloaking just isn't needed unless your site is SE hostile. If it is, and you're in it for the long term, your efforts would be better invested in rebuilding the site.
I have dynamically generated meta tags from posts on my forums (keywords are generated from words in each post), and to keep pages loading quickly, I only have it work for non-Mozilla browsers, so most people won't get it...
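The setup described above might look something like this sketch — the helper names and the keyword-picking rule are my own assumptions, and the only "cloak" is a check on whether the User-Agent string starts with "Mozilla" (historically, most browsers did and most spiders didn't):

```python
import re
from collections import Counter

def keywords_from_post(post: str, n: int = 5) -> str:
    """Pick the most frequent words of 4+ letters from a forum post
    as meta keywords (an assumed rule, for illustration)."""
    words = re.findall(r"[a-z]{4,}", post.lower())
    return ", ".join(word for word, _ in Counter(words).most_common(n))

def meta_block(user_agent: str, post: str) -> str:
    """Only non-Mozilla agents get the generated tags; ordinary
    browsers get nothing extra, keeping their pages lighter."""
    if user_agent.startswith("Mozilla"):
        return ""
    return f'<meta name="keywords" content="{keywords_from_post(post)}">'

post = "cloaking cloaking discussion about cloaking and meta tags"
print(meta_block("Mozilla/4.0", post))   # empty: browsers skip the tags
print(meta_block("Scooter/3.2", post))   # spider gets the meta tag
```

One caveat with keying on "Mozilla": by this era plenty of spiders sent Mozilla-compatible UA strings, so this check under- and over-matches in both directions.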
However, what good does cloaking your meta tags do anyway? Meta tags don't really do all that much for your rankings...