I am new to this forum, so I hope this post isn't a duplicate :)
I have been using the SEO cloaking technique for a couple of years now (more than 500 websites UA cloaked). These sites are running well, but I've noticed Google has made some changes (more sites are getting blacklisted), so I have to register more domains to repair the damage.
Some people say that UA cloaking isn't a good way to cloak, but it's too much work to edit more than 500 websites, so I hope you people have some good tips for me.
IP cloaking is preferred to User Agent cloaking because it is far more secure. It's extremely easy to crack User Agent cloaking, because it is a simple matter to fake your User Agent and visit a cloaked page masquerading as Googlebot.
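To see just how simple that is, here is a minimal Python sketch (the URL is only a placeholder): setting the User-Agent header to Googlebot's string is all it takes to view whatever a UA-cloaked page serves to the "spider".

```python
import urllib.request

# Hypothetical target; any UA-cloaked page would do.
url = "http://example.com/cloaked-page"

# Pretend to be Googlebot simply by sending its User-Agent string.
req = urllib.request.Request(
    url,
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.read()[:200])  # the page exactly as the fake "spider" sees it
```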
Changing 500 sites to IP cloaking would be a tall order, though.
There are other environment variables besides REMOTE_ADDR and HTTP_USER_AGENT that can be used for cloaking purposes, such as HTTP_REFERER, REMOTE_HOST, HTTP_ACCEPT_LANGUAGE, HTTP_COOKIE, and many others. Pay attention to what connections from spiders "look like" as a whole, not just what their IP is.
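Roughly what I mean, as a Python sketch (assuming a CGI-style setup where these variables arrive via os.environ; under another server model you would read the request environ instead):

```python
import os

# Variables worth examining beyond REMOTE_ADDR / HTTP_USER_AGENT.
INTERESTING = [
    "REMOTE_ADDR",
    "HTTP_USER_AGENT",
    "HTTP_REFERER",
    "REMOTE_HOST",
    "HTTP_ACCEPT_LANGUAGE",
    "HTTP_COOKIE",
]

def request_fingerprint():
    """Collect what this connection 'looks like' as a whole (CGI environ assumed)."""
    return {name: os.environ.get(name, "") for name in INTERESTING}

# Log these for known spider visits and compare them with ordinary browser
# traffic -- spiders tend to send no cookies, for example, unlike real users.
print(request_fingerprint())
```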
While upside is correct, if you are at all serious about cloaking, you should cloak by identifying the IP address of the visitor as belonging to a search engine. Every other indicator (save possibly REMOTE_HOST) can be spoofed.
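A minimal sketch of that IP-side check, using the verification procedure Google documents publicly (reverse DNS on the visiting address, then a forward lookup to confirm); the sample IP at the end is illustrative only:

```python
import socket

def is_verified_googlebot(ip: str) -> bool:
    """Verify that an IP really belongs to Googlebot:
    reverse-resolve the address, check the hostname is under
    googlebot.com or google.com, then forward-resolve that hostname
    and confirm it maps back to the same IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)            # reverse DNS lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        forward_ips = socket.gethostbyname_ex(host)[2]   # forward confirmation
        return ip in forward_ips
    except (socket.herror, socket.gaierror):
        return False

# Illustrative address; swap in the visitor's REMOTE_ADDR in practice.
print(is_verified_googlebot("66.249.66.1"))
```

The forward-confirmation step matters because the reverse lookup alone can be faked by whoever controls the PTR record for the visiting address.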