Cava: I'm probably the wrong guy to ask, but I'll tell you what I _think_ I know ..
Let's say there's a web page you want to present to the public, but one you
suspect Google/Yahoo won't like. Let's say Artichoke Porn.
A cloaker generates another page that the engines might like. Salad recipes?
He wants regular visitors to see the Arti-porn page, but Google to see only the nice page.
There may be other ways to do this, but I presume the cloaker relies on his .htaccess file
and/or Apache's so-called mod_rewrite rules.
I'm no .htaccess guru, but in effect the code says something like
rewrite engine on (turns on the filtering mechanism)
IF: user_agent = (Googlebot or Yahoo_Slurp or .. )
THEN: redirect (spider) from Sicko Artichokes to the salad page.
ELSE: Give them the dirty artichokes.
Properly written, I think that last line is redundant. It happens by default.
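In actual .htaccess terms, I'd guess it looks something like the sketch below. This is hypothetical, not something I've deployed: the filenames (artichokes.html, salad.html) are made up, and the user-agent patterns are just the well-known bot names.

```apache
# Hypothetical cloaking sketch -- don't actually do this.
RewriteEngine On
# IF the visitor's user agent looks like a search-engine spider ...
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Slurp) [NC]
# THEN serve the innocent salad page in place of the artichoke page.
RewriteRule ^artichokes\.html$ /salad.html [L]
# No ELSE needed: everyone who didn't match the condition
# falls through and gets artichokes.html as usual.
```

Note there's no ELSE branch, which is my point above about that last line being redundant.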
The real and substitute pages might not be very different at all.
One could simply be stuffed with keywords or other SEO tricks.
NOW! Before you even think of trying any of this, you better _really_ know what you're doing.
Cloaking is deceptive by nature, so it raises grave suspicions whenever it's detected.
And it would be child's play for G or Y to find cloaked pages: just fetch the page
a second time with an ordinary browser user agent and compare the two versions.
I'm sure there are good and legitimate uses for cloaking, but nevertheless.
Just the appearance of black-hat stuff can get a page or a site banned or delisted.
I never cloak pages. Risks far outweigh benefits for a site like mine. -Larry