First off, I'm not "detecting" Googlebot and deciding to serve different content. That would be blatant cloaking, I get that.
Let me spin what you said around...
Cloaking would be when humans see content that Googlebot doesn't (or vice versa).
I guess a better question to ask is "does Googlebot see my AJAX'ed post-loaded content?"
If Googlebot does see it, then it's certainly not cloaking since the content is identical.
If Googlebot does not see it, then what?
Is it Googlebot's fault it doesn't see the post-loaded content, or is it my fault because I haven't implemented an "HTML Snapshot" mechanism per the AJAX crawling guide on code.google.com?
Granted, the guide I referred to above warns against using the AJAX Crawlable methodology to cloak (intentionally serving different content to different users). But I'm wondering whether, if Googlebot can't see the post-loaded content, a Manual Review could treat it as two different versions of the content - humans getting 4/4 of the content while Googlebot sees only 3/4.
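For anyone unfamiliar with the "HTML Snapshot" scheme: under Google's (now-deprecated) AJAX crawling proposal, a crawlable AJAX URL uses a hash-bang (`#!`) fragment, and the crawler re-requests it with the fragment moved into an `_escaped_fragment_` query parameter; the server answers that variant with a pre-rendered snapshot. Here's a minimal sketch of that URL mapping - the function name is mine, and this is just an illustration of the convention, not Google's code:

```python
from urllib.parse import quote, urlsplit

def escaped_fragment_url(url):
    """Map a hash-bang (#!) URL to the _escaped_fragment_ form
    that the crawler requests under the AJAX crawling scheme."""
    parts = urlsplit(url)
    if not parts.fragment.startswith("!"):
        return url  # not an AJAX-crawlable URL; left unchanged
    # Drop the leading "!" and percent-escape the fragment value
    fragment = quote(parts.fragment[1:], safe="")
    base = url.split("#", 1)[0]
    sep = "&" if parts.query else "?"
    return f"{base}{sep}_escaped_fragment_={fragment}"

# e.g. http://example.com/page#!key=value
# becomes http://example.com/page?_escaped_fragment_=key%3Dvalue
```

The server is then supposed to recognize requests containing `_escaped_fragment_` and return the fully rendered HTML that the AJAX version would eventually show - identical content, just pre-rendered, which is exactly why the scheme isn't considered cloaking when used as intended.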
I hope I'm making myself clear. I apologize if I'm not articulating well, but I'm certainly not referring to blatant user-agent-detection cloaking.