If I have a page that serves up random content, it is obviously not for SEO purposes. But if Googlebot comes along and gets different content than a browser does (due to the randomization), can Google tell the difference between that and cloaking?
Remember that the objective of a search engine is to return a list of websites that "should" provide information relevant to a particular query.
One of the factors that goes into determining how frequently your site is re-indexed is how frequently it changes (the spider can determine this over time), and then a decision has to be made as to whether your content remains valid for a particular query.
One of those factors could well be PageRank: a site with high PageRank that changes frequently is far less likely to be penalized for changing frequently than a site with little or no PageRank.
So it's a resounding "it depends", I'm afraid.
BBC News, for example, changes several times a day, but it is a high-PR site and is not penalized in any way.
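The practical distinction is that cloaking keys the response off *who* is asking (User-Agent or IP), while randomization treats every requester the same. Here is a minimal sketch of that difference, assuming Flask purely for illustration; the route names and the snippet pool are hypothetical:

```python
import random
from flask import Flask, request

app = Flask(__name__)

SNIPPETS = ["fact one", "fact two", "fact three"]  # hypothetical content pool

@app.route("/randomized")
def randomized():
    # Randomization: every visitor, Googlebot included, draws from the
    # same pool with the same odds. Content differs between requests,
    # but never based on who is asking.
    return random.choice(SNIPPETS)

@app.route("/cloaked")
def cloaked():
    # Cloaking: the response branches on the requester's identity.
    # This is the pattern search engines penalize.
    if "Googlebot" in request.headers.get("User-Agent", ""):
        return "special page shown only to the crawler"
    return random.choice(SNIPPETS)

if __name__ == "__main__":
    app.run()
```

As long as your code looks like the first handler and never inspects the requester before choosing content, the variation Googlebot sees is the same variation any user sees, which is randomization rather than cloaking.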