About the only thing I can think of: if the descriptive text shown on the SERP doesn't match anything on the indexed page, the page may be cloaked. It's also possible the page was recently changed, or that the owner subscribes to some sort of XML direct feed (sanctioned cloaking).
Expanding on what volatilegx said, I look for a couple of things.
On Google: Compare the cached page (what Googlebot saw) with the actual page. Focus on the title and the snippet shown in the SERPs. If a page is NOT cached, be very suspicious and check it further on other SEs.
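The title/snippet comparison above can be roughly automated. A minimal sketch, stdlib only; the helper names (`page_title`, `snippet_coverage`) and the substring-matching heuristic are my own, not anything the engines provide:

```python
import re

def page_title(html):
    """Pull the <title> out of raw HTML (crude regex, good enough for a spot check)."""
    m = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return m.group(1).strip() if m else ""

def snippet_coverage(snippet, html):
    """Fraction of the SERP snippet's words that actually occur in the live page.
    A low value suggests the engine indexed something you aren't being shown."""
    text = re.sub(r"<[^>]+>", " ", html).lower()   # strip tags, keep visible text
    words = re.findall(r"[a-z0-9]+", snippet.lower())
    if not words:
        return 1.0
    hits = sum(1 for w in words if w in text)
    return hits / len(words)
```

You'd paste in the snippet from the results page and the source of the live page; anything well under 1.0 is worth a closer look (allowing for pages that simply changed since the crawl).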
On Ink and Fast: Compare the title and description in the SERPs to the actual page. Fast also has snippets worth checking.
Other tricks include visiting a site while presenting a search engine's user-agent string, but this only works against "weak" cloaking. It won't fool a cloak keyed to IP addresses, or to both IPs and user agents.
Unfortunately, cloaking done well is undetectable to all but the SEs. Fortunately, it's a lot of work and often it's not done very well.