I think the problem is harder than it seems. Even if Googlebot was as "intelligent" as NS and IE, and it understood all the consequences of the JS and CSS, it's never going to be able to make a judgement on "good or bad" (if they could do that they'd have some fantastic AI there). For example, say you put some text and links etc. in a layer that is hidden, but becomes visible on rollover (for example help text or navigation). How is that different, from a machine's point of view, from a layer full of keywords and links that is hidden and made visible by rolling over some transparent 1px gif?
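To make that concrete, here's a rough sketch of the two cases (the ids, filenames and text are invented, just to illustrate) — the markup and script are essentially identical, only the intent differs:

    <!-- Legitimate: help text revealed when the user rolls over the link -->
    <a href="/help"
       onmouseover="document.getElementById('helpTip').style.visibility='visible'"
       onmouseout="document.getElementById('helpTip').style.visibility='hidden'">Help</a>
    <div id="helpTip" style="visibility:hidden">
      Click the Help link for instructions on filling in the order form.
    </div>

    <!-- Spam: keyword/link stuffing revealed by rolling over a transparent 1px gif -->
    <img src="clear.gif" width="1" height="1" alt=""
         onmouseover="document.getElementById('stuffed').style.visibility='visible'"
         onmouseout="document.getElementById('stuffed').style.visibility='hidden'">
    <div id="stuffed" style="visibility:hidden">
      cheap widgets best widgets <a href="buy-widgets.html">buy widgets</a> ...
    </div>

A crawler parsing either one just sees a hidden div, an event handler and some links; nothing in the code itself says which is help text and which is spam.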
Only humans can make that decision, and the only thing Google can do is provide an easy interface for humans to check. For example, I often look at Google's cache to see how it differs from the page I can see. If it would tell me how many keywords/links it thinks are in the page, and then highlight them (as it does currently), then you could compare the two.