Just heard the term "cloaking" and haven't researched it yet, but I'm wondering if my ignorance has been hurting traffic.

I have one site with over a million dynamic versions of the same page. It's so content-dense (and needs to be) that it loads slowly, so I disable display of some minor, processor-intensive content if the visitor is a bot. PageRank is lousy, but I assumed that was because each page has a narrow audience, plus the slow load times. Now I'm wondering if I'm being penalized for cloaking as well.

If so, and I convert that content to AJAX delivery, will the spiders index the content before or after the AJAX has loaded? (And by "spider" I mean GoogleBot...)

Also, does anyone have a feel for how long it takes, once cloaking is removed, for the change to be detected and the penalties to cease?
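For context, the check I'm doing is roughly like this (a simplified Python sketch; the actual site isn't Python, and the bot token list and function names here are just illustrative):

```python
# Illustrative sketch of user-agent-based bot detection.
# The token list is a made-up sample, not an exhaustive crawler list.
KNOWN_BOT_TOKENS = ("googlebot", "bingbot", "slurp", "duckduckbot")

def is_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string looks like a known crawler."""
    ua = user_agent.lower()
    return any(token in ua for token in KNOWN_BOT_TOKENS)

def should_render_heavy_content(user_agent: str) -> bool:
    # Skipping the processor-intensive widgets for crawlers means bots
    # and humans receive different page content -- which is what I now
    # suspect may count as cloaking.
    return not is_bot(user_agent)
```

So for a GoogleBot request, `should_render_heavy_content` comes back False and the heavy widgets are never rendered, while a normal browser gets the full page.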