Just heard the term cloaking and haven't researched it yet, but I'm wondering if my ignorance has been hurting traffic. I have one site with over a million dynamic versions of the same page that is so content dense (and needs to be) that it loads slowly. So I disable display of some minor (processor-intensive) content if the visitor is a bot. Page rank is lousy, but I assumed that was because each page has a narrow audience and the load times are slow. Now I'm wondering if I'm being penalized for cloaking as well. If so, and I convert that content to AJAX delivery, will the spiders index the content before or after the AJAX is loaded? (And by "spider" I mean GoogleBot...) Does anyone have a feel for how long it takes after cloaking is removed for the change to be detected and the penalties to cease?
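For context, the bot check is roughly the pattern sketched below. This is only a minimal illustration assuming a Python/Flask handler; the framework, route, and helper names are assumptions, not the site's actual code. The point is simply that crawlers get a stripped-down page while humans get the full one.

```python
# Sketch of user-agent-based conditional rendering (assumed Flask stack).
# Serving crawlers a reduced page while visitors get the full page is the
# pattern that can be read as cloaking.
from flask import Flask, request

app = Flask(__name__)

BOT_SIGNATURES = ("googlebot", "bingbot", "slurp")


def is_bot(user_agent: str) -> bool:
    # Crude user-agent sniffing; genuine Googlebot can also be verified
    # with a reverse-DNS lookup, but that's out of scope here.
    ua = (user_agent or "").lower()
    return any(sig in ua for sig in BOT_SIGNATURES)


def build_heavy_widgets(item_id: str) -> str:
    # Placeholder for the processor-intensive content hidden from bots.
    return "<div>expensive related-items widget</div>"


@app.route("/item/<item_id>")
def item_page(item_id: str):
    ua = request.headers.get("User-Agent", "")
    heavy = "" if is_bot(ua) else build_heavy_widgets(item_id)
    return f"<html><body><h1>Item {item_id}</h1>{heavy}</body></html>"
```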
Your thinking may be correct. Cloaking is not good in Google's eyes, and slow-loading sites are also less attractive given Google's emphasis on speed.
Can you remove the special handling for Googlebot and test it?
Change one thing and wait. If you change more than one thing at once, you won't know which change made the difference.
Find out how often Googlebot comes around. If it visits frequently, you may not have to wait long before the pages are re-crawled, but it can still take longer for them to reappear, or to re-rank, in the index. Until you know exactly what the problem is, it's hard to say. Sometimes sites really struggle to come back.
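If it helps, one rough way to see how often Googlebot comes around is to count its hits per day in your access log. A minimal sketch, assuming a standard combined-format log; the log path is an assumption, so adjust it for your server:

```python
# Count Googlebot requests per day from a combined-format access log.
import re
from collections import Counter
from datetime import datetime

LOG_PATH = "/var/log/apache2/access.log"  # assumed location; adjust as needed

# Combined log format puts the request date inside [day/month/year:time ...].
date_re = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

hits_per_day = Counter()
with open(LOG_PATH, encoding="utf-8", errors="replace") as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = date_re.search(line)
        if match:
            hits_per_day[match.group(1)] += 1

# Print counts in chronological order.
for day in sorted(hits_per_day, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
    print(f"{day}: {hits_per_day[day]} Googlebot requests")
```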