What type of cloaking are we referring to here?
I mean, if I'm doing IP-based delivery and controlling what the bots see while they crawl my site, I would think I'm helping the search engines, and that they would actually appreciate the effort I'm taking to keep certain things out of their indices.
For example, I might have a page with all sorts of filter controls for the user to sort, display, etc. Googlebot is so good at crawling that it will grab every one of those filtered URIs. I sure don't want that to happen, so I'm going to serve Googlebot a version of the page minus the filters. Is there anything wrong with that?
Or maybe the page is heavy with <iframe> elements and other technologies that bots handle poorly. I may not want to serve that to a bot either. Is there anything wrong with that?
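To make the idea concrete, here's a rough sketch of the kind of bot-aware delivery I'm talking about. Everything here is illustrative, not a real setup: the bot signature list is incomplete, and a serious implementation would verify crawlers by IP range or reverse DNS rather than trusting the user-agent string alone.

```python
# Hypothetical sketch of bot-aware page delivery.
# Bot signatures and page bodies are made up for illustration.

KNOWN_BOT_SIGNATURES = ("Googlebot", "Slurp", "msnbot")

def is_known_bot(user_agent: str) -> bool:
    """Crude user-agent check; real IP-based delivery would also
    confirm the request came from the engine's published IP ranges."""
    return any(sig in user_agent for sig in KNOWN_BOT_SIGNATURES)

def render_page(user_agent: str) -> str:
    if is_known_bot(user_agent):
        # Bots get the core content only: no sort/filter links,
        # no iframes, nothing that generates junk URIs in the index.
        return "<html><body>Core content only</body></html>"
    # Human visitors get the full interactive page.
    return "<html><body>Core content plus filter/sort controls</body></html>"
```

The content served to both audiences is the same; only the navigational chrome that spawns duplicate filtered URIs is stripped for the bots.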
I believe this forum will become more active as we progress through this year and into 2008. The technical side of promoting larger-scale sites is exactly what this forum is all about. :)