Dear all, Outlined below is the procedure we plan to implement during the beta launch of our new public web site. Are we at risk of being seen as cloaking by Googlebot?

1. If we detect a "SITE" cookie, we 302 redirect the user to whichever site they got the cookie from.
2. If we detect a "bizStateId" cookie (i.e. they've used the old shopping cart), they fall through to the old site.
3. We try to keep robots out of the new site by checking whether the user agent string contains "MSIE 6.0", "MSIE 7.0", "Firefox/", "Netscape/8", "Safari/" or "Chrome/". If it doesn't, they fall through to the old site.
4. If a user makes it this far, we randomly select between the old site and the new site and assign a "SITE" cookie.
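To make sure I've described the decision order clearly, here's a rough sketch of that routing logic in Python. All the names (`choose_site`, `ALLOWED_UA_TOKENS`, the `"old"`/`"new"` labels) are mine, not from our actual code; it's just the flow above expressed as a pure function.

```python
import random

# Browsers allowed into the beta; anything else (including bots whose
# user agents lack these tokens) falls through to the old site.
ALLOWED_UA_TOKENS = ("MSIE 6.0", "MSIE 7.0", "Firefox/",
                     "Netscape/8", "Safari/", "Chrome/")

def choose_site(cookies, user_agent):
    """Return ('old' or 'new', cookie_to_set or None) for one request."""
    # 1. Sticky assignment: honor an existing SITE cookie.
    if "SITE" in cookies:
        return cookies["SITE"], None
    # 2. Shoppers with an old-cart session stay on the old site.
    if "bizStateId" in cookies:
        return "old", None
    # 3. Unknown user agents (robots included) get the old site.
    if not any(tok in user_agent for tok in ALLOWED_UA_TOKENS):
        return "old", None
    # 4. Otherwise split traffic randomly and make the choice sticky.
    site = random.choice(["old", "new"])
    return site, ("SITE", site)
```

The point of returning the cookie to set, rather than setting it inside the function, is just to keep the sketch testable.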
If the user gets to the new site, we check whether the requested URL is in our map of old-to-new URLs. If it is, we 302 redirect the user to the new URL. If not, we process the URL as normal.
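In the same spirit, the URL-mapping step on the new site looks roughly like this. The map entry and the `handle_new_site` name are hypothetical placeholders, not our real URLs:

```python
# Hypothetical old -> new URL map; real entries would come from the site.
URL_MAP = {"/shop/widgets.asp": "/products/widgets"}

def handle_new_site(path):
    """Return a (status, location) pair for a request hitting the new site."""
    if path in URL_MAP:
        # 302 = temporary redirect, which fits a beta; a permanent
        # cut-over would normally use 301 instead.
        return 302, URL_MAP[path]
    # Not a mapped legacy URL: serve it normally.
    return 200, path
```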
Thank you in advance for any help you can give me.
<my 2 cents> I think you have a valid reason for what you're doing. Maybe, to help protect against harm from a human review, put "beta" somewhere on the site and link it to a page describing your intentions and method. If you shed light on it, you should be OK. Heck, Google sponsors cloaking. [google.com] When I asked Senior Cutts about it he kind of smiled and gave me the canned answer of "As long as the material being presented is in the spirit of the underlying page, you're ok." </my 2 cents>
HOWEVER--- if you get smacked with a penalty during your beta test, it'll be a total pain to beg for forgiveness.