Outlined below is the procedure we plan to implement during the beta launch of our new public web site. Are we at risk of being seen as cloaking by Googlebot?
 If we detect a "SITE" cookie, we'll 302 redirect users to whatever site they got the cookie from.
 If we detect a "bizStateId" cookie (i.e. they've used the old shopping cart), they fall through to the old site.
 We attempt to keep robots off the new site by checking whether the user agent string contains "MSIE 6.0", "MSIE 7.0", "Firefox/", "Netscape/8", "Safari/" or "Chrome/". If it doesn't, the request falls through to the old site.
 If a user makes it this far, we randomly select between the old site and the new site and assign a "SITE" cookie. (A sketch of this gating logic follows below.)
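For concreteness, here is a minimal sketch of the gating logic above as entry-point middleware, assuming an Express/TypeScript stack (an assumption; our actual platform may differ). The hostnames, the "new"/"old" cookie values, and the even split are placeholders for illustration; only the "SITE" and "bizStateId" cookie names and the user-agent tokens come from the plan itself. I've also modeled "falls through to the old site" as a 302 to an old-site host, though in practice it could be served internally without a redirect.

```typescript
import express, { Request, Response } from "express";
import cookieParser from "cookie-parser";

// Placeholder hosts -- not our real URLs. This middleware is the
// entry-point dispatcher; the old and new sites run separately.
const OLD_SITE = "https://old.example.com";
const NEW_SITE = "https://beta.example.com";

// Browser tokens from the list above; anything else is treated as a robot.
const BROWSER_TOKENS = [
  "MSIE 6.0", "MSIE 7.0", "Firefox/", "Netscape/8", "Safari/", "Chrome/",
];

const app = express();
app.use(cookieParser());

app.use((req: Request, res: Response) => {
  const url = req.originalUrl;

  // 1. Sticky assignment: honor an existing SITE cookie with a 302.
  if (req.cookies["SITE"] === "new") return res.redirect(302, NEW_SITE + url);
  if (req.cookies["SITE"] === "old") return res.redirect(302, OLD_SITE + url);

  // 2. Old shopping-cart users stay on the old site.
  if (req.cookies["bizStateId"]) return res.redirect(302, OLD_SITE + url);

  // 3. User agents that don't look like a browser fall through to the old site.
  const ua = req.get("User-Agent") ?? "";
  if (!BROWSER_TOKENS.some((token) => ua.includes(token))) {
    return res.redirect(302, OLD_SITE + url);
  }

  // 4. Otherwise, pick a site at random and pin the choice with a SITE cookie.
  const choice = Math.random() < 0.5 ? "new" : "old";
  res.cookie("SITE", choice);
  return res.redirect(302, (choice === "new" ? NEW_SITE : OLD_SITE) + url);
});

app.listen(8080);
```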
If the user gets to the new site, we check whether the requested URL is in our map of old-to-new URLs.
If it is, we 302 redirect the user to the new URL; if not, we process the URL as normal. (A sketch of this lookup follows below.)
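And a matching sketch of the new-site lookup, again assuming TypeScript/Express; the map entries here are made-up placeholders, not our real URLs:

```typescript
import express, { Request, Response, NextFunction } from "express";

// Placeholder entries -- the real map is our full old-to-new URL mapping.
const oldToNew: Record<string, string> = {
  "/old/widgets.asp?id=42": "/widgets/42",
  "/old/about.html": "/about",
};

const newSite = express();

newSite.use((req: Request, res: Response, next: NextFunction) => {
  const mapped = oldToNew[req.originalUrl];
  if (mapped) return res.redirect(302, mapped); // known old URL: send to its new home
  next(); // unknown URL: process as normal
});
```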
Thank you in advance for any help you can give me.