We operate a large e-commerce site with about 2,500 SKUs. We found that most search engines were not picking up anything after the ? in the URL, which contains the product ID, e.g. ?product=123456.
To alleviate this problem, we now serve pages differently based on the user agent. If the user agent is identified as a search engine (Googlebot, etc.), then the page is served as
OURDOMAIN.com/Products/Widget.cfm versus OURDOMAIN.com/ProductPage.cfm?product=123456
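The user-agent switch described above can be sketched roughly like this (the function name, crawler tokens, and URL patterns are illustrative assumptions, not the poster's actual ColdFusion code). Note that this is exactly the pattern search engines classify as cloaking, even when both URLs serve identical content:

```python
# Hypothetical sketch of serving a different URL to known crawlers.
# Crawler substrings below are assumed examples, not an exhaustive list.
KNOWN_BOTS = ("googlebot", "slurp", "bingbot")

def product_url(user_agent: str, product_id: int, name: str) -> str:
    """Return which URL variant to serve for this request."""
    ua = user_agent.lower()
    if any(bot in ua for bot in KNOWN_BOTS):
        # Crawler: serve the crawlable static-looking path,
        # e.g. /Products/Widget.cfm
        return f"/Products/{name}.cfm"
    # Regular visitor: serve the original query-string URL
    return f"/ProductPage.cfm?product={product_id}"
```

For example, a Googlebot user-agent string would get `/Products/Widget.cfm`, while a normal browser string would get `/ProductPage.cfm?product=123456`.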
This has allowed us to get indexed, but we now have two questions:
1) Is this considered "bad" by Google, even though the content on both pages is EXACTLY the same?
2) These new pages all have a PR of 0 since nobody links to them, while the old non-indexable page has a PageRank of 6... so now that we are indexed, our results never come up!
Inbound links from outside websites to your product pages certainly won't hurt, but they aren't necessary as long as there are links to the various product pages from a page Google already has in its index.
This thread reads like Gooooooogle has the manpower to human-review every page or situation on the internet. In theory it may be safe since the content is the same, but Google is looking for certain tricks of the trade to ban sites, not for what you intended.
So your intentions may be honorable, but Google's non-human, automated process may not see it that way.
This is a question I would take to the Google forum and present to GoogleGuy, or at least put it over there for him to have a shot at.
Cloaking is cloaking, and Googlebot isn't making good-vs-bad judgment calls...