Msg#: 4584970 posted 3:51 pm on Jun 17, 2013 (gmt 0)
Looking at incorporating some syndicated User Generated Content (UGC) through some third-party feeds alongside unique, well-written content. Obviously I want the unique content spidered and indexed, but I also don't want to run the risk of duplicate-content penalties for the syndicated material.
The intent here is to provide a better experience for the user with additional content, such as user reviews, Twitter posts, etc., along with the unique product descriptions.
I know several years ago Yahoo proposed a robots-nocontent class attribute, but Google never implemented it.
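For reference, Yahoo's version worked by adding a class to the wrapper around the content you wanted Slurp to ignore, something like:

```html
<!-- Yahoo-only: Slurp would exclude this block from indexing/ranking.
     Google and Bing never honored the class. -->
<div class="robots-nocontent">
  <p>Syndicated review text pulled from a third-party feed...</p>
</div>
```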
What are people doing in this situation? I don't want to use plain JS tricks or image-based content for this, since my reasons are legitimate and I don't want to be seen as cloaking.
The obvious option would be to just load everything third-party with AJAX, but given that Googlebot does read (execute?) JS, is that sufficient to ensure it isn't treated as duplicate content or doesn't appear illegitimate?
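To make the question concrete, here's a minimal sketch of that AJAX approach. The endpoint (`/api/reviews`), container id, and data shape are all hypothetical, and as noted above there's no guarantee this keeps the content out of Google's index if Googlebot executes the script:

```javascript
// Pure helper: turn review objects into an HTML fragment.
// Kept separate from the fetch so it can be tested without a browser.
function renderReviews(reviews) {
  return reviews
    .map(function (r) {
      return '<blockquote class="review">' + r.text +
             ' - ' + r.author + '</blockquote>';
    })
    .join('\n');
}

// Browser-side loader: pull the syndicated feed after page load and
// inject it, so it is absent from the initial HTML the crawler fetches.
function loadSyndicatedContent(containerId) {
  fetch('/api/reviews')  // hypothetical proxy for the third-party feed
    .then(function (res) { return res.json(); })
    .then(function (reviews) {
      document.getElementById(containerId).innerHTML =
        renderReviews(reviews);
    });
}
```

The unique product descriptions would stay in the server-rendered HTML; only the syndicated reviews/tweets arrive via the loader.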