
Forum Moderators: goodroi


Pages with Partially Syndicated Content

No indexing of page elements

3:51 pm on Jun 17, 2013 (gmt 0)

Junior Member

10+ Year Member

joined:Jan 26, 2004
posts: 63
votes: 0

Looking at incorporating some syndicated User Generated Content (UGC) through third-party feeds, together with unique and well-written content. Obviously I want the unique content spidered and indexed, but I also don't want to run the risk of a duplicate content penalty for the syndicated content.

The intent here is to provide a better experience for the user with additional content, such as user reviews, Twitter posts, etc., along with the unique product descriptions.

I know several years ago Yahoo proposed a robots-nocontent class attribute, but Google never implemented it.
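For reference, Yahoo's mechanism was a class value placed on the wrapper element of the content its crawler should ignore, roughly like this (only Yahoo's crawler ever honored it):

```html
<!-- Unique content: indexed normally -->
<p>Our unique product description...</p>

<!-- Syndicated feed: Yahoo's crawler would skip this block -->
<div class="robots-nocontent">
  ...third-party reviews and tweets here...
</div>
```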

What are people doing in this situation? I don't want to use plain JS or image-based content for this; the reasons are legitimate and I don't want to be seen as cloaking.

The obvious option would be to just load everything third-party via AJAX, but given that Googlebot does read (and execute?) JS, is that sufficient to ensure it isn't treated as duplicate content or seen as illegitimate?
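The AJAX approach would look something like the sketch below: the unique copy ships in the initial HTML, and the syndicated reviews are fetched client-side after load. The endpoint, element id, and review fields are all hypothetical placeholders.

```javascript
// Minimal sketch of loading syndicated UGC client-side.
// "/api/reviews.json" and "ugc-reviews" are hypothetical names.

// Escape untrusted feed text before injecting it into the page.
function escapeHtml(s) {
  return s.replace(/[&<>"']/g, c => ({
    "&": "&amp;", "<": "&lt;", ">": "&gt;", '"': "&quot;", "'": "&#39;"
  }[c]));
}

// Turn an array of review objects into list-item markup.
function renderReviews(reviews) {
  return reviews
    .map(r => `<li>${escapeHtml(r.author)}: ${escapeHtml(r.text)}</li>`)
    .join("");
}

// In the browser, run after the unique content has rendered:
// window.addEventListener("DOMContentLoaded", () => {
//   fetch("/api/reviews.json")           // hypothetical endpoint
//     .then(res => res.json())
//     .then(reviews => {
//       document.getElementById("ugc-reviews").innerHTML =
//         `<ul>${renderReviews(reviews)}</ul>`;
//     });
// });
```

Note that because Googlebot can execute JS, AJAX-loaded content may still be rendered and indexed, so this alone is not a guaranteed way to keep the feed out of the index.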
4:59 pm on June 22, 2013 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
votes: 191

Perhaps you can load the UGC into a separate page and frame that page within the page you want to show it on. Then you can either noindex the framed URL or block it via robots.txt.
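A sketch of that setup, with hypothetical file names:

```html
<!-- Product page (indexed): unique copy stays inline -->
<p>Unique product description...</p>
<iframe src="/ugc/reviews.html" title="User reviews"></iframe>

<!-- /ugc/reviews.html (the framed page): add a noindex meta tag -->
<meta name="robots" content="noindex">

<!-- ...OR block the framed URL in robots.txt instead:
User-agent: *
Disallow: /ugc/
-->
```

Note that the two are alternatives, not complements: if robots.txt blocks the URL, Googlebot never fetches the framed page and so never sees the noindex tag. Pick one mechanism.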