I manage a Yahoo store that has a feed to many third-party shopping and price comparison sites. The feed uses the exact same descriptions that are on my site. Could that be seen as duplicate content, and could something like this affect our long-tail traffic given the recent algorithm changes?
In many cases I think it could be, so we're adding unique text to our pages that won't get syndicated. As it stands, many of my pages aren't even indexed; only the syndicated copies of the products come up for specific searches.
Funny, I was just trying to figure this out, especially with posting our products on Amazon. How great would it be not to have to rewrite our content for Amazon? But then I came across this on the Google Webmaster site:
In most cases Google does a good job of handling this type of duplication. However, you may also want to consider content that's being duplicated across domains. In particular, deciding to build a site whose purpose inherently involves content duplication is something you should think twice about if your business model is going to rely on search traffic, unless you can add a lot of additional value for users. For example, we sometimes hear from Amazon.com affiliates who are having a hard time ranking for content that originates solely from Amazon. Is this because Google wants to stop them from trying to sell Everyone Poops? No; it's because how the heck are they going to outrank Amazon if they're providing the exact same listing? Amazon has a lot of online business authority (most likely more than a typical Amazon affiliate site does), and the average Google search user probably wants the original information on Amazon, unless the affiliate site has added a significant amount of additional value.
The point is well taken: if you have the exact same text as shopping.com or Amazon.com, you're going to be ranked below them. It's that simple.