tedster - 4:14 pm on Apr 16, 2011 (gmt 0)
This has been one of the common complaints about the SERPs - that the scraper site problem got worse instead of better after Panda. Google claims that the update just before Panda was aimed directly at the problem, and then they mentioned copied content again in the interviews about Panda.
It's hard to reconcile Google's statements with the experience of many webmasters. One defensive tactic some sites are trying is delaying their RSS feed until Googlebot has spidered the new content first.
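A minimal sketch of that delay tactic, assuming a hypothetical `posts` list where each entry records whether Googlebot has already requested its URL (e.g. detected in the server access logs by user-agent), plus a fallback holding period so items are never withheld indefinitely:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical post records: each has a publish time and a flag that the
# site sets once Googlebot is first seen requesting the post's URL
# (e.g. by matching the Googlebot user-agent in the access logs).
posts = [
    {"url": "/a", "published": datetime(2011, 4, 15, tzinfo=timezone.utc),
     "googlebot_crawled": True},
    {"url": "/b", "published": datetime(2011, 4, 16, tzinfo=timezone.utc),
     "googlebot_crawled": False},
]

def feed_items(posts, now=None, max_hold=timedelta(hours=24)):
    """Include a post in the RSS feed only once Googlebot has crawled it,
    or after max_hold has elapsed as a safety valve."""
    now = now or datetime.now(timezone.utc)
    return [
        p for p in posts
        if p["googlebot_crawled"] or now - p["published"] >= max_hold
    ]
```

The idea is simply that the original site gets indexed before scrapers can pull the full text from the feed; the `max_hold` fallback is an assumption here, not something the thread specifies.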