Need an answer to the following question: Let's say I want to create a blog that aggregates information from RSS feeds on related topics. At the same time, I want this content to be indexed by search engines and possibly ranked in the future, since I will eventually add some of my own original posts. So the question is: is content duplication on blogs treated the same way as content duplication on regular websites? Any ideas?
SEs don't discriminate by type of website: duplicate content is duplicate content. Aggregating other people's feeds might fill some space for you, but original content is what the search engines and visitors want.
Blogs are no different from websites. They may come with built-in properties that search engines like, e.g. RSS and user-generated content, but if you understand what the SEs are looking for, you can implement the same things on the rest of your site.
As for duplication: I aggregate posts from people who love widgets, posting the first 50 words and a link back to their site. I rarely rank well for this content, but occasionally Google mistakes me for the originator and I rank higher than the original blog. It is a long shot as a site model, but go ahead and test it.
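If you want to try that approach, here is a minimal sketch of the excerpt-and-link-back idea, assuming Python with the feedparser library; the feed URL, the 50-word cutoff, and the HTML output format are placeholders for illustration, not a specific recommendation.

```python
# Sketch: pull a feed, keep the first 50 words of each post, link back.
# Assumes `pip install feedparser`; FEED_URL is a hypothetical example.
import re
import feedparser

FEED_URL = "https://example.com/widgets/feed"

def excerpt(html_text, max_words=50):
    """Strip HTML tags and return the first max_words words."""
    text = re.sub(r"<[^>]+>", "", html_text)
    return " ".join(text.split()[:max_words])

feed = feedparser.parse(FEED_URL)
for entry in feed.entries:
    snippet = excerpt(entry.get("summary", ""))
    # Always include a link back to the original post so the
    # source gets credit (and the SEs can see who wrote it first).
    print(f'<p>{snippet}&hellip; '
          f'<a href="{entry.link}">Read the full post</a></p>')
```

The link back is the important part of the pattern: it is what lets the search engines associate your excerpt page with the already-indexed originals.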
The SEs pay little attention to the page with the 25 most recent widget posts, but for real people it is great. Where it benefits me is on my original content: I put three aggregated posts after my own content, with links to their blogs, and the SEs can immediately understand what my page is about because of the links to already-indexed pages.
An article with one link in and no links out is worthless.
An article with one link in and several links to relevant pages is great and will in time lead to . . .
An article with several relevant links in and several relevant links out.