Does anyone have any experience handling blogs and duplicate content? Blog software automatically creates duplicate copies of posts and archives them in several different ways. Would the best way to handle this problem be with the robots.txt file?
Does anyone else have suggestions on how to maintain a blog that is Google-friendly?
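If you do go the robots.txt route, something like the following is a common approach. This is only a sketch: the paths below assume WordPress-style archive URLs (category, tag, author, and date archives), so you'd need to swap in whatever paths your blog software actually generates. Note also that disallowing a path blocks crawling, not indexing, and wildcard support varies by crawler.

```text
# Hypothetical robots.txt — the Disallow paths assume
# WordPress-style archive URLs; adjust to your own blog's structure.
User-agent: *
Disallow: /category/
Disallow: /tag/
Disallow: /author/

Sitemap: https://example.com/sitemap.xml
```

An alternative many people prefer is leaving robots.txt alone and adding a noindex meta tag to the archive templates instead, since that keeps the pages crawlable while removing them from the index.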
I would compare whole pages to see whether any page is a complete duplicate of another. In my experience, although posts (often just post snippets) are repeated on archive pages, whole pages are rarely identical to one another, so there usually isn't a problem.