tedster - 4:09 am on Apr 27, 2013 (gmt 0)
I can't find the reference right now (it was a video), but I remember Matt Cutts stating that two URLs need to be "substantially different" not to be tagged as duplicates. Without giving away any secret formula, he mentioned "around 85%" as a ballpark figure.
If just two URLs are judged to be duplicates, one of them simply gets ignored in the SERPs or placed under the "more results" link at the end. Makes sense to me. But if a site has a substantial number of duplicates, then the entire site might get ignored.
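Just to make the "around 85%" idea concrete: Google has never published how it measures page similarity (at their scale it's presumably something like shingling or fingerprinting, not a direct text diff), but here's a rough stdlib Python sketch of what a similarity cutoff like that could look like. The threshold value and the `looks_duplicate` helper are purely illustrative, not anything Google has confirmed.

```python
# Rough illustration only -- Google's actual duplicate detection is
# unpublished. difflib is just a stand-in similarity measure here.
from difflib import SequenceMatcher

DUPLICATE_THRESHOLD = 0.85  # the ballpark figure from the video

def looks_duplicate(text_a: str, text_b: str) -> bool:
    """Treat two pages as duplicates if their text is ~85%+ similar."""
    ratio = SequenceMatcher(None, text_a, text_b).ratio()
    return ratio >= DUPLICATE_THRESHOLD

page1 = "Widgets for sale. Free shipping on all widget orders over $50."
page2 = "Widgets for sale. Free shipping on all widget orders over $25."
page3 = "A completely unrelated article about garden landscaping tips."

print(looks_duplicate(page1, page2))  # near-identical boilerplate pages
print(looks_duplicate(page1, page3))  # clearly different content
```

On a real site the same idea would apply to rendered page text, which is why near-identical product or doorway pages that differ by only a word or two tend to get filtered.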