Several years ago, Googlebot was not very smart about common cases of non-canonical URLs. www vs. no-www, index.html suffixes, session ID parameters, and the like caused real crawling and ranking problems. Canonicalization was a solution, and it worked.
I always thought it was silly that Googlebot couldn't seem to deal with www and no-www returning the same content. Today, Googlebot seems much smarter to me about the basic cases.
It also used to be the case that how you linked your site together internally mattered much more. Back in the days when PageRank sculpting worked wonders, canonicalization was a way of getting every last drop of PageRank your site had coming to it. A few years ago, Google implemented algorithms to ensure that sites that aren't well sculpted or have canonicalization issues aren't at a disadvantage. As a result, canonicalizing for ranking boosts is no longer as necessary as it once was.
There are certainly cases in which Googlebot still doesn't identify non-canonical content properly, and canonicalization can help. You can usually tell this is the case by looking in your logs and seeing whether Googlebot is spending time crawling two versions of the same URL.
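As a minimal sketch of that log check, the snippet below groups Googlebot requests by a normalized URL key (collapsing www/no-www and stripping common session-ID parameters) and flags keys fetched under more than one raw URL. It assumes your log lines contain full request URLs rather than bare paths, and the sample log lines, the `example.com` domain, and the session-parameter names are all hypothetical; adapt the regex and parameter list to your own log format.

```python
import re
from collections import defaultdict
from urllib.parse import urlsplit, parse_qsl

# Hypothetical sample of access log lines; in practice you would
# read these from your server's access log file.
LOG_LINES = [
    '66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] "GET http://example.com/page?sessionid=abc HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2023:13:56:01 +0000] "GET http://www.example.com/page HTTP/1.1" 200 512 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Oct/2023:13:57:12 +0000] "GET http://example.com/other HTTP/1.1" 200 128 "-" "Googlebot/2.1"',
]

REQUEST_RE = re.compile(r'"GET (\S+) HTTP')
TRACKING_PARAMS = {"sessionid", "sid", "phpsessid"}  # assumed session-ID names

def canonical_key(url):
    """Collapse www/no-www, trailing slashes, and session-ID params to one key."""
    parts = urlsplit(url)
    host = parts.netloc.lower().removeprefix("www.")
    query = sorted((k, v) for k, v in parse_qsl(parts.query)
                   if k.lower() not in TRACKING_PARAMS)
    return (host, parts.path.rstrip("/") or "/", tuple(query))

def duplicate_crawls(lines):
    """Return canonical keys that Googlebot fetched under more than one raw URL."""
    seen = defaultdict(set)
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = REQUEST_RE.search(line)
        if m:
            seen[canonical_key(m.group(1))].add(m.group(1))
    return {key: urls for key, urls in seen.items() if len(urls) > 1}

for key, urls in duplicate_crawls(LOG_LINES).items():
    print(key[1], "crawled as:", sorted(urls))
```

On the sample data this reports `/page` as crawled under two variants, which is exactly the pattern suggesting canonicalization could still help.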