Matt Cutts starts off with a very troubling premise:
You can make an infinite number of autogenerated pages on your site
and then proceeds to give examples of various query strings you could submit (could, not should, and not that anyone necessarily did) to get a new URL and new content. But that is precisely the problem: if Googlebot did not submit forms, none of this would matter. These autogenerated URLs, for all intents and purposes, do not exist until someone links to them (often maliciously, by a competitor) or until Googlebot starts trying every combination of every query string parameter. In other words, Googlebot creates new URLs for itself as it crawls, and then the webmaster gets punished for them.
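To make the mechanism concrete, here is a minimal sketch of how a single form handler turns every query string combination into a "new" page. The framework, endpoint, and parameter names are my own assumptions for illustration, not anything from Matt's post:

```python
# Hypothetical search endpoint (names are assumptions for illustration).
# Every distinct query string below is a distinct, crawlable URL with
# distinct content, generated on the fly -- no such "page" exists until
# something requests it.
from flask import Flask, request

app = Flask(__name__)

@app.route("/search")
def search():
    # ?q=shoes, ?q=shoes&page=2, ?q=shoes&sort=price, ... each combination
    # yields a different URL, so the URL space is effectively unbounded.
    q = request.args.get("q", "")
    page = request.args.get("page", "1")
    sort = request.args.get("sort", "relevance")
    return (
        f"<h1>Results for '{q}'</h1>"
        f"<p>Page {page}, sorted by {sort}.</p>"
    )

if __name__ == "__main__":
    app.run()
```

Nothing about this handler is spammy; the combinatorial explosion of URLs only materializes once a crawler starts enumerating parameter values on its own.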
He then mentions that another search engine claims it has to sift through 20 billion URLs to find 1 billion non-spam pages. Well, if you did not let your bot create URLs on the fly, you would have far fewer URLs to sift through. Say, 5:1 instead of 20:1?