tedster - 3:12 pm on Sep 10, 2012 (gmt 0)
It just seems as if Google is creating these pages when they spider the site.
That's because Google doesn't exactly "spider" or "crawl" a site - at least not in the old-style way. Instead, Google builds a list of URLs that they've discovered and then assigns URLs from that list to googlebot.
In other words, a crawl is not done by hitting the home page, following the links that are currently there, following more links on those new pages, and so on. So historical URLs still DO get requested, and the WMT report will show them as 404 if they currently return 404.
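To make the difference concrete, here's a toy Python sketch (the URLs and site graph are made up for illustration): a pure link-following crawl never reaches a removed page, but a list-based crawl keeps requesting every URL it has ever discovered.

```python
# Hypothetical site: /old-page was deleted and nothing links to it anymore.
current_links = {
    "/": ["/about", "/products"],
    "/about": [],
    "/products": [],
}

def link_following_crawl(start="/"):
    """Old-style crawl: start at the home page and follow current links."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        queue.extend(current_links.get(url, []))
    return seen

# List-based model: every URL ever discovered stays on the list,
# even if no page links to it today.
discovered_urls = {"/", "/about", "/products", "/old-page"}

def list_based_crawl():
    """Googlebot-style scheduling: request URLs from the discovered list."""
    return set(discovered_urls)

print(link_following_crawl())  # /old-page is never requested
print(list_based_crawl())      # /old-page is still requested -> 404 shows up in WMT
```

That's why a long-dead URL can keep showing up in the crawl errors report years after every link to it is gone.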
This does not mean that these "crawl errors" are considered a problem. They are only errors in a purely technical sense.
If you want these URLs to return a 404 status, and they do, and there are no internal links left that point to them, then you are OK. At that point you can consider the Webmaster Tools report to be an FYI only. It's not a list of things you still need to fix or else suffer some ranking problem.
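A quick way to triage a crawl-errors export is to apply exactly that test: a removed URL only needs attention if the site still links to it internally, or if it returns something other than 404. A minimal sketch (the function name and inputs are my own, not anything from WMT):

```python
def crawl_error_needs_action(status_code, internally_linked):
    """Return True if a WMT crawl error is worth fixing.

    A removed URL that correctly returns 404 and has no remaining
    internal links is just an FYI. It needs action only if the site
    still links to it, or if it returns the wrong status (e.g. a
    soft 404 serving 200, or a 500 error).
    """
    return internally_linked or status_code != 404

# A clean 404 with no internal links: safe to ignore.
print(crawl_error_needs_action(404, False))  # False

# Still linked internally: fix the link (or the page).
print(crawl_error_needs_action(404, True))   # True

# Removed page serving 200 - a soft 404 - is worth fixing.
print(crawl_error_needs_action(200, False))  # True
```

Anything that comes back False in a sweep like this is the FYI-only case described above.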