Sounds strange, I have to admit, if there are no references to them from the outside. :)
I have noticed that Google in some cases has a hard time letting go.
There's a domain I previously had a web-site on. At some point, maybe a year ago, I killed the web-site and turned the domain into a scammer trap. It then had a robots.txt with a Disallow for all user-agents, so GoogleBot stayed away.
The result was that WMT kept complaining to me that this domain had about 1100 pages "blocked".
The pages have likely not been in the index for more than a year and obviously cannot be found in search, yet WMT kept claiming that I was blocking a lot of data. On WMT's front page it even listed the domain as having "Severe health problems" because of the blocking robots.txt. :)
I recently added an Allow for GoogleBot specifically, and am now waiting to see if it will start visiting all those old URLs that have not been seen for more than a year. They no longer exist in the domain's new use and will (hopefully) become 404s. I'll wait and see whether the URL count in WMT starts dropping.
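For anyone curious, the robots.txt setup I'm describing (block everyone, then allow GoogleBot through) can be sanity-checked with Python's stdlib `urllib.robotparser` before deploying it. The rules below are a sketch of my setup, not the exact file:

```python
from urllib.robotparser import RobotFileParser

# Sketch of the robots.txt: Disallow for all user-agents,
# plus a more specific group allowing Googlebot back in.
rules = """\
User-agent: *
Disallow: /

User-agent: Googlebot
Allow: /
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# Googlebot matches its own group, so it may crawl the old URLs
print(rp.can_fetch("Googlebot", "/some-old-page"))   # True

# Every other crawler falls back to the * group and is blocked
print(rp.can_fetch("SomeOtherBot", "/some-old-page"))  # False
```

A crawler picks the most specific matching user-agent group, so the `Googlebot` group overrides the catch-all `*` group for GoogleBot only.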