See this thread [webmasterworld.com] for reference, and see if it makes sense for your situation -- Google will list any link it finds, regardless of whether robots.txt allows it to fetch the linked-to page. Ask Jeeves and Yahoo do this as well. The thread describes a fix that works for html-type pages, but it won't directly help with non-html pages.
However, you could try a conditional redirect based on the user agent, and see whether you can get the spiders to accept an html page when they request one of your .swf pages. If so, you can add <meta name="robots" content="noindex"> to that html page. Two notes: first, I don't know whether this will work, and second, this is technically cloaking -- but since there is no intent to mislead visitors, I wouldn't worry too much about it.
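For what it's worth, the user-agent check could be sketched server-side like this (a rough Python sketch of the decision, not a drop-in solution; the spider tokens are just examples -- check your own logs for the strings the crawlers actually send):

```python
# Example spider tokens for Google, Yahoo, and Ask Jeeves (Teoma).
# These are illustrative -- verify against your server logs.
KNOWN_SPIDERS = ("googlebot", "slurp", "teoma")

def should_serve_noindex_stub(path, user_agent):
    """Return True if a request for an .swf URL came from a known spider,
    meaning we should hand back the html page carrying the noindex meta
    tag instead of the Flash movie itself."""
    if not path.endswith(".swf"):
        return False
    ua = user_agent.lower()
    return any(bot in ua for bot in KNOWN_SPIDERS)
```

When the function returns True, your server would redirect to (or directly serve) an html page whose head contains <meta name="robots" content="noindex">; ordinary visitors still get the .swf as usual.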
Since the spiders must already be set up to accept an html-type 404 error response when, for example, an .swf page is missing, I suspect they will also accept the redirect (or just a 403 or 404 response) and see the meta tag. Anyway, if you haven't come up with any other ideas, it might be worth a try.