Here is the situation.
The old site was hacked and a batch of spam URLs was generated.
I removed those URLs from Google via WMT (Webmaster Tools) and added them as Disallow rules in robots.txt.
The site has now been moved to another server as a plain HTML site.
Question: should the robots.txt file on the new server still contain Disallow rules for the URLs that were generated on the old server, given that it would be almost impossible for those URLs to be generated on the new server?
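For context, this is roughly what carrying the old rules over would look like. The paths below are made up, since I'm not posting the actual spam URLs; imagine the hack generated pages under a common prefix:

```
User-agent: *
# Hypothetical pattern for the hacked/generated pages on the old site
Disallow: /generated-page-
Disallow: /spam-dir/
```

One thing worth noting either way: Disallow only blocks crawling, it doesn't remove or deindex a URL. If the URLs now 404 on the new server, Google will drop them on its own once it recrawls them, which it can only do if they're *not* disallowed.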
There are probably still links out there pointing to the old generated URLs, although I went through the disavow process using tactical nukes.
Thanks,
Chris