(By 'subdomain' I mean shared hosting. 'www.domain.com' and 'www.subdomain.com' come up as live sites, not 'www.subdomain.domain.com').
I have been advised to "put a robots.txt file in the root that excludes the subdirectories of the subdomains, then put a robots.txt inside each subdomain directory that excludes it from going back to the root," but I have no clue as to how to do this.
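A sketch of what that advice might look like, assuming the subdomain's files live in a directory named `subdomain/` under the main document root (a hypothetical layout; substitute your actual directory name):

```
# robots.txt at the document root (what crawlers fetch from www.domain.com/robots.txt)
# Stops the subdomain's content from being indexed as a subdirectory of the main site.
User-agent: *
Disallow: /subdomain/

# robots.txt inside the subdomain directory (what crawlers fetch from
# subdomain.domain.com/robots.txt, since the host maps that hostname to this dir)
# List here anything under the subdomain you do not want crawled, e.g.:
User-agent: *
Disallow: /private/
```

Note that each robots.txt only governs the hostname it is served from, so the root file and the subdomain file are read independently by crawlers.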
Do you map subdomains to subdirectories or anything like that? -- An error in implementing such a mapping function could confuse the 'bot.
You might also want to implement a 301 (Moved Permanently) redirect that sends all requests for www.&lt;subdomain&gt;.domain.com to subdomain.domain.com in order to get the search listings corrected sooner, but look for the underlying problem first.
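If the server is Apache with mod_rewrite enabled (an assumption; shared hosts vary), the 301 could be sketched in an `.htaccess` file like this:

```
# .htaccess sketch: permanently redirect www.<subdomain>.domain.com
# to subdomain.domain.com, preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.([^.]+)\.domain\.com$ [NC]
RewriteRule ^(.*)$ http://%1.domain.com/$1 [R=301,L]
```

The `%1` back-reference captures the subdomain name from the RewriteCond, so one rule covers every subdomain; check with your host before relying on `.htaccess` overrides being allowed.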