I have many subdomains for my main site in order to help my rankings. However, I have recently read that subdomains with similar content may be considered spam. The content is different in each subdomain, but I do not want to take a chance of having the subdomains labeled as spam and then hurt the entire site. The subdomains do not have enough value to take this chance.
I do not want to simply take down all of the subdomains. Can I use the robots.txt file to prevent the search engines from further viewing the subdomains? If so, do I put a robots.txt file in each of the subdomains?
I would put in a robots.txt to keep spiders from accessing the content if it is very similar. However, if the content is different, then there should be no need, as the content will not look like spam. Some overlap in content is fine. If the content is more than 50% redundant between the subdomains and the main domain, then a robots.txt might be needed.
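To answer the second part of the question: robots.txt is read per host, so each subdomain needs its own file at its root (for example, a file served at sub1.example.com/robots.txt only affects sub1.example.com). A minimal sketch that blocks all well-behaved crawlers from an entire subdomain would be:

```
# robots.txt placed at the root of the subdomain you want excluded
# (e.g. https://sub1.example.com/robots.txt -- domain name is illustrative)
User-agent: *
Disallow: /
```

Note that this only asks crawlers not to fetch the pages; it does not guarantee the URLs will drop out of the index, and the file on the main domain has no effect on the subdomains.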