I have a sub-domain of my main domain hosted on a different server to help with load issues. Since creating this sub-domain, it appears that Googlebot is disregarding the robots.txt file on my main site: it's crawling and indexing folders that are specifically excluded there.
The robots.txt file I had under my main domain excluded about a dozen folders. The robots.txt file in the root of the sub-domain excluded everything. GWT now shows duplicate content issues because pages were crawled from my excluded folders.
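For reference, the two files look roughly like this (the folder names below are just placeholders, not my real paths). The main domain's robots.txt disallows specific folders:

    User-agent: *
    Disallow: /folder1/
    Disallow: /folder2/

while the sub-domain's robots.txt blocks all crawling:

    User-agent: *
    Disallow: /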
I was wondering whether I need the exact same robots.txt file to be present on the sub-domain as well.