by just including something that has the effect of *PRINT.htm?
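for reference: wildcard patterns are not part of the original robots.txt standard, but Googlebot and some other major crawlers honor * in Disallow lines, and Googlebot also supports $ as an end-of-URL anchor. a minimal sketch of a rule with that effect, assuming the goal is to keep crawlers away from every URL ending in PRINT.htm:

  User-agent: Googlebot
  # block any URL ending in PRINT.htm ($ anchors the end of the URL)
  Disallow: /*PRINT.htm$

crawlers that do not understand wildcards simply treat the line as a literal path, which normally matches nothing.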
that solution is very convenient for programming (no need to transfer information between real subdomains, and no need to purchase multiple SSL certificates, one per subdomain). Unfortunately, it is very susceptible to any mistake in link placement.
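a catch-all setup like that typically looks roughly like this in the Apache config (a sketch only, with example.com standing in for the real domain; wildcard DNS is assumed to already point *.example.com at the server):

  <VirtualHost *:80>
      ServerName www.example.com
      # catch-all: every subdomain is served from the same document root
      ServerAlias *.example.com
      DocumentRoot /var/www/example
  </VirtualHost>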
we did not realize that until recently, when a mistake was made: the search engines indexed the same pages under various subdomains, turning our own pages into duplicate content, which may lead to a penalty.
here comes the question:
how do we use robots.txt in this particular situation? do we just need to create 'real' subdomains (directories on the server), write several robots.txt files, one per subdomain, each disallowing the URLs that do not belong to that subdomain, and put them into the purpose-created subdomain directories?
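an alternative that avoids creating real directories: since every subdomain is served from the same docroot, the web server itself could hand out a different robots.txt depending on the Host header. a rough sketch for Apache mod_rewrite in the site's .htaccess, assuming the canonical host is www.example.com and a hypothetical robots-deny-all.txt sits in the docroot:

  RewriteEngine On
  # any host other than the canonical one gets the restrictive file
  RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
  RewriteRule ^robots\.txt$ /robots-deny-all.txt [L]

where robots-deny-all.txt simply contains:

  User-agent: *
  Disallow: /

that way every accidental subdomain tells crawlers to stay out, while the canonical host keeps serving its normal robots.txt.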