Hello,
Our new website needs additional non-English versions (the formal reason: they are required by potential institutional partners). Each language version will have roughly 1,000 subpages: 10 general pages that are professionally translated, and 990 user-generated subpages originally written in English, with about 99% of their content translated via the Google Translate plugin.
The real value is in the 990 auto-translated pages... What do you think is the best approach?
A/
Block the entire non-English versions via robots.txt (or a meta robots tag?), because the 10 professionally translated pages alone are worth little, and we're afraid that the poor content on the other language versions may harm the rankings of the quality English content.
B/
Use robots.txt (or a meta robots tag?) only for the 990 auto-translated pages (a rough sketch of both mechanisms follows below the options).
C/
Don't block anything with robots.txt, since there is nothing illicit on those subpages; there's just poor content that Google may not show anyway until we improve it...
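
To make clear what we mean by the two mechanisms in options A and B, here is a rough sketch, assuming a hypothetical URL layout where, say, the German version lives under /de/ and the auto-translated user-generated pages under /de/community/ (those paths are only examples, not our real structure):

    # robots.txt for option A: keep crawlers out of the whole language version
    User-agent: *
    Disallow: /de/

    # robots.txt for option B: keep crawlers out of only the auto-translated pages
    User-agent: *
    Disallow: /de/community/

    <!-- meta robots alternative for option B: placed in the <head> of each
         auto-translated page; the page stays crawlable but is not indexed -->
    <meta name="robots" content="noindex, follow">

(As far as we understand, a page blocked in robots.txt is never crawled, so a noindex tag on it would not even be seen; that difference is part of why we are unsure which mechanism fits.)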
This is an interesting problem, and I hope the answer may be helpful to other people as well.
Thank you very much.