Which is best to use — robots.txt or meta robots noindex — for large sections of a site that have low-quality pages?
We'll eventually build these pages out, but in the meantime I want to block them from consuming valuable PageRank and from being indexed. Some of these sections of the site have close to 100k pages. I don't want to trip any alarms while doing this, either.
in your situation i would generally suggest meta robots noindex. otherwise the urls may still end up indexed even though the pages are excluded by robots.txt, in which case the search engine may choose something outside of your control to show as the snippet.
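For reference, a minimal sketch of the tag the answer above recommends, placed in the `<head>` of each low-quality page. One caveat worth noting: crawlers can only see this tag if they are allowed to fetch the page, so these URLs should not also be blocked in robots.txt while the noindex is in place.

```html
<head>
  <!-- tell search engines not to index this page;
       "follow" (the default) still lets them crawl its links -->
  <meta name="robots" content="noindex">
</head>
```

At 100k pages, templating this into the section's page template (rather than editing pages individually) is the practical route; the same directive can also be sent as an `X-Robots-Tag: noindex` HTTP response header if editing the HTML is awkward.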