My website has about 200 content pages that I want to show up in the SERPs, and another 2,000 or so pages that have little content and a lot of duplication. The way I have it set up now, at the bottom of each of my 200 content pages I have links to my main pages and my sitemaps, and for the other 2,000 pages I just disallow crawlers in robots.txt.
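For illustration, the relevant part of my robots.txt looks something like this (the directory names are just placeholders, not my actual paths):

    User-agent: *
    # Block the ~2,000 thin/duplicate pages
    Disallow: /archive/
    Disallow: /tags/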
Would it be better to allow the search engines to crawl those 2,000 pages (even though there is a lot of duplicate content) and put sitewide links on all of them, or should I leave it as it is?