A major hospital website has thousands of pages indexed in Google, but some of its main pages are not indexed. The site runs on a CMS, and there have not been any obvious indexing problems so far. What is the best way to get Google to see these other pages? Is it just a matter of better interlinking between internal pages, or do I need inbound links from other sites?
Your ideas are all good ones; anything that improves PR circulation and link power to those inner pages should help. You can also set up a Webmaster Tools account for the domain and see what kind of feedback you get in the diagnostics area, including the new Content Analysis [webmasterworld.com] section.
Especially if the site is very large, a Google Sitemap file [google.com] can help googlebot locate more urls.
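For reference, a Sitemap file is just an XML list of the urls you want googlebot to know about. Here is a minimal sketch following the sitemaps.org protocol - the urls and dates are made-up placeholders, so substitute the site's actual deep pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap per the sitemaps.org 0.9 protocol.
     The <loc> values below are hypothetical examples. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- A deep page that internal links may not reach -->
    <loc>http://www.example.com/departments/cardiology/staff.html</loc>
    <lastmod>2008-01-15</lastmod>
    <changefreq>monthly</changefreq>
    <!-- priority is relative to your own pages only, 0.0-1.0 -->
    <priority>0.8</priority>
  </url>
</urlset>
```

Save it as sitemap.xml in the site root and submit it through Webmaster Tools; only <loc> is required, the other tags are optional hints.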
However, even if the urls get spidered, that's no guarantee they will make it into the index. With the larger sites I work with, I find it is extremely rare to get anywhere near every url into Google's index.
I will try the interlinking idea, and I will ask whether they created a Google Sitemaps file. If the Webmaster Tools diagnostics come back clean, then the site should be fine, right? The CMS seems to be pretty SEO friendly. I'm not sure whether duplicate pages are being created. I just think these pages are deep in the directory structure and have no other internal pages linking to them, other than the breadcrumb trail links that let users know "you are here".