phranque - 4:51 am on Jul 12, 2013 (gmt 0)
welcome to WebmasterWorld, burchy!
My main site is problematic in that it contains over a million variations of one page and the keywords used by visitors are almost unique to each of those pages.
i'm guessing you have a template or set of templates that you fill with a few fields from your database to generate these pages.
if a large percentage of each page's content doesn't change from one page to the next, much of that content will be filtered as duplicate content, so while the pages may be indexed they won't rank.
you have to ask yourself what value each page offers to the visitor before you can expect it to rank.
when you have a large number of pages you need a good information architecture and a lot of link equity to get crawled and indexed well.
this thread would be informative for your situation:
How do huge sites get such complete index coverage?