@indyank, a metric that divides the number of low-traffic pages by the total number of pages wouldn't make any sense... it would punish sites that have answered very long-tail queries well, with good content. A great site could easily have 95% of its pages be low-traffic pages -- and in that scenario, 100% of its pages could still be high-quality content for the people who land on them via search.
Maybe I misunderstood your metric, but it doesn't make sense to me.
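Just to make the arithmetic concrete, here's a toy sketch of the metric as I understand it (the traffic threshold and the visit counts are made up for illustration):

```python
# Naive "low-traffic ratio": low-traffic pages / total pages.
# This is my assumption of how such a metric would work, not
# anything Google has described.

def low_traffic_ratio(page_visits, threshold=10):
    """Fraction of pages receiving fewer than `threshold` visits."""
    low = sum(1 for visits in page_visits if visits < threshold)
    return low / len(page_visits)

# A hypothetical long-tail reference site: 95 pages that each answer
# an obscure query well (a few visits apiece), plus 5 popular pages.
long_tail_site = [3] * 95 + [5000] * 5

print(low_traffic_ratio(long_tail_site))  # 0.95 -- a "bad" score even
                                          # if every page is high quality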
It could also be the case that those very-very-long-tail pages were never linked to by anybody -- why would you link to an obscure question/answer that very few people are likely to ask (e.g., what was the price of gas in Atlanta, Georgia in March 2009?) -- so I'd like to think Google won't suddenly reassess quality based on inbound links to deep pages. That approach would fail miserably for many quality sites.