Ok, here are my findings:
1. How I diagnosed the problem:
According to Google Webmaster Central, only 57 of the 129 pages in my sitemap were indexed. But if I do "site:domain.com", it shows 321 pages are actually there... many of the pages are double-indexed: the first 13 results pages are all the semantic URLs, and the subsequent pages are the unmasked URLs. Not sure how the heck Google found those. So I guess I need to 301 redirect whenever I detect that the masked semantic URL isn't being used? Regardless, neither version of any of the content actually ranks for anything in the SERPs.
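For reference, here's roughly what I'm picturing for that 301 - a mod_rewrite sketch for .htaccess. The `?p=<id>` query pattern and the `/post/<id>` target are placeholders, since I haven't confirmed exactly what the unmasked URLs look like:

```apache
# Sketch only: 301 the unmasked query-string URLs to their semantic versions.
# The p=<id> pattern and /post/<id> target are placeholders, not my real URLs.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^p=([0-9]+)$
RewriteRule ^index\.php$ /post/%1? [R=301,L]
```

The trailing `?` in the rule strips the old query string so the redirect target is clean.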
2. Bing and Yahoo:
In both cases, I can run "site:domain.com" and the pages are all there. Again, though, we're not showing up for anything. Yahoo in particular won't even return us in the SERPs for a search of "domain.com", although ironically it does return us at #3 when I search simply for "domain". Ultimately we're getting no traffic from anyone - an excellent root-word domain name, 150 pages of unique, article-quality content, and only about 10-15 visitors each day. :( Note - neither Bing nor Yahoo shows the non-semantic content that Google somehow found.
3. Robots.txt is pretty clean; this is it:
4. Duplicate Meta Statements:
This is a good point. Our title tags are all unique, but most of our meta descriptions are the same. Can this really be such a big issue? Do I need to take the time now to write a meta description for hundreds of pages, or can I just remove the meta description tag entirely?
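Rather than hand-writing hundreds of them, I'm wondering if I could auto-fill the missing ones from each post's excerpt. A rough sketch for the theme's functions.php - assuming a fairly standard WordPress setup, which our modified framework may or may not match:

```php
<?php
// Sketch only: emit a meta description built from the post excerpt.
// Assumes standard WordPress template functions are available.
add_action( 'wp_head', function () {
    if ( ! is_singular() ) {
        return;
    }
    $desc = wp_strip_all_tags( get_the_excerpt() );
    $desc = mb_substr( $desc, 0, 155 ); // keep it within typical snippet length
    if ( $desc ) {
        echo '<meta name="description" content="' . esc_attr( $desc ) . '" />' . "\n";
    }
} );
```

If a theme or plugin already prints a description tag, this would need a guard so we don't end up with two.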
5. Blog Tags:
We're using a modified WordPress framework as a CMS - I wonder if some of these head tags could be causing an issue:
<link rel="alternate" type="application/rss+xml" title="RSS 2.0" href="/feed/" />
<link rel="alternate" type="text/xml" title="RSS .92" href="/feed/rss/" />
<link rel="alternate" type="application/atom+xml" title="Atom 0.3" href="/feed/atom/" />
<link rel="pingback" href="/xmlrpc.php" />
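As far as I know, none of those feed/pingback tags should hurt indexing. What's probably missing, given the double-indexing in point 1, is a canonical tag pointing at the semantic URL on every page - something like this (the URL here is a placeholder, not our real structure):

```html
<!-- Placeholder URL: point this at the semantic version of each page -->
<link rel="canonical" href="http://domain.com/post/example-article/" />
```

That way, even if Google keeps finding the unmasked URLs, it should consolidate them onto the semantic versions.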