Str82u - 1:36 am on Apr 16, 2013 (gmt 0)
@jimbeetle I can't find the source, but I'm certain it was an MC video in which he says blocking too many pages causes problems for a site. Personally, I've seen rankings drop over a week's time after blocking 100 or so URLs, then recover after removing the disallow.... twice.... and these are pages they don't even index.
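For anyone following along, the kind of blocking I mean is just a few path rules in robots.txt. This is only a sketch; the paths are made-up examples, not the ones from my site:

    User-agent: *
    Disallow: /print/
    Disallow: /old-promo/

Deleting those Disallow lines is all it takes to lift the block; the paths become crawlable again the next time Googlebot fetches robots.txt.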
@lucy24 Yes, on our large sites not all of the pages in our sitemaps are indexed, and I know why; it isn't duplicate content either. It's less than 0.20% of the total pages, and they do get crawled but not indexed. That's according to GWT and can be verified with the site: operator.
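If you want to check the same thing on your own site, compare the URL count in your sitemap against what the site: operator returns (example.com here is just a placeholder):

    site:example.com              <- rough count of indexed pages
    site:example.com/widgets/     <- narrow it to one directory

The site: counts are only approximate, but a gap between them and the sitemap total points at the same crawled-but-not-indexed pages that GWT reports in its submitted-vs-indexed figures.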
You'll fare best if you pretend spiders don't exist and just build a site for your visitors.