serpsup - 10:33 pm on Oct 31, 2012 (gmt 0)
This discussion is related to some Disallow changes I recently added to avoid a faceted navigation spider trap, so I wanted to chime in with a related question. The disallowed urls stopped appearing in site: and inurl: searches within a week. This was for roughly a couple hundred thousand urls.
From a search engine perspective, is this enough to keep those urls from counting toward things like duplicate content checks? I've also added a noindex, follow meta tag, fwiw.
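For context, here's a minimal sketch of the kind of setup I'm describing. The parameter names (color, sort) are just hypothetical examples standing in for my actual faceted navigation parameters:

robots.txt:
User-agent: *
Disallow: /*?color=
Disallow: /*?sort=

And on the faceted pages themselves:
<meta name="robots" content="noindex, follow">

One wrinkle I'm aware of: once a url is disallowed in robots.txt, crawlers generally can't fetch the page at all, so they may never see the noindex meta tag on it.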