I see lots of people here using noindex,follow or suggesting that other folks use it. I've used it sparingly myself, but I'm beginning to doubt that it actually works as intended. Has anybody done any tests?
If nobody else has data, I'll probably put together the following new pages to test it:
MAIN SITE -> p1a -> p2a -> p3a -> p4a -> p5a -> p6a -> p7a -> p8a -> p9a
MAIN SITE -> p1b (ni,f) -> p2b -> p3b -> p4b -> p5b -> p6b -> p7b -> p8b -> p9b
MAIN SITE -> p1c (ni,nf) -> p2c -> p3c -> p4c -> p5c -> p6c -> p7c -> p8c -> p9c
MAIN SITE -> p1d (in robots.txt) -> p2d -> p3d -> p4d -> p5d -> p6d -> p7d -> p8d -> p9d
where only the p1 pages differ in terms of crawlability; p2 through p9 would all be completely crawlable. Each page only links to the next page (and maybe to my main site, so that if a user stumbles in they can get out). Each page would have a sentence or two of unique content on it. Then I'll compare the PR of p2a, p2b, and p2c to p2d, and p3a, p3b, and p3c to p3d, and so on. I'll use Googlebot's crawl rate as a proxy for measuring PageRank. To get enough data, I would probably run this experiment at least ten times (by creating the same page structure ten times).
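If it helps anyone reproduce this, here's a rough sketch of how the test pages could be generated as static HTML. The filenames, chain labels, and robots.txt path are all placeholders I made up for illustration; the key part is that only the p1 page of each chain gets special treatment:

```python
# Generate the four 9-page test chains (a, b, c, d).
# Only page 1 of each chain differs in crawlability:
#   a: fully crawlable (control)
#   b: meta robots "noindex,follow" on p1
#   c: meta robots "noindex,nofollow" on p1
#   d: p1 blocked via robots.txt instead of a meta tag

CHAINS = {
    "a": None,
    "b": "noindex,follow",
    "c": "noindex,nofollow",
    "d": None,  # handled in robots.txt, not in the page itself
}

DEPTH = 9  # pages p1..p9 per chain


def make_page(chain, i):
    """Return the HTML for page p{i}{chain}."""
    robots = CHAINS[chain] if i == 1 else None
    meta = f'<meta name="robots" content="{robots}">' if robots else ""
    # Each page links only to the next page in its chain,
    # plus an escape link back to the main site.
    next_link = f'<a href="p{i + 1}{chain}.html">next</a> ' if i < DEPTH else ""
    return (
        f"<html><head>{meta}<title>p{i}{chain}</title></head>"
        f"<body><p>Unique sentence for page p{i}{chain}.</p>"
        f'{next_link}<a href="/">main site</a></body></html>'
    )


def make_robots_txt():
    """Block only the entry page of the d chain at the crawler level."""
    return "User-agent: *\nDisallow: /p1d.html\n"


if __name__ == "__main__":
    for chain in CHAINS:
        for i in range(1, DEPTH + 1):
            with open(f"p{i}{chain}.html", "w") as f:
                f.write(make_page(chain, i))
    with open("robots.txt", "w") as f:
        f.write(make_robots_txt())
```

Running the experiment ten times would just mean repeating this with a different filename prefix per run, so each chain set is independent.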