Forum Moderators: open
I know that Google won't index the pages excluded by the robots.txt, but does that mean it also doesn't pass PR to pages that are excluded?
(I want to do this for printer-friendly pages, which I don't want indexed, but I also want to make sure that having several such links on one page doesn't dilute the PR passed on from the starting page.)
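For reference, the kind of robots.txt rule being discussed here would look something like this (the /print/ directory is a made-up example path, not one from the thread):

```
User-agent: *
Disallow: /print/
```

This blocks crawling of everything under /print/, which is what raises the question of whether those blocked pages still receive a share of PR.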
If page A links to both page B and page C, then B and C will have to share the PageRank boost that A passes on.
Pages that are not crawled by googlebot are not given a PageRank. They do not take a share of the PageRank boost given by pages that link to them. AFAIK.
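As a rough illustration of the sharing behaviour described above, here is a toy sketch of the simplified PageRank formula (PR(p) = (1-d)/N + d * sum over inlinks of PR(q)/outlinks(q)). The page names and graph are made up for illustration; real PageRank involves far more than this.

```python
def pagerank(links, d=0.85, iterations=50):
    """Toy PageRank. links: dict mapping page -> list of pages it links to."""
    pages = list(links)
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new = {}
        for p in pages:
            # Each page q splits its PR evenly among its outbound links.
            incoming = sum(pr[q] / len(links[q]) for q in pages if p in links[q])
            new[p] = (1 - d) / n + d * incoming
        pr = new
    return pr

# Page A links to both B and C; B and C link back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["A"], "C": ["A"]})
# B and C each receive half of the PR that A passes on, so they end up equal.
```

The point of the sketch is visible in the result: B and C split A's outbound PR evenly, so each gets half of what a single link would have carried.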
A solution might be to let those pages link back to the front page.
Yes, I would do this. It is much better than excluding the pages with robots.txt. The PR decrease for the front page would be insignificant in that case. Of course, using JS links is also a solution, but I would put a link back to the front page on the printer-friendly pages.