We currently use robots.txt to block our add-to-cart and other user-only links (6 links in total), which are placed in our header menu. As far as I understand, this means the 6 links accumulate a lot of PageRank, never get crawled, and never pass that PageRank on. BTW, we have around 80 million webpages!
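For context, this is roughly what our robots.txt block looks like (the paths below are placeholders, not our real URLs):

```
User-agent: *
Disallow: /cart/add
Disallow: /wishlist/add
Disallow: /compare/add
Disallow: /account/orders
Disallow: /account/settings
Disallow: /account/messages
```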
If we unblock the links, Googlebot will get 302-redirected to a login page on an HTTPS subdomain, which again is not going to pass PageRank.
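In other words, once the robots.txt block is removed, any crawler requesting one of these URLs would see something like this (domain and URLs are illustrative):

```
$ curl -sI http://www.example.com/cart/add?item=123
HTTP/1.1 302 Found
Location: https://login.example.com/?return=%2Fcart%2Fadd%3Fitem%3D123
```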
We cannot use a meta noindex, nofollow tag, because any request to one of the 6 links gets 302-redirected to the login page, so Googlebot never sees the page's own HTML.
I think it's best for us to add rel=nofollow to the links and remove the robots.txt block, so that Googlebot does not index the 6 pages and the pages do not accumulate PageRank. We are not worried about external links pointing to these pages.
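Concretely, the header links would change to something like this (the URL is a placeholder):

```html
<a href="/cart/add?item=123" rel="nofollow">Add to cart</a>
```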
What worries us more is Googlebot crawl activity. My belief is that rel=nofollow is an indexer directive, and that it doesn't actually prevent Googlebot from crawling the 6 links. If I'm right, we could see a ton of 302s coming off the header, since it is present on all 80 million pages.
So: should we use rel=nofollow, or continue with the robots.txt block?