Robert_Charlton - 4:50 pm on Aug 31, 2010 (gmt 0)
Long-term plan is to put these links in JS.
Nofollowing the links will not give the other pages on the site "a bump up."
Several options occur to me, assuming you have some dynamic control of the page code (which I'm guessing you do, as how else would you nofollow the links?)....
- Removing the expired pages would be my preferred approach, as it would help both search engines and users.
- Archiving the expired pages, so that multiple links to them aren't taking a large share of a page's available PageRank, would be another approach.
- Adding the noindex,follow meta robots tag to each of the expired pages, and linking from them back either to the parent page or to home, would be yet another approach. While this would recirculate PR and take the pages out of the index, it would not help the user experience.
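For reference, here's roughly what the markup for the last two mechanisms looks like (a sketch, with hypothetical URLs; noindex,follow keeps a page out of the index while still letting its links pass PR, whereas nofollow on an individual link just stops that one link from passing PR):

```html
<!-- In the <head> of each expired page: keep the page out of the
     index, but allow its links (e.g., back to the parent) to be followed -->
<meta name="robots" content="noindex,follow">

<!-- By contrast, nofollowing a link to an expired page from elsewhere
     only blocks PR through that link; it doesn't redistribute it -->
<a href="/listings/expired-example" rel="nofollow">Expired listing</a>
```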
For some further detail on robots.txt vs meta robots noindex vs rel="nofollow", see this discussion....
Robots.txt vs. meta robots noindex?