Robert_Charlton - 9:00 am on Aug 28, 2013 (gmt 0)
I think I'm going to try taking all the "noindex" off, and allow Google to index the whole lot.
If no one can point to any penalties (other than the normal thin content and duplicate content ones, which I'm confident don't apply), then what is the harm in allowing Google to index them?
The potential harm is that searchers finding an expired page are likely to be upset and back out to Google, which isn't a helpful signal to be sending. And, as netmeg points out, if you're on the edge of a quality devaluation, you don't want to risk that.
On the other hand, noindexed pages actually remain in Google's index; they just don't appear in the SERPs. A noindexed page that isn't also nofollowed will continue to circulate PageRank to the pages it links to, and will pass on the benefit of any inbound links the page has received. The meta robots "noindex" tag defaults to "follow".
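For reference, this is the tag syntax in question (standard meta robots markup; the comments are just my gloss):

   <!-- Keeps the page out of the SERPs, but "follow" is implied,
        so its outbound links still circulate PageRank -->
   <meta name="robots" content="noindex">

   <!-- Also stops the page from passing link equity -->
   <meta name="robots" content="noindex, nofollow">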
I see some potential problems, though, with trying to turn meta "noindex" on and off. Once Google sees that a page is noindexed, it keeps checking the page, but less frequently with each subsequent visit. (I wish I remembered who posted that observation, so I could credit it here.) In any event, with Googlebot's return visits unpredictable for noindexed pages, I don't know how you'd time things to get a page back into the SERPs. Maybe remove the noindex tag and then use the fetch-as-Googlebot tool in Webmaster Tools to get the page re-spidered.
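If you do experiment with toggling, it's probably less error-prone to key the tag off the event date in your templates than to flip it by hand. A minimal sketch of the idea in Python (the function name and dates are mine, purely illustrative, not tied to any particular CMS):

   from datetime import date

   def robots_meta_tag(event_end, today=None):
       # Emit the meta robots tag for an event page based on its date.
       # Expired pages get noindex ("follow" is implied, so outbound
       # links keep passing PageRank); live pages get index, follow.
       today = today or date.today()
       if today > event_end:
           return '<meta name="robots" content="noindex">'
       return '<meta name="robots" content="index, follow">'

   # e.g. an event that ended Aug 1, 2013, checked on Aug 28, 2013:
   print(robots_meta_tag(date(2013, 8, 1), today=date(2013, 8, 28)))
   # prints: <meta name="robots" content="noindex">

Even with that automated, you'd still face the recrawl-timing problem above, since Googlebot has to come back and see the changed tag before anything happens in the SERPs.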
I think outdated pages in the SERPs are a major PITA, and I suspect it's a problem Google knows it needs to solve. For me there's nothing worse than looking up the schedule for an annual event and landing on a page with no year on it.
Again, netmeg's approach... of removing an expired page for a recurring event from active navigation and taxonomy and posting a notice... sounds like a very good one if it keeps the expired pages out of the SERPs.