We have news stories going back about 15 years, and we're concerned that many contain links that were fine at the time but have since broken. What is the best way to handle them from an SEO perspective?
We can spider our own content and automatically detect which stories contain dead links, but there are too many to fix manually in a short time. We will work through them eventually; for now, though, I'd like to deal with them automatically.
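For context, the dead-link detection is roughly this kind of check, applied to every outbound URL our spider extracts from an article (a minimal sketch; the function name and User-Agent string are hypothetical, and a real crawler would also parse the article HTML and rate-limit its requests):

```python
import urllib.request
import urllib.error

def is_dead_link(url, timeout=5):
    """Return True if a URL looks dead: malformed, unreachable, or HTTP error."""
    try:
        # HEAD keeps the check cheap; some servers reject it, so a fallback
        # GET would be sensible in production.
        req = urllib.request.Request(
            url,
            method="HEAD",
            headers={"User-Agent": "dead-link-audit/1.0"},  # hypothetical UA
        )
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status >= 400
    except (urllib.error.URLError, ValueError, OSError):
        # HTTPError (4xx/5xx) is a subclass of URLError; malformed URLs
        # raise ValueError at Request construction.
        return True
```

Any URL that fails this check gets flagged against the story it appears in.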
Would it be acceptable in Google's eyes to automatically replace each dead link with a link to a single internal page, blocked in robots.txt, that apologizes for the dead link and explains why we can't fix them all straight away?
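For clarity, blocking that apology page would be something like the following robots.txt entry (the `/dead-link-apology` path is just a placeholder for wherever the page would live):

```text
User-agent: *
Disallow: /dead-link-apology
```

My understanding is that this stops compliant crawlers from fetching the page, though the URL itself can still be discovered via the links pointing at it.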