
Forum Moderators: Robert Charlton & goodroi

Dealing with dead links too numerous to quickly fix manually

9:17 am on Sep 12, 2016 (gmt 0)

New User

5+ Year Member

joined:June 8, 2011
posts: 28
votes: 4

We have news stories going back about 15 years, and we're concerned that they contain links that were fine at the time but have subsequently broken. What is the best way to deal with them from an SEO perspective?
We can spider our own content and automatically find which stories have dead links, but there are too many to fix manually in a short time. We will work through them eventually, but in the meantime I'd like to neutralise them automatically.
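For the "spider our own content" step, a minimal sketch in Python might look like the following. All function names here are made up for illustration, and the policy of treating any 4xx/5xx response or connection failure as "dead" is an assumption; a real crawler would want retries and politeness delays.

```python
# Sketch: extract outbound links from stored article HTML and flag dead ones.
# Assumes articles are available as HTML strings; uses only the stdlib.
from html.parser import HTMLParser
import urllib.request
import urllib.error


class LinkExtractor(HTMLParser):
    """Collect href values of <a> tags that point at external hosts."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value.startswith(("http://", "https://")):
                    self.links.append(value)


def extract_links(html):
    """Return the absolute outbound URLs found in an HTML fragment."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links


def link_is_dead(url, timeout=10):
    """Treat a 4xx/5xx status or a failed connection as dead (an assumed policy)."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status >= 400
    except (urllib.error.URLError, OSError):
        return True


if __name__ == "__main__":
    sample = '<p>See <a href="http://example.com/gone">this</a> and <a href="/local">that</a>.</p>'
    print(extract_links(sample))  # only the absolute external link is collected
```

Some sites respond badly to HEAD requests, so falling back to GET on an error is a common refinement.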

Would it be acceptable in Google's eyes to automatically replace the dead links with a link to a single internal page, barred in robots.txt, that apologises for the dead link and explains why we can't fix them all straight away?
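For what it's worth, barring that single page is just one Disallow rule. The `/dead-link` path below is a hypothetical example, not anything the site actually uses:

```
# Hypothetical robots.txt entry, assuming the apology page lives at /dead-link
User-agent: *
Disallow: /dead-link
```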
1:38 pm on Sept 12, 2016 (gmt 0)

Senior Member from GB 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Apr 30, 2008
votes: 191

Google in fact advises this method when the link is a paid link, to suppress passing link juice. Because the page is blocked in robots.txt, Google cannot tell whether or not you are redirecting it elsewhere.

What you could do when replacing the link is to add the original URL as a parameter (or something else that would appear to be a unique parameter), and Google would be none the wiser.
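The parameter idea above could be sketched like this. The `/dead-link` path and the `orig` parameter name are assumptions for illustration; the real page would also need the matching Disallow line in robots.txt:

```python
# Sketch: rewrite known-dead hrefs to a single internal "dead link" page,
# carrying the original target as a percent-encoded query parameter.
from urllib.parse import quote

DEAD_LINK_PAGE = "/dead-link"  # assumed internal path, blocked in robots.txt


def rewrite_dead_links(html, dead_urls):
    """Replace each href in dead_urls with the internal page plus an
    ?orig= parameter, so the original URL is preserved for a later fix."""
    for url in dead_urls:
        replacement = f'{DEAD_LINK_PAGE}?orig={quote(url, safe="")}'
        html = html.replace(f'href="{url}"', f'href="{replacement}"')
    return html


article = '<a href="http://example.com/old-story">old story</a>'
print(rewrite_dead_links(article, ["http://example.com/old-story"]))
# → <a href="/dead-link?orig=http%3A%2F%2Fexample.com%2Fold-story">old story</a>
```

Keeping the original URL in the parameter also makes the later manual clean-up easier, since each apology page still "knows" where the link used to point.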
4:44 am on Sept 15, 2016 (gmt 0)

Junior Member

Top Contributors Of The Month

joined:Oct 3, 2015
votes: 64

When I went through some files looking for bad links, I put the emphasis on the most recent links/files first. That has two advantages: newer pages tend to have fewer bad links, and they have more impact on how the search engines see the site, at least in my opinion.

Maybe Google cares about broken links on 15-year-old pages, but broken links on pages of that age are probably more the norm than a unique situation. So I'd be inclined to leave them until I could fix them.

Conceptually you should be OK changing all these historical outbound links to a single internal page barred in robots.txt, but I avoid things like that myself.