Rosalind - 5:23 pm on Oct 1, 2007 (gmt 0)
OK, so there's a hypothetical problem that old links might not be trusted. And one could just as easily suppose that new links might not be trusted. Certainly there was one vicious pornmeister who submitted lots of apparently attractive celebrity deeplinks to his site -- and then, as soon as each deeplink was added, he immediately converted that page to a porntal. I see a lot of websites that seem to have no other reason for existence than "grab links, then immediately transform into information-free marketroid activities." (At least, I can imagine no other reason for their existence.) And so MY theory is that this would be a bigger problem if the webmasters who created such sites weren't so stupid and lazy -- if they put two or three times as much work into those sites, the strategy might actually succeed.
No it doesn't; your logic is flawed. Just because some links go bad soon after they are added doesn't mean that this is when the majority of links that eventually go bad will start to show problems. My definition of bad links here covers not merely spam and porn, but also all those abandoned websites that don't have evergreen content. The bad quality of some newly added links doesn't make the old and forgotten links any better. You all know how it is when you clean out a cupboard you haven't looked at for a few years: it's a case of dust, dead spiders, mouldy socks, and the occasional Rembrandt. The web is just the same.
And so, in order to protect the directory from THIS hypothetical problem, you could propose that all sites get the "nofollow" tag for some period of time (say, a year).
Of course, this hypothesis directly contradicts the other one, and its solution is the diametric opposite.