Hi, I just found out that Wikipedia implements "nofollow" because of the dead links (red links) created by its internal hyperlinking.
Since my website also uses the MediaWiki software, like Wikipedia, and since it is a one-year-old site with a lot of dead links (red links), I wonder whether I can set a global "nofollow" in the robots.txt file and then add:
<meta name="Robots" content="index,follow">
on each article page once I manage to clear all the red links on that page.
I do not know about MediaWiki, but I do know that robots.txt cannot apply nofollow to links. Robots.txt is a voluntary protocol that lets webmasters tell robots which files and folders should and should not be accessed. If you block a file using robots.txt, search engine bots will not access it at all, so they will never see that page's individual robots meta tag.
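To illustrate the distinction, here is a minimal sketch (assuming your wiki pages live under a hypothetical `/wiki/` path). The robots.txt rule stops compliant crawlers from fetching the pages at all, which means any meta tag inside them is never read:

```
# robots.txt — blocks crawling of the pages entirely.
# Bots that obey this never fetch the HTML, so they never
# see a <meta name="robots"> tag inside those pages.
User-agent: *
Disallow: /wiki/
```

If the goal is "do not index or follow these pages yet, but re-crawl them later," the pages must remain crawlable and carry the directive in their own HTML instead:

```
<!-- Served inside each page's <head>; only works if the
     page is NOT blocked by robots.txt -->
<meta name="robots" content="noindex,nofollow">
```

Then, once a page's red links are cleaned up, you would change its tag back to `index,follow` as you described.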