aakk9999 - 2:25 pm on Aug 29, 2013 (gmt 0)
If the page is already indexed, then simply adding that line will not remove it from the index; the content still exists, and so do the links pointing to it.
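To spell out the distinction (the URL and page below are just placeholders): a robots.txt Disallow only blocks crawling, whereas removal from the index needs a noindex directive, which the bot can only see if it is still allowed to crawl the page.

```text
# robots.txt - blocks crawling, does NOT remove an already-indexed URL
User-agent: *
Disallow: /private-page.html

# To actually de-index the page, allow crawling and serve this
# in the page's <head> instead:
#   <meta name="robots" content="noindex">
```

Combining the two is self-defeating: if the page is disallowed in robots.txt, the crawler never fetches it and never sees the noindex.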
There are no firm rules for this. Sometimes this is how it happens, sometimes not. I would guess it depends on other external factors (perhaps the links pointing to the page, etc.).
Otherwise, disallowing the site in robots.txt would have no effect and the site would continue to rank, rather than dropping from the index like a stone (a very recent experience of mine).
Further, if the above were the standard behaviour, it would be heaven for spammers: just create a page, let it get indexed and ranked, then disallow it in robots.txt, replace it with spammy content, and watch it keep ranking for the old content.
I think this is closer to what actually happens: blocking a previously indexed page via robots.txt may or may not result in that page remaining in the index, and it may or may not rank equally well after being blocked (or it may drop like a stone).