There isn't any way for a site to force a particular behavior from any spider. The companies who operate the spiders program them according to their own wishes. Some spiders never seem to check a robots.txt file at all (Googlebot, for instance). There is certainly no way to force a robot to crawl your whole site, and there wasn't four years ago either, when this short paper was written.
This paper looks like either notes or a proposal. It's not a report on actual existing standards ... and it's four years old. There are updates from this author on the webcrawler site for 1997 and 1998, then they stop.
Whatever this paper is or was, it is just not the way things actually work. Stick with "disallow".
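For anyone following along, "stick with disallow" refers to the standard robots.txt directive. A minimal sketch (the domain and paths here are hypothetical placeholders, not from any real site):

```
# robots.txt -- served from the site root, e.g. http://example.com/robots.txt
# "example.com" and the paths below are made-up examples.

User-agent: *          # applies to all spiders
Disallow: /private/    # ask spiders not to crawl this directory
Disallow: /drafts/     # one path per Disallow line
```

Note that Disallow is purely advisory: well-behaved spiders honor it, but there is no directive that makes a spider crawl anything — which is exactly the point made above.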
Thank you very much! Common sense tells me you are right about not being able to force the robots to do anything. I think I was overcomplicating the whole thing. The "promotion" info I read may have been utter nonsense.
I've been studying this whole promotion thing voraciously, and have been able to get some very good results, but after obsessing over every bit of advice from WPG, and trying to make the "perfect" page, I'm starting to realize that you just give the engines what they want instead of agonizing over perfect prominence or whatever. I've even decided to quit stuffing keywords into alt tags and comments, or anything else that could resemble spam. I want these pages to stick!
Sounds like you are evolving a very sane approach. I know that after a period of experimenting with various "tricks", I returned to an increased focus on writing good content. My client sites are better off for it.
Ranking is not equivalent to traffic, and traffic is not the same as conversions. I feel like I am keeping my eye on the ball a lot more -- and that ball is actual paying customers for my clients.
As in your case, my WPG experience provided a good foundation and a jump in awareness that will always be useful. But I too have stopped obsessing.