Forum Moderators: goodroi
What I want, instead, is to disallow all but one page, *without having to specifically disallow each page on the site* (pages are added constantly). Am I correct that robots.txt provides no way to do this, since the original protocol only specifies Disallow rules, not Allow?
Welcome to WebmasterWorld [webmasterworld.com]!
It can be helpful to separate the files that should be crawled from those that shouldn't by placing them in different subdirectories. That way, you disallow the subdirectory path once and can add content to it at will; since the whole subdirectory is disallowed, you never need to touch robots.txt again.
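For example, assuming the blocked pages are collected under a subdirectory (the name /private/ here is just an illustration), robots.txt would only need:

```
User-agent: *
Disallow: /private/
```

Any file later added under /private/ is blocked automatically, while pages elsewhere on the site remain crawlable.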