What I want, instead, is to disallow all but one page, *without having to specifically disallow each page on the site* (pages are added constantly). Am I correct that robots.txt provides no way to do this, since the protocol is to disallow rather than allow?
Then keep the page you want crawled in the root, move everything else into a subdirectory, and disallow everything but that page.
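Strictly speaking, the original robots.txt protocol only defines Disallow, but major crawlers such as Googlebot and Bingbot also honor a non-standard Allow directive (since standardized in RFC 9309), which lets you invert the logic. A minimal sketch, where `/allowed-page.html` is a hypothetical path standing in for your one permitted page:

```
# Block everything for all crawlers...
User-agent: *
Disallow: /

# ...except this one page (Allow is supported by Google, Bing,
# and other RFC 9309-compliant crawlers, but not by every bot)
Allow: /allowed-page.html
```

Because older or simpler bots may ignore Allow entirely, the subdirectory approach described below is the more universally safe option.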
Welcome to WebmasterWorld [webmasterworld.com]!
It can be helpful to split files which are allowed from those which are disallowed into different subdirectories. In this way, you can disallow a subdirectory path, and add content to it at will. Since the whole subdirectory is disallowed, you won't need to change robots.txt.
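For example, assuming a hypothetical layout where every new page is published under a `/pages/` subdirectory and only the root page should be crawled, the robots.txt never needs to change as pages are added:

```
# One rule covers the whole subdirectory; new files added
# under /pages/ are automatically disallowed
User-agent: *
Disallow: /pages/
```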