
Forum Moderators: goodroi


robots.txt expert question

Is it possible for robots.txt to disallow all but one page?

     

DanielDiGriz

7:58 pm on Aug 16, 2003 (gmt 0)

10+ Year Member



In robots.txt I have:

User-agent: *
Disallow: /

What I want, instead, is to disallow all but one page, *without having to specifically disallow each page on the site* (pages are added constantly). Am I correct that robots.txt provides no way to do this, since the protocol is to disallow rather than allow?

jatar_k

9:02 pm on Aug 16, 2003 (gmt 0)

WebmasterWorld Administrator jatar_k is a WebmasterWorld Top Contributor of All Time 10+ Year Member



There is no Allow directive in the standard; an option might be to disallow everything except that one page by listing each path explicitly.

User-agent: *
Disallow: /index.html
Disallow: /pages/
Disallow: /img/
Disallow: /otherpage.html

then keep the page you want crawled in the root and disallow everything else.
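One way to sanity-check a file like the one above is Python's standard `urllib.robotparser`, which interprets rules the same way a well-behaved crawler would. The paths here are illustrative, not from any real site:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt following the explicit-disallow approach:
# every path except the one page we want crawled is listed.
rules = """\
User-agent: *
Disallow: /pages/
Disallow: /img/
Disallow: /otherpage.html
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# The page left off the Disallow list remains fetchable...
print(rp.can_fetch("*", "http://example.com/allowed.html"))    # True
# ...while anything under a disallowed path is blocked.
print(rp.can_fetch("*", "http://example.com/pages/foo.html"))  # False
```

The obvious drawback, as the original poster notes, is that every newly added path has to be appended to the Disallow list by hand.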

jdMorgan

9:34 pm on Aug 16, 2003 (gmt 0)

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member



DanielDiGriz,

Welcome to WebmasterWorld [webmasterworld.com]!

It can be helpful to split files which are allowed from those which are disallowed into different subdirectories. In this way, you can disallow a subdirectory path, and add content to it at will. Since the whole subdirectory is disallowed, you won't need to change robots.txt.
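A minimal sketch of that layout, assuming the blocked content lives in a subdirectory named /private/ (the name is illustrative):

```
User-agent: *
Disallow: /private/
```

Any page added under /private/ is then blocked automatically, with the one page you want crawled kept in the root; robots.txt never needs to change as the site grows.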

Jim

DanielDiGriz

11:38 pm on Aug 16, 2003 (gmt 0)

10+ Year Member



Explicitly is not an option, but I like the subdirectory idea. Thank you for the responses. :)
 
