
Forum Moderators: goodroi


robots.txt expert question

Is it possible for robots.txt to disallow all but one page?

     
7:58 pm on Aug 16, 2003 (gmt 0)

New User

10+ Year Member

joined:July 24, 2003
posts:4
votes: 0


In robots.txt I have:

User-agent: *
Disallow: /

What I want, instead, is to disallow all but one page, *without having to specifically disallow each page on the site* (pages are added constantly). Am I correct that robots.txt provides no way to do this, since the protocol is to disallow rather than allow?

9:02 pm on Aug 16, 2003 (gmt 0)

Administrator

WebmasterWorld Administrator jatar_k is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:July 24, 2001
posts:15755
votes: 0


There is no allow directive; an option might be to disallow everything but that one page by listing each path explicitly.

User-agent: *
Disallow: /index.html
Disallow: /pages/
Disallow: /img/
Disallow: /otherpage.html

Then keep the page you want crawled in the root, and disallow everything else.

9:34 pm on Aug 16, 2003 (gmt 0)

Senior Member

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 31, 2002
posts:25430
votes: 0


DanielDiGriz,

Welcome to WebmasterWorld [webmasterworld.com]!

It can be helpful to split the files which are allowed from those which are disallowed into different subdirectories. That way, you can disallow a single subdirectory path and add content to it at will; since the whole subdirectory is disallowed, you won't need to change robots.txt.
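As a sketch of this approach (the directory name /private/ is a hypothetical example, not something from this thread):

User-agent: *
Disallow: /private/

The one page you want crawled stays outside /private/ (for example, in the document root), and any number of new pages can be added under /private/ without ever touching robots.txt again.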

Jim

11:38 pm on Aug 16, 2003 (gmt 0)

New User

10+ Year Member

joined:July 24, 2003
posts:4
votes: 0


Disallowing each page explicitly is not an option, but I like the subdirectory idea. Thank you for the responses. :)
 
