Forum Moderators: goodroi


robots.txt

problem disallowing pages in home directory

         

surfpro203

11:48 am on Aug 15, 2006 (gmt 0)

10+ Year Member



Hi,
I need to disallow all pages in my home directory as well as all subdirectories. I know how to disallow the subdirectories by using the following:

User-Agent: *
Disallow: /subdirectory_folder_name/

But I don't know what to enter so that all the pages in the home directory get disallowed except for my index.html file.

I can't seem to find this info.
Any help would be greatly appreciated.

thanks
surfpro

tedster

2:23 am on Aug 16, 2006 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I've been thinking about your question for a while, and I don't think there's a direct way to do what you want while staying within the robots.txt standard (no wildcards, etc.). Two approaches do come to mind:

1. Change your directory structure -- create a new subdirectory and move all your pages except the home page into it; then change the links around the site to reflect the new paths (see the robots.txt sketch below).

2. Use <meta name="robots" content="noindex"> in the <head> section of all those root directory pages that you don't want to see in the search engines (see the example below).
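For approach 1, here's a sketch of the robots.txt you'd end up with. I'm assuming the new subdirectory is named /pages/ -- that name is just an illustration, use whatever fits your site:

User-Agent: *
Disallow: /pages/
Disallow: /subdirectory_folder_name/

Anything left sitting in the root (your index.html) stays crawlable, because robots.txt only blocks the paths you list.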
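For approach 2, a sketch of the <head> section of one of those root pages (the title is made up):

<head>
<title>Some inner page</title>
<meta name="robots" content="noindex">
</head>

Keep the tradeoff in mind: with the meta tag the spiders still crawl the pages and just leave them out of the index, whereas a robots.txt Disallow stops compliant spiders from fetching them at all.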