Home / Forums Index / Search Engines / Sitemaps, Meta Data, and robots.txt
Forum Library, Charter, Moderators: goodroi

Sitemaps, Meta Data, and robots.txt Forum

    
Preventing robot access to all subdirectories within a directory

Leonardi
10+ Year Member
Msg#: 397 posted 2:25 am on Jun 6, 2004 (gmt 0)

Hi. I'm new to the forums, and I can already see the folks here are quite helpful.

I have a simple question. My robots file is as follows:

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /upload/
Disallow: /andamentos/

Inside the /andamentos/ directory there are several other subdirectories, like /whatever/, /info/, and so on.

My question is: if I simply disallow /andamentos/, as above, does that mean robots cannot crawl anything inside /andamentos/, including the /whatever/ and /info/ subdirectories? Or do I need to add a separate line for each of those, like "Disallow: /andamentos/whatever/", and so on?

Another thing: can robots crawl password protected pages?

 

robotsdobetter
WebmasterWorld Senior Member 10+ Year Member
Msg#: 397 posted 12:16 am on Jun 7, 2004 (gmt 0)

If you put /andamentos/ as disallowed in robots.txt, robots won't be allowed to crawl anything in /andamentos/ or in any of its subdirectories. Disallow rules match by path prefix, so every URL whose path starts with /andamentos/ is covered; you don't need a separate line for each subdirectory.
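If you want to check this yourself, here's a quick sketch of my own (not from the original thread) using Python's standard-library robots.txt parser, fed with the exact rules posted above. The test paths under /andamentos/ are just the hypothetical subdirectories from the question:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules exactly as posted in the question.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /upload/
Disallow: /andamentos/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Disallow matches by path prefix, so subdirectories are covered too.
print(parser.can_fetch("*", "/andamentos/"))                # False (blocked)
print(parser.can_fetch("*", "/andamentos/whatever/"))       # False (blocked)
print(parser.can_fetch("*", "/andamentos/info/page.html"))  # False (blocked)
print(parser.can_fetch("*", "/somewhere-else/"))            # True (allowed)
```

Note the trailing slash matters for prefix matching: "Disallow: /andamentos/" would not block a file named /andamentos.html, while "Disallow: /andamentos" would block both.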


All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved