
Forum Moderators: goodroi


Preventing robot access to all subdirectories within a directory



2:25 am on Jun 6, 2004 (gmt 0)

Inactive Member
Account Expired


Hi. I'm new to the forums; quite helpful folks here, I've noticed.

I have a simple question. My robots.txt file is as follows:

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /upload/
Disallow: /andamentos/

Inside the /andamentos/ directory there are several other subdirectories, like /whatever, /info, and so on.

My question is: if I simply list /andamentos/ as disallowed, as it already is, does that mean robots cannot crawl anything inside the /andamentos directory, including the /whatever and /info subdirectories? Or do I need to add a separate line for each of them, like "Disallow: /andamentos/whatever/", and so on?

Another thing: can robots crawl password protected pages?

12:16 am on June 7, 2004 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Jan 31, 2004
votes: 0

If you put /andamentos/ as disallowed in robots.txt, robots won't be allowed to crawl anything in /andamentos/ or in any of the subdirectories inside it. A Disallow rule is a prefix match against the URL path, so the single rule on the parent directory covers everything beneath it.
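You can check this yourself with Python's standard-library robots.txt parser. The sketch below feeds it the same rules from the question (the example.com domain is just a placeholder) and asks whether paths under /andamentos/ may be fetched:

```python
from urllib import robotparser

# The robots.txt rules from the question above.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /upload/
Disallow: /andamentos/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# One Disallow on the parent path blocks every subdirectory beneath it,
# because the rule is matched as a prefix of the URL path.
print(rp.can_fetch("*", "http://example.com/andamentos/"))               # False
print(rp.can_fetch("*", "http://example.com/andamentos/whatever/"))      # False
print(rp.can_fetch("*", "http://example.com/andamentos/info/page.html")) # False
print(rp.can_fetch("*", "http://example.com/other/"))                    # True
```

So no extra "Disallow: /andamentos/whatever/" lines are needed; they would be redundant.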
