


Preventing robot access to all subdirectories within a directory



2:25 am on Jun 6, 2004 (gmt 0)

Hi. I am new to the forums, and I can already see the folks here are quite helpful.

I have a simple question. My robots.txt file is as follows:

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /upload/
Disallow: /andamentos/

Inside the /andamentos/ directory there are several other subdirectories, like /whatever, /info, and so on.

My question is: if I simply disallow /andamentos/, as I already do, does that mean robots cannot crawl anything inside the /andamentos/ directory, including the /whatever and /info subdirectories? Or do I need to add a separate line for each one, like "Disallow: /andamentos/whatever/", and so on?

Another thing: can robots crawl password-protected pages?


12:16 am on Jun 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member

If you put /andamentos/ as disallowed in robots.txt, robots won't be allowed to crawl anything in /andamentos/ or any of the subdirectories inside it. The Disallow value is a path prefix, so a single line covers every URL that starts with /andamentos/.
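
You can verify this prefix behavior yourself. Below is a minimal sketch (not part of the original thread) using Python's standard urllib.robotparser module; example.com and the test URLs are illustrative placeholders:

# Minimal sketch: check the thread's robots.txt rules with Python's
# standard library. example.com and the URLs below are placeholders.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /upload/
Disallow: /andamentos/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Any URL whose path starts with /andamentos/ is blocked for all agents.
for url in (
    "http://example.com/andamentos/",
    "http://example.com/andamentos/whatever/",
    "http://example.com/andamentos/info/page.html",
    "http://example.com/contact.html",
):
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "disallowed")

Running it reports the first three URLs as disallowed and /contact.html as allowed, confirming that the one Disallow line covers the whole /andamentos/ subtree.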
