I have a simple question. My robots.txt file is as follows:
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /upload/
Disallow: /andamentos/
Inside the /andamentos/ directory, there are several other subdirectories, such as /whatever, /info, and so on.
My question is: if I simply disallow /andamentos/, as it already is, does that mean robots cannot crawl anything inside the /andamentos/ directory, including those /whatever and /info subdirectories? Or do I need to add a separate line for each one, like "Disallow: /andamentos/whatever/", and so on, as in the example below?
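In other words, would the file have to list every subdirectory explicitly, like this? (The /whatever and /info names are just placeholders for my actual folders.)

User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /upload/
Disallow: /andamentos/whatever/
Disallow: /andamentos/info/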
Another thing: can robots crawl password-protected pages?