Forum Moderators: goodroi
example.com/username1/foo/
example.com/username2/foo/
example.com/username3/foo/
...
example.com/username5000/foo/
There are going to be thousands of "username" folders, and I want them indexed. However, I don't want any of the "foo" folders indexed. Which is the best way to block those folders?
User-agent: *
Disallow: /foo/
or
User-agent: *
Disallow: /*/foo/
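The difference between the two matters: under the original robots.txt standard, a Disallow rule is a plain path prefix, so `Disallow: /foo/` only blocks URLs that literally start with `/foo/` and would not touch `/username1/foo/`. The `/*/foo/` form relies on the `*` wildcard extension that Google and Bing support. A rough sketch of the two matching behaviors (the helper names here are my own, not anything from a robots.txt library):

```python
import re

def blocked_plain(rule: str, path: str) -> bool:
    """Original robots.txt semantics: a Disallow rule is a plain
    path prefix, with no wildcard support at all."""
    return path.startswith(rule)

def blocked_wildcard(rule: str, path: str) -> bool:
    """Google/Bing extension: '*' matches any run of characters,
    '$' anchors the end of the URL."""
    pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.match(pattern, path) is not None

path = "/username1/foo/page.html"
print(blocked_plain("/foo/", path))       # False: path doesn't start with /foo/
print(blocked_wildcard("/*/foo/", path))  # True: '*' spans 'username1'
```

So for crawlers that honor the wildcard extension, the second form is the one that actually blocks those subfolders; the first form blocks essentially nothing in this URL layout.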
We'll be using meta robots tags as well; I just want to keep the robots.txt file in order.
Note also that regular expression are not supported in either the User-agent or Disallow lines. The '*' in the User-agent field is a special value meaning "any robot". Specifically, you cannot have lines like "Disallow: /tmp/*" or "Disallow: *.gif".
in other words, they so don't want to support wildcarding that they mistakenly refer to it as a regular expression, which they also don't want to support.
note also that grammar are not important to robots...
[google.com...]
<added>
And of course, if there are links to the pages in the subfolders, the meta robots will ensure that those pages don't wind up as a URL only listing in the SERPs.
</added>
Also, you may want to use .htaccess.
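One way to do that, assuming Apache 2.4+ with mod_headers enabled, is to send an `X-Robots-Tag: noindex` header for anything under a `/foo/` subfolder. This is only a hypothetical sketch; the exact expression syntax is my assumption and should be checked against your server setup:

```apache
# Sketch: mark any /username*/foo/ URL as noindex via a response header.
# Requires Apache 2.4 (<If> expressions) and mod_headers.
<If "%{REQUEST_URI} =~ m#^/[^/]+/foo/#">
    Header set X-Robots-Tag "noindex"
</If>
```

Unlike a robots.txt Disallow, this lets crawlers fetch the pages but tells them not to index them, which works together with the meta tags rather than blocking the crawl outright.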