We have multiple folders in multiple locations where a subfolder is "popups", none of which we want crawled. In the robots.txt file, would the following be valid, then? Disallow: */popups
12:25 pm on Oct 13, 2009 (gmt 0)
The big 3 search engines will accept it and will block all subfolders named popups. You are using a wildcard (aka pattern matching), which is technically not an official part of the robots.txt protocol. Some of the smaller engines may have trouble with this unofficial workaround.
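To see why the wildcard matters, here is a minimal sketch of the Google-style matching rules (rules match from the start of the URL path, '*' matches any run of characters, a trailing '$' anchors the end). This is an illustration of the documented behavior, not any engine's actual code:

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    # Translate a robots.txt rule into a regex: '*' becomes '.*',
    # a trailing '$' anchors the end of the path. Everything else
    # is matched literally, starting at the beginning of the path.
    anchored = rule.endswith("$")
    body = rule[:-1] if anchored else rule
    pattern = "".join(".*" if c == "*" else re.escape(c) for c in body)
    if anchored:
        pattern += "$"
    return re.match(pattern, path) is not None

# '*/popups' blocks a /popups segment at any depth:
print(rule_matches("*/popups", "/shop/popups/ad.html"))  # True
print(rule_matches("*/popups", "/popups/win.html"))      # True
# 'popups/' on its own never matches, because every URL path
# begins with '/', and rules are matched from the start:
print(rule_matches("popups/", "/shop/popups/ad.html"))   # False
```

This is also why "Disallow: popups/" (discussed below) does not cover anything: without the leading wildcard, the rule can never line up with the start of a real path.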
11:09 pm on Oct 18, 2009 (gmt 0)
I thought it was good enough to use: Disallow: popups/ to cover _all_ "popups" sub-directories on a site.
1:44 am on Oct 20, 2009 (gmt 0)
I think the first reply is right - I believe you have to use the wildcard (*) to cover everything.