I've been searching my heart out, but I must not be using the correct terms to find the answer to the following.
We have a large site that has plenty of subdirectories of similar construction.
Within each subdir, there are specific folders that we want to disallow crawling of.
I'm trying to ascertain how little I can get by with in the robots.txt file to achieve the desired results.
If the server has the following four structures:
www.server.com/Images/picture.gif
www.server.com/subdirectory1/Images/picture.gif
www.server.com/subdirectory2/Images/picture.gif
www.server.com/subdirectory3/andevenmorefolderstructure/Images/picture.gif
what Disallow rule would tell search engines not to crawl any "Images" folder?
Would
*/Images/
do it by itself? Or would I need both
/Images/
*/Images/
to catch the root-level folder as well as all the more deeply buried ones?
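For reference, here is the kind of robots.txt I'm imagining (the /*/Images/ form assumes the crawler supports * wildcards, which as far as I know is a de-facto extension honored by Googlebot and Bingbot rather than part of the original robots.txt standard):

```
User-agent: *
# Root-level folder (plain prefix matching, works everywhere)
Disallow: /Images/
# Images folders nested at any depth (requires wildcard support)
Disallow: /*/Images/
```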
Or must I specify the entire subdirectory path to each folder I want blocked?
/Images/
/subdirectory1/Images/
/subdirectory2/Images/
/subdirectory3/andevenmorefolderstructure/Images/
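To check my own understanding, here is a small Python sketch of how I believe the wildcard matching works for crawlers that support it (Googlebot-style semantics: rules match as URL-path prefixes, and * matches any sequence of characters, including slashes). The rule set and helper names are mine, just for illustration:

```python
import re

def rule_to_regex(path_rule):
    # Translate a robots.txt path rule into a regex, following the
    # de-facto wildcard semantics used by Googlebot/Bingbot:
    # '*' matches any character sequence, '$' anchors the end,
    # and the rule otherwise matches as a prefix of the URL path.
    pattern = re.escape(path_rule).replace(r"\*", ".*").replace(r"\$", "$")
    return re.compile(pattern)

def is_blocked(url_path, rules):
    # A path is blocked if any Disallow rule matches it from the start.
    return any(rule_to_regex(r).match(url_path) for r in rules)

# Hypothetical rule set: root-level Images plus wildcard for nested ones.
rules = ["/Images/", "/*/Images/"]

paths = [
    "/Images/picture.gif",
    "/subdirectory1/Images/picture.gif",
    "/subdirectory2/Images/picture.gif",
    "/subdirectory3/andevenmorefolderstructure/Images/picture.gif",
]
for p in paths:
    print(p, is_blocked(p, rules))
```

If this model is right, /*/Images/ alone would miss the root-level /Images/ folder (the wildcard form requires at least one path segment before Images), which is why I listed both rules.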