I've been searching everywhere, but I must not be using the right terms to find the answer.
We have a large site with many similarly structured subdirectories.
Within each subdirectory there are specific folders that we want to block crawlers from indexing.
I'm trying to figure out the minimum I can put in the robots.txt file to achieve the desired result.
If the server has the four following structures:
what Disallow rule would tell search engines not to crawl any "Images" folder?
Would `*/Images/` do it, catching the one at the root as well as all the more deeply nested ones?
Or must I spell out the full subdirectory path to every folder I want blocked?
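For what it's worth, here is a minimal sketch of what such a robots.txt might look like, assuming you want every Images folder blocked for all crawlers. Note that the `*` wildcard in Disallow paths is honored by the major engines (Google, Bing) but is not guaranteed for every crawler, and paths should begin with `/`:

```
User-agent: *
# Block the Images folder at the site root
Disallow: /Images/
# Block any Images folder nested inside a subdirectory
Disallow: /*/Images/
```

The second rule alone would not match the root-level `/Images/` (the `*` would have to match an empty segment between two slashes), which is why both lines are included. Crawlers that ignore wildcards would still respect the first rule but could crawl the nested folders, so testing the rules in a tool such as Google Search Console's robots.txt tester is worthwhile.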