lucy24 - 9:57 am on Jun 8, 2013 (gmt 0)
Gotcha. Not the whole content of the directories, just their indexes. And when your fingers typed 404 in the first post, your brain really meant to say 403.
Did all those subdirectories formerly have automatic index files, so any passing robot could see what's there? By switching off the auto-indexing, you've prevented Google and other robots from discovering any new pages in the directories-- unless they learn about them by other means-- but you haven't stopped them from requesting the pages they already know about.
I kinda think it would be safer to slap a global no-index label on the directory. If it isn't practical to add meta tags to all the existing files, Option B is to make a supplementary little htaccess file and put it in your target directory. You may already have one there if that's how you turned off auto-indexing. Add a line that says
Header set X-Robots-Tag "noindex"
and it will cover everything in the directory.
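Put together, the whole supplementary htaccess file might look something like this-- the IfModule wrapper is just a precaution in case mod_headers isn't loaded on your server, and the Options line is only needed if that's how you're killing auto-indexing:

# .htaccess in the target directory
Options -Indexes
&lt;IfModule mod_headers.c&gt;
Header set X-Robots-Tag "noindex"
&lt;/IfModule&gt;

Note the Header directive only works if your host has mod_headers enabled, which most do.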
:: looking vaguely around for someone who knows the answer to the SEO aspect of the question, on which subject I am clueless ::