Forum Moderators: goodroi
Is it possible to disallow a directory and still allow the subdirectory contents?
No. The robots.txt standard calls for a case-sensitive prefix match of each Disallow string against the URL being checked. Since every subdirectory URL begins with the parent directory's path, disallowing the main directory blocks access to everything under it.
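You can see this prefix-matching behavior with Python's standard-library `urllib.robotparser` (the hostname and paths below are just illustrative):

```python
from urllib import robotparser

# Hypothetical rules: disallow the /private/ directory for all crawlers.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A URL in a subdirectory of /private/ is blocked too, because the
# disallowed string is matched as a prefix of the URL path.
print(rp.can_fetch("*", "http://example.com/private/sub/page.html"))  # False

# A URL outside the disallowed prefix remains fetchable.
print(rp.can_fetch("*", "http://example.com/public/page.html"))  # True
```

The parser never looks at directory boundaries; any URL whose path starts with the disallowed string is excluded, which is why the subdirectories cannot be carved back out under the original standard.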
You can, however, try disallowing access to the specific files in that main directory, e.g.:
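A sketch of what that robots.txt might look like (the filenames here are hypothetical; list each file in the directory you want blocked, while leaving the subdirectories unlisted):

```txt
User-agent: *
Disallow: /mydir/page1.html
Disallow: /mydir/page2.html
Disallow: /mydir/page3.html
```

Because each Disallow line matches only URLs starting with that exact file path, URLs under /mydir/subdir/ are not affected.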
This will not be practical for a large number of files (and you can't use regular expressions or wildcards in standard robots.txt), so you might be better off moving the subdirectories you want indexed elsewhere, or simply allowing everything to be indexed.