I don't want the contents of a specific directory spidered; however, there is a subdirectory of that directory whose contents I would like spidered. Is it possible to disallow the directory and still allow the subdirectory contents?
No. The original robots.txt standard specifies a simple prefix match of each Disallow value against the path of any URL being checked: since every subdirectory URL starts with the parent directory's path, disallowing the directory disallows access to everything beneath it.
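For example, a minimal sketch (the paths /dir/ and /dir/subdir/ are placeholders for your own):

    User-agent: *
    # Blocks /dir/ and, by prefix match, everything under it
    Disallow: /dir/

A URL like /dir/subdir/page.html begins with /dir/, so it matches the rule and is blocked along with the rest of the directory.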
You can, however, try disallowing access to the specific files in that main directory instead.
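Something along these lines (file1.html and file2.html are hypothetical names standing in for the files actually in the directory):

    User-agent: *
    # Block individual files rather than the whole directory
    Disallow: /dir/file1.html
    Disallow: /dir/file2.html

This leaves /dir/subdir/ crawlable, because no Disallow value is a prefix of its URLs.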
This will not be practical for a large number of files (and the original standard supports neither regular expressions nor wildcards in robots.txt), so you may be better off moving the subdirectories you want indexed elsewhere, or simply allowing everything to be indexed.
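A sketch of that layout (paths again hypothetical): after moving /dir/subdir/ out of the blocked tree to /subdir/, the Disallow prefix no longer matches it:

    User-agent: *
    Disallow: /dir/
    # /subdir/ does not start with /dir/, so it remains crawlable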