Google supports extensions to the Standard for Robots Exclusion, but to my knowledge it is the only search engine that does. See Google's Webmaster Help pages for more info.
If possible, rearrange your directory structure into two branches - one branch for files you want indexed, and another for files you don't. Then you can exclude robots from entire directories, without relying on a technique that only Google supports.
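A minimal sketch of what that two-branch approach might look like in a robots.txt file (the directory names /public/ and /private/ are hypothetical - use whatever names fit your site):

```
# robots.txt - placed at the site root
# Indexable files live under /public/, excluded files under /private/.
User-agent: *
Disallow: /private/
```

Because Disallow matches by path prefix, every file under /private/ is excluded with a single rule, using only directives that all standard-compliant robots understand.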
The Standard was invented in simpler times and is not very flexible, so keep it in mind when you first design a site's directory structure.