1script - 12:32 am on Jan 12, 2013 (gmt 0)
Found the issue!
It turned out that I was using a User-Agent-based 403 rule for bad bots inside <Location> tags, which are processed (have to be) after <Directory> and <Files>, and which was conflicting with any file-based matches.
I think this was the part that was different about 2.2: if you have a rule in <Location> tags that can also apply to an actual file that exists in the filesystem, Apache will take whatever is in <Location> and not what is in <Files>. My UA-based rules were structured as "Allow from all, Deny from BadBot", so the "Allow from all" in <Location> took precedence over the "Deny from all" in <Files>.
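A minimal sketch of the kind of setup that bit me (the file name and bot name are illustrative, not my actual config) - in 2.2 the <Location> section is merged last, so its access directives replace the ones from <Files> for any matching URL, even one that maps to a real file:

```apache
# File-based rule: block everyone from this file
<Files "secret.html">
    Order Allow,Deny
    Deny from all
</Files>

# UA-based bad-bot rule inside <Location> - merged last, so its
# "Allow from all" overrides the Deny above for /secret.html
<Location "/">
    SetEnvIfNoCase User-Agent "BadBot" bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Location>
```

Moving the bad-bot rule out of <Location> (into the file/directory sections, or server-wide) avoids the override.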
I am still wrapping my head around why on Earth it would behave like this (un-403 a request that was already 403-ed by a previous rule), but even the manual says not to use <Location> rules for anything that can apply to actual files. I guess the assumption is that <Location> should only be used on "virtual" URLs - like "prettified" URLs.