incrediBILL - 11:46 pm on Apr 19, 2013 (gmt 0)
"The difference is less than a millisecond. It's less than even a microsecond."
Which can add up to a lot of microseconds when people run thousands of rules processing tens of thousands of visitors per hour. I'm all for every optimization possible, because it's good technique that should always be encouraged over slow and sloppy code.
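To put a concrete sketch behind that (the bot names below are placeholders I invented, not anything from this thread), the classic consolidation is collapsing a pile of per-bot conditions into one alternation, so the regex engine runs once per request instead of once per bot:

# Before: one condition per bot, each tested on every request
#   RewriteCond %{HTTP_USER_AGENT} badbot1 [NC,OR]
#   RewriteCond %{HTTP_USER_AGENT} badbot2 [NC,OR]
#   RewriteCond %{HTTP_USER_AGENT} badbot3 [NC]
#   RewriteRule .* - [F]

# After: a single alternation, matched once per request
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (badbot1|badbot2|badbot3) [NC]
RewriteRule .* - [F]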
jdMorgan, the previous moderator of this forum, used to make optimizations that I thought weren't that special, until I installed them on my high-volume site and often noticed a little extra snappiness all of a sudden.
Also consider that many sites run on big shared servers with hundreds or thousands of sites on a single server, or now in the cloud; if the Apache files are optimized for all those sites, it frees up a bit of extra resources across the board. Remember, most sites have huge .htaccess files with tons of bot-blocking rules, sometimes tons of redirects, etc., so any improvement is a good thing and the most efficient method is always best.
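Same idea for the redirects (the URLs here are invented for illustration): when the old-to-new mapping follows a pattern, dozens of individual Redirect lines can often be folded into a single RedirectMatch, so mod_alias compiles and tests one regex instead of walking a long list:

# Before: one directive per moved page
#   Redirect 301 /old-page-1.html /new/page-1/
#   Redirect 301 /old-page-2.html /new/page-2/
#   ...and so on for every page...

# After: one pattern with a backreference covers them all
RedirectMatch 301 ^/old-page-(\d+)\.html$ /new/page-$1/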
FYI, Google factors site performance into rankings these days, so responding even a fraction of a second faster than the competition may help you outrank them; it's not to be overlooked, IMO.