will be much more useful over the long term. Using "Deny,Allow" means that you can add exceptions to allow all user-agents to fetch your robots.txt file and your custom 403 error page(s). Failure to do so results in two different kinds of 'infinite-error loop': a denied robots.txt that bots keep re-fetching, and a denied 403 page that triggers another 403 every time the server tries to serve it.
# Flag requests that must never be blocked
SetEnvIf Request_URI "^/(robots\.txt|my-403-error-page\.html)$" allowit
#
# Deny everything except the requests flagged above
Order Deny,Allow
Deny from all
Allow from env=allowit
Requesting example.com/.htaccess results in a viewable file on *many* servers. And a lot of those sites get their .htaccess files hacked (because the files are easy to find using an HTTP scan, not because being visible makes them intrinsically easier to modify).
"...that unless the server is misconfigured..."
This is just one of those things where you can *assume* that your host has configured the server correctly, that it will continue to configure it correctly, and that any new host you might move to in the future will do the same... Or you can include a few lines of your own code to make sure that, even if a host does make a mistake, your .htaccess code stays 'private'.
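As a sketch of those few lines, here is the usual Apache 2.2-style block (it matches the pattern and directives shipped in Apache's stock httpd.conf; the 2.4 equivalent is noted in a comment). It denies HTTP access to any file whose name begins with .ht, so both .htaccess and .htpasswd are covered:

<FilesMatch "^\.ht">
# Apache 2.2-style: deny all HTTP clients, regardless of other Allow rules
Order Allow,Deny
Deny from all
Satisfy All
# On Apache 2.4+, replace the three lines above with: Require all denied
</FilesMatch>

With that in place, a misconfigured or changed default on the host's side no longer matters; the denial travels with your site.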