| 6:52 pm on Jan 19, 2011 (gmt 0)|
You need both the Order and the Deny directives, I believe.
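As a minimal sketch of the point above (Apache 2.2-style access control; what you actually deny depends on your setup):

```apache
# Deny everyone. The Order line controls how Allow/Deny are evaluated;
# a bare "Deny from all" without it may not behave as expected.
Order Allow,Deny
Deny from all
```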
| 12:12 am on Jan 20, 2011 (gmt 0)|
you will likely find that
Order Deny,Allow
Deny from all
will be much more useful over the long term. Using "Deny,Allow" means that you can add exceptions to allow all user-agents to fetch your robots.txt file and your custom 403 error page(s). Failing to do so results in two different kinds of 'infinite-error loops'.
SetEnvIf Request_URI "^/(robots\.txt|my-403-error-page\.html)$" allowit
Order Deny,Allow
Deny from all
Allow from env=allowit
avoids these problems.
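To make the loop concrete: if the custom 403 page is itself denied, Apache's attempt to serve it raises another 403, and so on. Assuming the error page is declared with ErrorDocument (the filename here is just the example used above), the Allow exception breaks that cycle:

```apache
# Declare the custom 403 page. If /my-403-error-page.html were itself
# blocked, every 403 response would internally trigger another 403.
ErrorDocument 403 /my-403-error-page.html
```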
| 9:06 pm on Jan 22, 2011 (gmt 0)|
Thanks gs1md and Jim - it's appreciated.
If you didn't want anyone viewing your .htaccess file, wouldn't:
Deny from all
be best?
| 10:15 pm on Jan 22, 2011 (gmt 0)|
See above; you need the Order line too.
| 10:24 pm on Jan 22, 2011 (gmt 0)|
Please guide me?
It's been my understanding for more than ten years that UNLESS the server is misconfigured, .htaccess cannot be viewed by default.
I've never seen a request for ".htaccess" in my raw logs!
Am I to understand now that an http visitor could potentially view htaccess?
Thanks in advance.
| 11:04 pm on Jan 24, 2011 (gmt 0)|
Requesting example.com/.htaccess results in a viewable file on *many* servers. And lots of these sites get their .htaccess files hacked (because they're easy to find with an HTTP scan, not because being visible makes them intrinsically easier to modify).
"...that unless the server is misconfigured..."
This is just one of those things where you can *assume* that your host has configured the server correctly, that it will continue to configure it correctly, and that any new host you might move to in the future will do the same... Or you can include a few lines of your own code so that, even if a host makes a mistake, your .htaccess code remains 'private'.
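As a sketch of those "few lines of your own code" (this mirrors the block shipped in Apache's default httpd.conf, using 2.2-style access control):

```apache
# Refuse HTTP access to .htaccess, .htpasswd, and any other .ht* file,
# regardless of whether the host's server-level config already does so.
<FilesMatch "^\.ht">
Order Allow,Deny
Deny from all
</FilesMatch>
```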