Forum Moderators: phranque
I've opened httpd.conf and removed the # for this line:
LoadModule rewrite_module modules/mod_rewrite.so
I made sure the file mod_rewrite.so exists in the modules directory.
Restarted apache 2 without any errors.
phpinfo confirms mod_rewrite is loaded.
I then created a .htaccess file in my root directory (which maps to [localhost...]).
I put a few simple lines in the .htaccess file:
RewriteEngine On
RewriteBase /
Redirect /google.html [google.com...]
However I get a 'Not Found' message when typing this:
[localhost...]
The apache error log only has this line:
[Tue Mar 15 19:14:19 2005] [error] [client 127.0.0.1] File does not exist: C:/Program Files/Apache Group/Apache2/htdocs/google.html
Any help will be great :)
Open up the httpd.conf and find these lines:
#
# AllowOverride controls what directives may be placed in .htaccess files.
# It can be "All", "None", or any combination of the keywords:
# Options FileInfo AuthConfig Limit
#
AllowOverride None
And change it to 'All' (without quotes)
Actually this is either a bad idea or a REALLY bad idea, depending on your setup. 'All' relinquishes a LOT of control to any users you may have (if you don't have any users, it's /still/ a bad idea, but I'll get to that in a minute). Reading from [httpd.apache.org ], by using 'All', you've essentially done this:
AllowOverride AuthConfig FileInfo Indexes Limit Options
If you don't have any users (or are close personal friends with all of them), there's something else to consider. If AllowOverride is set to anything other than 'None' (let's use your example and set it to 'All' on the DocumentRoot), and a request comes in for:
http://www.example.com/foo/bar/baz/bing/quux/thingy/blatt/mypage.html
then Apache has to check for (and parse, if it finds one) a .htaccess file in the DocumentRoot *and* in every one of those subdirectories, on every single request -- whether or not any .htaccess files actually exist. That's a pile of extra filesystem activity for every hit.
For the record, checking the docs for the RewriteEngine directive at [httpd.apache.org ]
would have shown you *which* override you need to enable for RewriteEngine to work in a .htaccess file; the Apache docs list the applicable Override in the first portion of the documentation for each directive.
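In this case that override is FileInfo, so a narrower (and much safer) change than 'All' would look something like this -- the Directory path here is the DocumentRoot from the error log above:
<Directory "C:/Program Files/Apache Group/Apache2/htdocs">
# FileInfo is the only override mod_rewrite directives need
AllowOverride FileInfo
</Directory>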
One other thing to note; I see a LOT of folks on this board using .htaccess files for their mod_rewrite directives. I also know that some (perhaps even most) of the people who do so have no choice, as they are users on someone else's system. Those of you who *run* servers should *not* be using .htaccess files for mod_rewrite directives unless there's an overriding reason to do so; there's a measurable performance impact for doing so, which is mentioned in the mod_rewrite docs:
Right there, at the second bullet point. =)
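To make that concrete, here's roughly what the same kind of rule looks like moved out of .htaccess and into httpd.conf (the path matches the error log above; the target URL is just an illustration, since the original was elided):
<Directory "C:/Program Files/Apache Group/Apache2/htdocs">
RewriteEngine On
# illustrative redirect target
RewriteRule ^google\.html$ http://www.example.com/ [R=301,L]
</Directory>
With the rules in httpd.conf, Apache parses them once at startup instead of re-reading and re-parsing a .htaccess file on every request.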
Yes, so you've said several times. As proof you've provided a handful of URLs. Call me a flying monkey, but I think you've left out the middle of your argument. Aside from that, this url
[httpd.apache.org...]
suggests the opposite:
"Default: AllowOverride All"
Soooo could you break it down into bite sized pieces exactly why this is a bad idea? :D
I run apache2triad under WinXP Pro to play with PHP applications before uploading them. I like .htaccess -- mainly because up until now I was completely unaware that I could use httpd.conf -- but also because on a shared webserver elsewhere I don't have control over httpd.conf. So that brings me back to needing .htaccess and seemingly needing 'All'.
Thanks much for your patience and willingness to share your experience
:)
Culture:
--------------------
Just to give you some background so you know where I'm coming from; in my current job, I run medium-scale server farms (multiple farms of 12-30 hosts each) which, by a conservative estimate, serve several million hits a day per farm. In my previous job, I ran a small farm (half a dozen hosts), which provided web services for 50-60k users and roughly 9k domains, averaging hundreds of thousands to millions of hits each day. In jobs such as these, two things need to be kept in mind (well, a lot more than two things, but we'll discuss two here). Is your environment like mine? Most likely not. This doesn't mean you can't (or shouldn't) be keeping some or all of the items below in mind.
1) The servers need to be up all the time. Full stop.
Most of the folks on this board run smaller sites. Maybe they're the only user of their site. Maybe they're paying for web hosting but have no root, maybe they have root on the one host they run for their personal site, or maybe they re-sell webservices. In all these cases, you (obviously) want your servers up all the time. In a single-site environment, accomplishing this isn't a huge deal; you control all the host-based variables and with this type of environment comes a certain degree of freedom; config files (and the server installation itself) don't have to be /as/ secure as some larger environments, because no one else has access. This also makes keeping the system up a bit easier, since you're the only one (or one of only a few people) making changes.
Now take an environment such as my previous job; I've got 50-60k users using my systems. Which leads into my next point:
2) You can't trust anyone...
...not at all. Not even a little bit.
OK, that's overstating it slightly. Do I think my neighbors are out to get me? Is that cat under CIA control [mprofaca.cro.net]? Of course not. Is every one of my users out to do me harm? Of course not. The overwhelming majority of them are basically nice folks, with varying degrees of Clue(tm), who simply want their website up.
A few (ok, more than a few) are ignorant. They don't know anything about How Stuff Works(tm), but they know they can Do Cool Stuff if they follow the helpful directions on $WEBSITE. Sometimes $WEBSITE is dead wrong. Sometimes they (and 'they' can very easily be the person you've just paid $20k to build your website) have skipped step #6 because they thought it didn't apply, or it didn't make sense to them. Sometimes $WEBSITE has posted bad data on purpose. And a very few are just plain out to get you. Or rather, your systems. They'll do anything and everything they can to do damage to you, your systems, your network, and so forth.
And you have NO WAY to tell the bad guys from the good guys until it's too late. This means that in order to ensure that your systems are up All The Time(tm), you need to take steps to ensure that the ignorant good guys and the malicious bad guys, all of whom are, willfully or not, out to do you harm, won't take down your systems.
Best Practices
--------------------
(note: these are mine. They have evolved over time, and will continue to do so as I learn more. They are not cast in stone; more like whipping cream. They will not work for everyone. They do not solve all problems. Some of these you may not want to use. Some of these you shouldn't use until you learn more about the pitfalls of doing so. Some of you may know exactly what those pitfalls are, and have been down this road before. Caveat Administrator. Do not taunt Happy Fun Ball, etc, etc, etc....)
"The fastest time for a server to be hacked is fifteen minutes after it was plugged into the network." -- [securitydocs.com ]
One way to secure a system is to turn on CGI capability in one particular directory, make it so no one except you can place files there, and make your users submit CGIs for your inspection before deploying them. Of course, this doesn't scale all that well, and you're assuming (perhaps correctly, perhaps not) that you know more perl/python/C++ (or even shell script -- you laugh, but I've seen them) than the person who sent you the code.
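For what it's worth, that one-admin-controlled-directory setup is just the stock ScriptAlias approach; something like this (paths are illustrative, syntax is Apache 2.0-era), with the directory owned by the admin and not writable by users:
ScriptAlias /cgi-bin/ "/usr/local/apache2/cgi-bin/"
<Directory "/usr/local/apache2/cgi-bin">
# no overrides, no extra options for user content here
Options None
AllowOverride None
Order allow,deny
Allow from all
</Directory>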
Some folks just disable CGIs completely. At least PHP allows you to restrict which functions get used, and which paths you'll let be opened. (Do you really want someone open()'ing /etc/passwd on your system and perusing it? Sure, no passwords are stored there (you *have* enabled shadowed passwords, right?), but the usernames are, which is half the battle. I tell you three times, and what I tell you three times is true: *any* information an attacker gets about your system, no matter how insignificant, is a weakening of your security. "Eh, it's just my httpd.conf; there's no sensitive information in there, like passwords." Probably true. The attacker does, however, now know what sites are hosted with you, how you have your filesystem laid out, where your documentroot is, etc, etc. This is all potentially useful information.) Have I used CGIs? Sure. I've written them. I administer them. But the Right Tool For The Right Job, and a fork()'d CGI is, in my opinion, rarely the Right Tool.
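As a sketch of the PHP side of that -- the directive names are real php.ini directives, the values are just examples:
; forbid the shell-escape functions
disable_functions = exec,passthru,shell_exec,system,proc_open,popen
; confine file operations (open(), include, etc) to the user's own tree
open_basedir = /home/someuser/public_html
With open_basedir set, that open('/etc/passwd') attempt fails with a warning instead of handing over your user list.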
So, let's say you disable CGI access completely, but you have 'AllowOverride All' in your httpd.conf. Guess what? CGIs aren't as disabled as you thought they were. Or, let's (potentially) hit a little closer to home. Let's say you're selling webspace to folks, but charging a premium for the ability to run CGIs. Guess what? You're losing money. Likewise for Server-Side Includes; AllowOverride All allows any user to turn these on. Or, let's say you've disabled CGIs, but enabled server-side includes. AllowOverride All means that users can set 'Options +Includes', which (unlike 'Options IncludesNOEXEC') allows them to exec CGIs from an SSI-enabled page. Whoops. And then there's FollowSymLinks vs SymLinksIfOwnerMatch. Mmmmm...more chaos...
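To spell that out: with AllowOverride All in effect, any user can drop a .htaccess like this into their own directory, and your server-wide CGI/SSI settings no longer mean much ('Options' needs the Options override, 'AddHandler' needs FileInfo -- both of which 'All' hands over):
# re-enable everything the admin turned off
Options +ExecCGI +Includes +FollowSymLinks
AddHandler cgi-script .cgi .pl
AddHandler server-parsed .shtml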
In summary:
--------------------
Is using AllowOverride All going to harm most folks in here, really? If they know everyone involved with their site, *and* trust that the password used for uploading content won't be compromised, probably not. You *do* rotate your passwords, don't you? Did you know that there are SSL-aware FTP servers out there? Did you know that you can upload content over HTTP, using DAV [webdav.org]? (This can also be made SSL-aware.)
Flip the question around; why do you want it enabled? Best Practices (and not just mine) say that you disallow EVERYTHING and only allow what's REQUIRED. This applies to the filesystem, your application, and your network. While it's much more important to follow these rules in larger environments, why not get into the habit of Thinking Securely(tm)? If you have write access to httpd.conf, I'm hard-pressed to come up with a valid reason for using AllowOverride All; AFAIK, it doesn't get you anything that can't be accomplished in other, more secure, ways. But hey, if you've got one, I'm all ears. =)
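In config terms, "disallow everything, allow only what's required" looks something like this (paths illustrative):
<Directory />
# the default for the whole filesystem: nothing
Options None
AllowOverride None
</Directory>
<Directory "/usr/local/apache2/htdocs/someclient">
# this one site gets exactly the override it needs, nothing more
AllowOverride FileInfo
</Directory>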
If we are not the server admin, should we adjust settings to conform to these standards, as much as possible, on a per directory basis through the use of .htaccess?
If so, which one(s) should we implement, and what impact will they possibly have on server load/serve time?
Justin
This might seem a 'lazy post', but I think the points made in this thread only strengthen the forum information as a whole.