Forum Moderators: phranque

Securing php and apache for virtual hosts

jmichaels

6:12 am on Sep 18, 2009 (gmt 0)

10+ Year Member



Not sure this is the correct forum; the PHP one did not seem right either.

I have PHP 5.3.0 with Apache 2 on Mac OS X, using Pureftpd as an FTP server with TLS. I am not currently offering SSH to anyone other than myself.

I plan to host multiple clients' sites on this server. This question is in regard to securing the system as a whole.

All files are uploaded as user/group pureftp, with permissions 644 (files) / 755 (directories).
I noticed that I could do things in PHP like file_get_contents('../../someone/elses/stuff/database.inc.php').

Is the correct way to solve that to use "open_basedir" in the Virtual Host for Apache?

Does "open_basedir" stop exec() from something like exec('cat ../../somefile.php'), if not, what is the suggested method to secure this?

In summary, I am looking for how to secure PHP in a shared hosting environment.

Within httpd.conf I include a virtual host conf file. Is it acceptable to nest the Directory block inside the VirtualHost block? This seems more organized to me, and seems to work.
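Nesting Directory inside VirtualHost is valid Apache configuration, and it scopes the directory settings to that vhost. A minimal sketch (hostnames and paths are examples only; the access-control lines use Apache 2.2 syntax, which matches this era):

```apache
<VirtualHost *:80>
    ServerName client1.example.com
    DocumentRoot /srv/www/client1/htdocs

    # Directory settings scoped to this vhost's tree
    <Directory /srv/www/client1/htdocs>
        Options -Indexes +FollowSymLinks
        AllowOverride None
        Order allow,deny
        Allow from all
    </Directory>
</VirtualHost>
```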

Any other suggestions? Perhaps there are ways to have Apache run locked down to a certain directory. However, I know PHP needs access to /tmp and possibly other things, like curl being able to fully follow redirects.

How are large shared hosting places setting these systems up?
Thank you for any pointers.

jdMorgan

3:05 pm on Sep 18, 2009 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This is an extremely complex subject, and one in which I can claim no real expertise, especially as regards Pureftpd.

But on the HTTP/Apache side, your best practice would be to declare each client's site in a separate VirtualHost, and then either symlink or ScriptAlias PHP (and all other common resources) so that they appear to be present in each client's private filespace while still actually being only a single "real" instance in the server's filespace.

By declaring a separate vHost and DocumentRoot for each client, you make it impossible for any client to use "../../" to "go above" their assigned document root directory via HTTP.
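A rough sketch of that layout, with shared resources aliased into each site rather than duplicated (all names and paths here are placeholders):

```apache
# Each client gets its own VirtualHost and DocumentRoot, so "../" in a
# request URL can never escape the client's tree over HTTP.
<VirtualHost *:80>
    ServerName client1.example.com
    DocumentRoot /srv/www/client1/htdocs
    # Shared resources appear inside each site from a single real copy
    Alias /common/ /srv/www/shared/common/
</VirtualHost>

<VirtualHost *:80>
    ServerName client2.example.com
    DocumentRoot /srv/www/client2/htdocs
    Alias /common/ /srv/www/shared/common/
</VirtualHost>
```

Note this confines requests made over HTTP; PHP code running on the server can still traverse the filesystem unless it is also restricted (open_basedir, permissions, etc.).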

Jim

jmichaels

6:50 pm on Sep 18, 2009 (gmt 0)

10+ Year Member



I declare distinct vhosts and doc roots. I am not sure how aliasing PHP's internals would help; with mod_php, all files must still be readable by the user Apache runs as.

There are other approaches like suexec'd PHP and the like, but those come at something like 30x slower parsing, and the user has to remember to set user/group correctly, or they are still hosed.

vHost and DocumentRoot do not solve <?php file_get_contents('../../../....') ?>, and open_basedir only partially solves that.

Looking for input from hosting places that have been down this road before.
Thanks