| 1:46 pm on Jul 17, 2002 (gmt 0)|
I've seen it on a host before, but I'm not sure whether it's normal or just a lack of knowledge on how to set it up. As long as you can't see their files, I don't think there is much risk of any trouble coming from it.
| 2:13 pm on Jul 17, 2002 (gmt 0)|
diddlydazz, no, I don't think it is normal. I had to move a site from another developer, and the FTP access they gave me allowed me into all their accounts - stupid or what?
It's certainly not necessary; IMO they have not set the server permissions correctly.
Giving out information about who else is hosted on a server is not clever, especially if there was any expectation of confidentiality...
| 2:29 pm on Jul 17, 2002 (gmt 0)|
Yes, this is quite normal on a shared server. Reading another user's files is pretty much equivalent to fetching them through the httpd with a web browser, after all. In theory it would be possible to "hide" the other accounts from you, but for one thing those mechanisms can still be circumvented (e.g. through a CGI script), and they also put quite a strain on server resources.
If a user stores any confidential data on such a machine (such as credit card numbers), then they'll have to encrypt it with a public key, keeping the private key offline. If you need full confidentiality for all your files, your best option is to rent a dedicated server.
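As a sketch of that public-key approach, assuming OpenSSL is available (the filenames and sample data here are purely illustrative):

```shell
# Generate a keypair -- in practice this happens offline, and only
# public.pem is ever copied to the shared server.
openssl genrsa -out private.pem 2048
openssl rsa -in private.pem -pubout -out public.pem

# On the server: encrypt incoming data using the public key alone.
echo "4111-1111-1111-1111" > card.txt
openssl pkeyutl -encrypt -pubin -inkey public.pem -in card.txt -out card.enc
rm card.txt    # only the ciphertext stays on the shared machine

# Offline, with the private key: recover the plaintext.
openssl pkeyutl -decrypt -inkey private.pem -in card.enc
```

Anyone who reads card.enc on the server (another user, a rogue CGI script) gets ciphertext only; without the offline private key it is useless to them.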
| 2:38 pm on Jul 17, 2002 (gmt 0)|
Thanks for your replies,
I don't care as long as they can't do anything to my files ;)
It makes quite interesting reading, looking at all the accounts :) (not really)
| 3:18 pm on Jul 17, 2002 (gmt 0)|
Make some chmod changes to your own user directory. If your directory "bob" is in /home/bob, go into /home/ and run a chmod command so that group and other can't read/write/execute your files. :)
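A minimal sketch of that chmod, using a scratch directory in place of the real /home/bob:

```shell
# Stand-in for /home/bob; adjust the path for a real account.
dir=$(mktemp -d)
chmod 755 "$dir"        # typical default: group and other can read/enter

# Remove read/write/execute for group and other; owner keeps full access.
chmod go-rwx "$dir"

stat -c '%a' "$dir"     # prints 700: owner-only access
```

(As the thread goes on to point out, this alone also locks out the webserver, so it is only half the story.)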
| 9:54 pm on Jul 17, 2002 (gmt 0)|
The big problem of web hosting on Unix is that you don't want other users on the shared host to read your files, as EliteWeb has mentioned. However, you *do* want the webserver to read them! Therefore you cannot just chmod go-rwx because the webserver will not be able to enter your home directory to retrieve your files/scripts. Somehow you want to exclude the other users, except the webserver process...
That is why some people change their directories to be executable but not readable, i.e. chmod 711. Everyone can enter your directory (execute permission), including the webserver, but they cannot get a file listing. That is fine for the webserver, because it knows exactly which file it needs, but it is awkward for human users: usually a human needs to browse around a directory to find the file he/she wants to look at.
It is not a good solution and it only works "okay". I am with a web host that provides a suid command to toggle the owner of a file between the webserver and yourself, so that you can safely set it to mode 600. But it is quite troublesome if you need to do it for every file...
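The 711 setup described above can be sketched like this (filenames made up; note the listing restriction only bites for group and other, since the owner keeps the read bit):

```shell
docroot=$(mktemp -d)
echo "hello" > "$docroot/index.html"
chmod 644 "$docroot/index.html"

# Execute (traverse) but no read (listing) for group and other.
chmod 711 "$docroot"

# Anyone who knows the exact filename can still open it --
# which is all the webserver needs:
cat "$docroot/index.html"    # prints: hello
# ...but "ls" run by another user fails, because listing a
# directory requires its read bit.
```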
| 10:01 pm on Jul 17, 2002 (gmt 0)|
How about getting the admin to do something like:
$> chown bob:apache /home/bob
$> chmod 770 /home/bob
I'm not a Linux expert but that's the way I'd go about trying to secure it the way you want...
| 10:20 pm on Jul 17, 2002 (gmt 0)|
$> chown bob:apache /home/bob
$> chmod 770 /home/bob
That's exactly one of the examples I was hinting at above, that looks secure but really isn't. All another user on the same server has to do is write a CGI script that allows him to browse through directories on the server. Since this script runs under the user:group of the httpd, your chmod 770 files above will all be accessible to him without any problems. Nothing gained other than an administrative nightmare: Do you really want to involve an admin each time you upload a new file?
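To make that attack concrete, here is a deliberately crude sketch of such a browsing script (hypothetical filename; a real attacker's version would be fancier). Because the httpd runs it under its own user:group, it can list anything that group can enter — including the chmod 770 home directories above:

```shell
# Write the rogue CGI to a file. When executed by the webserver it
# runs under the httpd's user:group, so it sees whatever that group
# can see.
cat > browse.cgi <<'EOF'
#!/bin/sh
echo "Content-type: text/plain"
echo
# The directory to list arrives in the query string, e.g.
#   http://example.com/cgi-bin/browse.cgi?/home/bob
ls -la "${QUERY_STRING:-/home}"
EOF
chmod +x browse.cgi
```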
The only practicable solution for data that the httpd needs to read and write, and that shouldn't be accessible by other people on the same machine, is to use something like cgiwrap. This temporarily changes the identity of the CGI process to that of the user, so that it can access directories and files that are closed to others. Of course, the user needs to understand how to use this program, and it's definitely not worth the effort just for the normal HTML files sitting around. And if you have really critical data, then you should encrypt it all the same.
| 6:56 am on Jul 18, 2002 (gmt 0)|
Am I missing something... diddlydazz talked about being able to use ftp to see other directory names but not their files.
If they set up his user account with FTP locked to his own directory, that should solve the FTP issue: he would FTP into his dir as if it were the root of the server and would not (using FTP) be able to get into other accounts or a higher level.
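How that lock is configured depends on the FTP daemon; as one example (assuming ProFTPD, which this particular host may or may not be running), a single directive chroots every user into their home directory at login:

```
# /etc/proftpd.conf (fragment)
# Each user's home directory appears as "/" in their FTP session,
# so they cannot ascend into /home or see other accounts.
DefaultRoot ~
```

Other daemons of that era offer the same effect under different names (wu-ftpd, for instance, does it via guest-group entries in its ftpaccess file).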
| 10:41 am on Jul 18, 2002 (gmt 0)|
Looks like a lot of unix crew in ;)
<<-- If they set up his user account with ftp locked to his directory entry that should solve the ftp issue because he would ftp into his dir as if it were the root of his server and would not (using ftp) be able to get into others or a higher level.
That's how it is with the other 7 hosts that I use; I don't understand why it has to be like this here.
The techie (who I am starting to doubt for other reasons) said "oh this is the way it HAS to be". I said that none of my other hosts are like this, and he replied, and I quote ;) "We are a professional hosting company, and I have been doing this for 15 years".
So I don't know :)
I don't care as long as it is secure.
| 1:04 pm on Jul 18, 2002 (gmt 0)|
ftp locked to his directory entry that should solve the ftp issue
Yes, this solves the "FTP issue". Since the FTP daemon only needs access to a handful of programs (ls and gzip, essentially), this part is very easy to implement. It also makes it possible for the host to offer "anonymous FTP" for your domain on a shared server. I couldn't live without this last feature myself.
However, it doesn't change the fact that a CGI script can (and potentially needs to) access almost everything on the machine. There's also the problem that any nontrivial hosting account probably includes shell access, so that limiting the view by FTP may be nice and practical, but rather pointless as a security measure.
Ok, now I read the original post again very carefully, and noticed that it indeed only mentions FTP. As long as you're talking about FTP alone, then yes, I don't see why you'd need access outside of your account through this one channel, and the restriction is easy to implement. But it is very important to remember that doing so improves the system security only marginally if at all. It is mostly done for practical reasons, and to keep any potential newbie customers from getting confused.