Forum Moderators: phranque

Secure Folder

Without More Passwords


Jeremy_H

8:31 pm on Nov 30, 2005 (gmt 0)

10+ Year Member



I'm constantly working on my website from many different computers, and I sometimes need access to certain files, such as Photoshop or CAD documents, that I want to keep secure because of their nature.

I find the easiest way to access these files is to upload them to my server. But I know this offers me little protection, since I don't have access to any place on my server that isn't automatically made public.

Is there an easy way to create a secure folder online that is not accessible over HTTP, but only when I FTP into my site to download the files?

I'm trying not to deal with any more passwords than the one I need to log into FTP.

skinter

10:06 pm on Nov 30, 2005 (gmt 0)

10+ Year Member



Don't put it in the public_html (or www, or whatever your server calls it) directory.
Put it in your home directory, or in your FTP directory (public_ftp, though that one is still public).

Other than that, I don't know what to tell you.

Jeremy_H

6:47 am on Dec 5, 2005 (gmt 0)

10+ Year Member



Thanks for your reply skinter,

Unfortunately, I don't have a section available to me that isn't automatically public. (I only get what's in the public_html folder).

It's not a life or death situation, but I thought I could do something about it with an .htaccess file.

Thanks again

jdMorgan

1:52 pm on Dec 5, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you are only interested in using FTP to access the files, then you could indeed use .htaccess to prevent access via HTTP.

Either mod_access or mod_rewrite can be used to deny any access to a subdirectory via HTTP.

If desired, you could create exceptions to allow HTTP access from specific IP addresses or hostnames, or with a password.

Since .htaccess does not affect access via the FTP protocol, no change would occur in the 'FTP view' of your server.

See Apache mod_access, and the [F] flag of RewriteRule in Apache mod_rewrite.
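
For example, a minimal .htaccess along those lines might look like this (assuming Apache 1.3/2.0 with mod_access enabled; the IP address is just a placeholder you would replace with your own):

```apache
# Deny all HTTP access to this directory...
Order Deny,Allow
Deny from all
# ...except from one trusted IP address (placeholder - use your own)
Allow from 192.0.2.10
```

Or, if you prefer the mod_rewrite route, something like this should return 403 Forbidden for every HTTP request to the directory, while FTP access is unaffected:

```apache
RewriteEngine On
# Forbid every HTTP request to this directory
RewriteRule .* - [F]
```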

Jim

Jeremy_H

8:56 pm on Dec 5, 2005 (gmt 0)

10+ Year Member



Thanks so much!

I was able to create an .htaccess file and put it into the secure folder.

All the file says is:

Order Deny,Allow
Deny from all

However, does anybody see any potential bugs or pitfalls with this?

Thanks again!

jd01

4:53 pm on Dec 7, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I use something similar to this for access to sites that have online stats...

If you use Firefox's User Agent Switcher extension, you can set a custom user-agent string and allow access to the protected directory only for requests carrying that string.

(It's like password protection, only different: you can use the same user-agent string for all sites/directories, and switch once to gain access to all of them.)

It looks something like this:

RewriteEngine on
# Let through only requests whose User-Agent is exactly this string
RewriteCond %{HTTP_USER_AGENT} !^my_custom_string-1234$
# Everyone else gets 403 Forbidden for the protected directory
RewriteRule ^theprotected/directory/ - [F]

Just another idea.

Justin