Forum Moderators: phranque
I am pretty sure that the placement of the phpaccess105UnixServer.php file relative to the directory structure and file locations is OK. It just isn't protecting the PHP files, which is a big problem since the site depends on PHP.
Thanks.
Dave
Welcome to WebmasterWorld!
Is this an Apache 1.x server?
Do you have access to the httpd.conf configuration file?
If so, examine the order of the LoadModule directives in httpd.conf. The PHP module must appear *before* mod_auth and mod_rewrite if you wish to use these modules to protect PHP.
Under Apache 1.x, modules are executed in the reverse order of their position in the LoadModule list. It is very common for server admins to simply add PHP to the end of the list, which gives it the highest priority and thereby disables all other modules for PHP file requests.
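For reference, this is the sort of .htaccess protection we're talking about. The file path and realm name are only examples; point AuthUserFile at a real password file created with the htpasswd utility:

```apache
# Example .htaccess - Basic authentication for a directory.
# AuthUserFile path is an example; use your real htpasswd file,
# stored outside the web root.
AuthType Basic
AuthName "Members Only"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

On a correctly configured server this protects every file in the directory, PHP included. On a server where PHP loads after the auth module, PHP requests sail right past it.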
Jim
Jim - This sounds like it. The web site is hosted by GoDaddy.com. I have no idea if I have the access you describe but I will attempt to find out.
Thanks!
Dave
The only substantive suggestion they offered was that I can protect the files with a PHP script. I've used PHP for user/password access to individual files. Can it also protect entire directories, in a way similar to .htaccess?
Is there another solution that GoDaddy might support? I also haven't given up on getting GoDaddy to solve the problem, since I don't know the cause for certain, although module loading order sounds likely. I may need a new host, but I don't want to change unless it's truly necessary.
Thanks again for your help.
Dave
Is there some important reason GoDaddy might not want to make this change to their server configuration?
Is there some other way to do what I want to do without writing a specific PHP script? I guess it wouldn't be so bad to "include" this script in every PHP file in all directories and subdirectories, but it would be a pain and rather inelegant.
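From what I've read, some hosts let you prepend a script to every PHP request from .htaccess, which would avoid editing each file by hand. I don't know if GoDaddy allows it; this assumes php_value directives are permitted and that PHP runs as an Apache module:

```apache
# Hypothetical: automatically runs auth_check.php before every PHP
# file in this directory and below. Requires mod_php and a host that
# permits php_value in .htaccess; the path is only an example.
php_value auto_prepend_file /home/example/auth_check.php
```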
Dave
If the server config is incorrect, then they should fix it. Do insist on talking to the 'hosting' department!
Why wouldn't they? Inertia. Fear. Ignorance? I dunno.
Rather than trying to put a band-aid on a major wound, I suggest you talk to their hosting support.
Jim
"In regards to the .htaccess issue, using an .htaccess file will not protect PHP files on our servers. You can protect your html files and folders, but not your PHP files. Our servers are currently configured on an SBox, which does not allow for password protection of PHP files. This cannot be changed for your hosting account, and cannot be changed server-wide either. I apologize for the confusion."
What is an SBox? Is it true that it cannot be changed server-wide? Why would they do this? There must be some benefit.
I think I need a new web host that does protect PHP files via .htaccess. Can anyone suggest someone?
Finally, can anyone suggest an alternative to switching hosts? I really don't have time for that right now.
Thanks. I've learned a lot here.
Dave
If you really want to avoid moving, then you could change all the links to php files on your site so that they link to some other filetype, and then use mod_rewrite to check authorization and then rewrite back to the php files. This may or may not work, so do a small test before committing to it.
They used SBox to make their job easier, not yours. This is the cost of cheap hosting.
Jim
This problem came to light when a user googled himself and gained unrestricted access to the entire database, including scripts allowing him to edit any data. I am concerned that any solution involving links will not solve the googling/web-crawler problem. A new host may be the only feasible solution.
GoDaddy's most recent suggestion:
Some suggestions are to keep the PHP files themselves with file level protection. You can accomplish this through using session variables and PHP scripting to check for these variables, sites like www.planetsourcecode.com have several examples (that are also free) on using PHP to protect individual pages using sessions and login information.
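As I understand it, the kind of per-page check they mean looks something like this. This is a rough sketch; the file name, session variable, and login path are all mine, not theirs:

```php
<?php
// auth_check.php (hypothetical name) -- include this at the top of
// every protected page, before any output is sent:
//
//     require 'auth_check.php';
//     require_login();

session_start();

function require_login()
{
    // Assumes the login script sets $_SESSION['authenticated']
    // after verifying the username and password.
    if (empty($_SESSION['authenticated'])) {
        header('Location: /login.php'); // hypothetical login form path
        exit;                           // stop before any protected output
    }
}
```

That works, but every single PHP file has to remember to include it, which is exactly the maintenance burden I was hoping to avoid.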
Everybody else has thrown a couple of cents in and I thought I might too... Been away from the Apache section for too long.
You could probably use a double rewrite to accomplish what Jim suggested, and I believe it *should* work to protect your files, because the protection of html files is applied (via the server config) before the information is served, and by using a 301 (external) redirect you are effectively creating a new request:
This is a very simple example:
RewriteEngine ON
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /[^.]+\.php\ HTTP/
RewriteRule ^([^.]+)\.php$ /$1.html [R=301,L]
RewriteRule ^([^.]+)\.html$ /$1.php [L]
The basics of how this works are:
The Condition checks THE_REQUEST, so it matches only the original request from the client, never an internal rewrite.
The 1st Rule qualifies anything ending in .php (but only .php) and, when the Condition matches, redirects it to exactly the same path with .html instead.
R=301 bounces any qualifying .php request straight back to the server as a brand-new .html request, and that new request *is* covered by the .htaccess protection.
The 2nd Rule qualifies anything ending in .html and 'silently' or 'internally' serves the information from the original .php file. The Condition does not apply here, because this is no longer the original request.
Obviously, the actual file could be much more complex than this, depending on how your files are actually structured, but I wanted to make sure you knew it was possible.
Hope this helps.
Justin
I'm traveling and will address more fully when I return.
And I can see I've found a great forum here for these questions. My usual sources of help have been entirely silent on this request, not for lack of willingness but for lack of knowledge.
Dave