Forum Moderators: phranque


htaccess - how do I allow read on just one file in a secured folder


kiwiplayer

12:44 am on Aug 27, 2012 (gmt 0)

10+ Year Member



Hey there,

I'm trying to figure out the correct syntax for an htaccess file, any advice would be most welcome! The situation is:

I have a folder PRIVATE, which is secured by htaccess as password protected.
But the subfolder PRIVATE/CLICKCOUNT (a utility script) contains a PHP file that visitors need to access in order for the utility to work.

I tried creating a separate htaccess file in PRIVATE/CLICKCOUNT but an ID / password was still requested.

Then I tried adding this to the htaccess for PRIVATE:

<Files /clickcount/click.php>
order deny,allow
allow from all
</Files>

but that doesn't work either - I still get a request for an ID / password.

I'm not sure what the syntax should be to correctly identify the click.php file in htaccess. I've tried various combinations without success, and I'm not even sure that <Files> permits a directory path!

Anyone got any ideas on what code should be in my htaccess file to make this work ...?

OK then,
Thank Q!

wilderness

1:59 am on Aug 27, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Explore AddHandler and see if that will work.

kiwiplayer

3:23 am on Aug 27, 2012 (gmt 0)

10+ Year Member



Thanks for that suggestion, but that directive only seems to change how defined files are parsed, and doesn't affect access rights.

Anyway I've got it working now, but I had to sacrifice password control over the PRIVATE folder to achieve it.

So the htaccess on PRIVATE is now simply:

Order Deny,Allow
Deny from all

<FilesMatch "click.php">
Allow from all
</FilesMatch>
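(One small caveat on that pattern: FilesMatch takes a regex, so an unanchored "click.php" would also match names like "myclick.php" or "click_php". A tighter version might be:)

# Anchor the pattern and escape the dot so only click.php itself matches
<FilesMatch "^click\.php$">
Allow from all
</FilesMatch>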

I guess password control takes precedence over everything else, and doesn't permit exceptions, so implementing it is a kind of all-or-nothing approach!

OK then,
Cheers

kiwiplayer

3:39 am on Aug 27, 2012 (gmt 0)

10+ Year Member



Footnote for anyone else running into this situation ...

I've now restructured my folders and moved stuff around thus:

The rule above, which says:
PRIVATE - Deny from all, except PRIVATE/CLICKCOUNT/click.php

Then a new sub-folder:
PRIVATE/SECURE - Password controlled via its own htaccess.

lucy24

5:03 am on Aug 27, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Whoops! Didn't realize how long I had this tab open. Subtract two hours...

Putting a supplementary htaccess file in a subdirectory can't and won't work, because the only way a request can reach that inner directory is by going through the outer, locked one. It doesn't simply levitate to its final destination. (Same principle as people going to your domain if you are on shared hosting. The DNS points them to the right physical location, but to get there they have to pass any barriers put up by the server's overall config file, and possibly also your shared userspace.) So you have to lay out all the rules before the request gets to the private directory.

How do you block access to the main folder? You said htaccess. Did you mean htpasswd? Are you using a Require [httpd.apache.org] directive? If so, you may be able to do something with "Satisfy Any".

Require valid-user
Order allow,deny
Allow from {something other than IP here}
Satisfy Any
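So in your case, something along these lines might let click.php through while keeping the password on everything else (the htpasswd path is a placeholder, and this assumes Apache 2.2-style Order/Allow syntax):

# Password-protect the whole directory...
AuthType Basic
AuthName "Private area"
AuthUserFile /full/server/path/to/.htpasswd
Require valid-user

# ...but for this one file, "Satisfy Any" means EITHER a valid
# user OR a matching Allow rule gets in - and "Allow from all"
# always matches, so no password is asked for click.php
<Files "click.php">
Order deny,allow
Allow from all
Satisfy Any
</Files>

A <Files> section in the outer htaccess also applies to files with that name in subdirectories, which is what you want here.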


Does the CLICKCOUNT script use the same method as everything else in the directory? If not (for example, if it's POST while everything else is GET), a simple <Limit> should do it ... Well, except that you've now allowed anyone and everyone to POST within your protected directory. Solve one problem and you've created another ;)
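For the record, that <Limit> version would look something like this - wrapping the Require so it only applies to GET (path again a placeholder):

AuthType Basic
AuthName "Private area"
AuthUserFile /full/server/path/to/.htpasswd

# Only GET requests need a password; POST - and any other
# method - sails straight through, which is the catch above
<Limit GET>
Require valid-user
</Limit>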

kiwiplayer

5:42 am on Aug 27, 2012 (gmt 0)

10+ Year Member



Thanks Lucy for those tips - yes, access rules look fairly straightforward, but they're devilishly tricky to get right, aren't they!

But by shuffling my folders around I have at last managed to get things working the way I intended ...

I've now removed the htaccess for the PRIVATE folder, as there's nothing in it anymore except subfolders and a blank index.html.

Then under PRIVATE I've got CLICKCOUNT, LOGS and SECURE - each folder has a blank index.html and its own htaccess, and the SECURE folder has an htpasswd file too.

CLICKCOUNT is relatively open, as the utility has a signon requirement built-in anyway, but I will use the htaccess file to limit it to just my IP address once the site goes live.
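The IP restriction will be something simple like this (with a documentation-range address standing in for my real one):

Order deny,allow
Deny from all
# Placeholder - my actual IP goes here once the site is live
Allow from 203.0.113.45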

Site LOGS is Deny All, as I can access them via Windows Explorer anyway, so browser access is superfluous.

And SECURE now requests a password, which stops anyone seeing my MySQL IDs, but still allows PHP scripts to access them via predefined constants.
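For reference, the SECURE htaccess is just the standard Basic auth recipe (the htpasswd path here is a placeholder for my real one):

AuthType Basic
AuthName "Secure"
AuthUserFile /full/server/path/to/.htpasswd
Require valid-user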

All sorted, fingers crossed!

I'll bear in mind the points you've made though - thanks - as I'll probably run into more access control issues in the next development phase!

Cheers!

lucy24

9:13 am on Aug 27, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hm. Why the blank "index.html"? If your directories are set to Options -Indexes (which they certainly should be, by default) and there is no named index file, visitors will simply get a 403. In fact that's what 403 means to most ordinary humans ;) But if they get a perfectly blank page they will think there is something wrong, either with your site or their browser.
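The directory-listing switch itself is a one-liner, assuming your host allows Options overrides in htaccess:

# With no index file present, a request for the bare directory
# now gets a 403 instead of a file listing
Options -Indexes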

kiwiplayer

10:00 am on Aug 27, 2012 (gmt 0)

10+ Year Member



Yes, good points there! Still, the index.html method is the way I've always stopped the display of directory listings!

The Options -Indexes approach isn't one I've used, partly because I don't want more htaccess processing overhead than is absolutely necessary (as I read somewhere that htaccess files get refreshed with every single pageload), and also because at this stage I'm not too familiar with all the ins and outs of Apache!

I've got error capturing and logging on every clickable entity on all my pages, so the only way visitors would land on a blank page is if they're sniffing about the site in an *ahem* 'inappropriate' manner (!). So whether they get a 40x or a completely blank page doesn't concern me - I actually prefer that they get a blank page, as it tells them nothing - not even a 40x error - that being the principle of 'Security thru Obscurity' ;) And such activity lands in my logging system, so I get to know about every instance of 'inappropriate' hits on my index.html's.

Mind you, my testing indicates that the blank page approach doesn't always work, and they'd end up getting a 40x anyway, depending on where the error got processed - so I've got my own 40x error pages too, clickable back to the Home Page, and those get logged as well ...
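(The custom error pages use the standard ErrorDocument directive - the paths here are placeholders for my real ones:)

ErrorDocument 403 /errors/403.html
ErrorDocument 404 /errors/404.html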

OK then,
All The Best!