
Blocking Direct Access

Stop Posting Address in Browser Address Bar

     
9:03 am on Nov 20, 2008 (gmt 0)

New User

10+ Year Member

joined:Mar 29, 2005
posts: 14
votes: 0


Hello,

I have a directory on my site, and I want to stop people from simply copying the address shown in the browser address bar and then pasting it back into the bar later (when off my site) to enter that directory again.

I've tried htaccess:

IndexIgnore */*

--------------

<Directory />
Order Deny,Allow
Deny from all
Allow from linkhomes2000.co.uk
</Directory>

The first one does nothing, while the second blocks all access: even when you click the menu on the site itself, you can't reach the directory.

Can this be done? The file in this directory is actually a dummy, and only contains a PHP redirect to the actual file they're accessing.

Kind Regards

2:42 pm on Nov 20, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 31, 2002
posts:25430
votes: 0


This problem would require a referrer-based solution. Your "Allow from" directive above is based on the remote IP address (or the hostname it resolves to), which is not the same thing; it would have allowed only your own server to request that directory from itself.

Unfortunately, HTTP referers are problematic. It is the client (browser or search engine robot, for example) that decides whether it will send a referer or not. Search engines never send a referer. Corporate and ISP caching proxies almost never send a referer. In fact, the HTTP specification *requires* that a client must not send a Referer header unless that referer specifies a single, unique page on the Web which is the source of that referral. In the case of search engines, this cannot be done, because the search engine likely found links to your page on many other pages. In the case of caching proxies, it cannot be done because the cache is used by many employees or customers, and each may have found a link to your page on a different Web site.

So, the problem with requiring the referrer to be from your own site is that a blank referrer will not match the URL of your own site, and requests from search engines, caching proxies, direct type-ins, and links created by JavaScript in Internet Explorer will all arrive with a blank referrer.
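
For illustration only, such a referrer check in .htaccess would look something like the sketch below ("example.com" and the "private" directory are placeholders), and it shows the weakness directly: a blank referrer fails the test, so all of those visitors are shut out.

RewriteEngine On
# Forbid requests for the private area unless the referrer names our own site.
# A blank referrer fails this test, so search engines, caching proxies, and
# direct type-ins get locked out along with the copy-and-paste visitors.
RewriteCond %{HTTP_REFERER} !^http://(www\.)?example\.com/ [NC]
RewriteRule ^private/ - [F]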

A better solution is to use a cookies-and-script based approach: When a visitor comes to an "authorizing page" on your Web site (for example, just the home page, or any publicly-accessible page), set a cookie. If the visitor requests your "private content" page(s), then rewrite that request to a script. If the cookie is set, then the script will "include" the contents of the private page by accessing the file locally in the filesystem, and send the contents to the requesting client. If the client does not send the authorizing cookie, then the script can return a 403-Forbidden response, or redirect the request to your home page, your "sign-up" page, a "help" page, or any other page.
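
As a sketch of that rewrite step (the directory name "private" and the script name "gate.php" are just placeholder names, not anything you must use):

RewriteEngine On
# Internally route every request for the private area to the gatekeeper
# script, passing along the filename that was requested.
RewriteRule ^private/(.*)$ /gate.php?file=$1 [L]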

Using this approach, the private content is never directly accessible using HTTP, and will be "hidden" behind the script that checks the cookie. Therefore, you can block all HTTP access to that private content, because it will only be accessed by the script using a local filesystem read.
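
Here is a minimal PHP sketch of such a gatekeeper, assuming the real files live outside the web root in /home/site/private-files/ and the authorizing cookie is named "view" (all of these names are illustrative only):

<?php
// gate.php - serve a private file only if the authorizing cookie is present.
if (!isset($_COOKIE['view']) || $_COOKIE['view'] != 'web') {
    header('HTTP/1.0 403 Forbidden');   // or redirect to the sign-up page instead
    exit;
}
// Strip any directory part from the requested name so nobody can
// walk the filesystem with "../" tricks.
$file = isset($_GET['file']) ? basename($_GET['file']) : '';
$path = '/home/site/private-files/' . $file;
if (!is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
readfile($path);   // a local filesystem read; the file itself is never exposed via HTTP
?>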

Jim

4:44 pm on Nov 20, 2008 (gmt 0)

New User

10+ Year Member

joined:Mar 29, 2005
posts:14
votes: 0


Hello Jim,

I see what you mean; session cookies seem like the way forward. I'm not clear yet on how I will do this, just wondering how secure session cookies will be?

I can see from what you've said and what I've tried it's not going to work the way I have been trying...

Regards
D)

5:12 pm on Nov 20, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member 10+ Year Member

joined:Apr 30, 2007
posts:1394
votes: 0


Hi Jim,

Some search engines do send a referrer, and I believe it depends on the document's structure; perhaps it's a side effect of the search engine's code that needs to be rectified. Here's an entry from my log (with the actual domain/link replaced):

67.195.37.121 - - [01/Nov/2008:00:52:34 -0400] "GET /stylesheet.css HTTP/1.0" 304 - "http://www.example.com/somepage.php" "Mozilla/5.0 (compatible; Yahoo! Slurp/3.0; [help.yahoo.com...]

I can only guess this happens because stylesheet.css represents a resource Slurp needs for proper display of the page, and it is a secondary request rather than a page. I haven't seen it with regular pages yet.

5:19 pm on Nov 20, 2008 (gmt 0)

Senior Member

WebmasterWorld Senior Member jdmorgan is a WebmasterWorld Top Contributor of All Time 10+ Year Member

joined:Mar 31, 2002
posts:25430
votes: 0


I didn't specify session cookies, but rather just a generic cookie created by a script.

This is done simply by outputting a "Set-Cookie:" response header with a valid cookie-formatted text string in it. The browser will then send that cookie back to the server in a "Cookie:" header with every request it makes to the cookie-specified "realm" until that cookie expires.
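
The exchange on the wire looks roughly like this, using the same cookie as the example further down:

Server response on the first visit:
Set-Cookie: view=web; path=/; domain=.example.com

Browser request on every later visit, until expiry:
Cookie: view=web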

Depending on the cookie syntax you choose, the cookie may expire when the browser is closed, or on a specified date. The hardest thing about cookies is calculating and formatting that expiry date; thank goodness there's a library function in both Perl and PHP to do it! (See the PHP example at the end of this post.)

You also need to be very careful in specifying the cookie's "realm" (its domain and path). Note the leading dot in the example below if you intend to use the cookie across all subdomains; an error there can cause a lot of unexpected behavior.

Here's a Perl code example of how to set a cookie for the entire domain which expires when the browser is closed:


print("Set-Cookie: view=web; path=/; domain=.example.com; HttpOnly\n");

I'm sure you can easily find a similar example for PHP somewhere around here or on the Web.
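
For the record, the usual PHP route is the setcookie() library function, which does the expiry-date formatting for you. A sketch mirroring the Perl example above:

<?php
// Session cookie (expires when the browser closes), valid across all subdomains.
// The last two arguments (secure, HttpOnly) are available from PHP 5.2 on.
setcookie('view', 'web', 0, '/', '.example.com', false, true);

// Or, alternatively, with an explicit expiry (here, one week from now);
// setcookie() formats the Expires date from the Unix timestamp for you.
setcookie('view', 'web', time() + 7 * 86400, '/', '.example.com');
?>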

Jim

 
