Forum Moderators: phranque


stop downloads of css and js file types

can this be done via htaccess


adwhite

12:17 pm on Dec 22, 2005 (gmt 0)

10+ Year Member



Hi,

Before Christmas, with time on my hands, I'm trying to find a way to let JS and CSS files be served to web pages while stopping people from downloading the actual content.

I've tried a number of .htaccess and RewriteMap options, but none have worked successfully.
The particular site I'm trying this on uses the following to serve HTML pages:

RewriteEngine On
RewriteRule ^([a-zA-Z0-9_-]+)\.html$ toptemplate.php?page=$1 [L]

So that may complicate the issue, and may explain why RewriteMap didn't work.

Sosthenes

3:46 pm on Dec 22, 2005 (gmt 0)

10+ Year Member



What about generating the CSS or JS content on the server with PHP? Then there is no static file to download.
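One way to wire that idea up (a sketch only; the file and parameter names here are made up for illustration) is to route stylesheet requests through a server-side generator in .htaccess, so no static .css file exists on disk:

```apache
# Hypothetical: every request for styles/<name>.css is handled by a
# PHP script instead of a static file (css-generator.php is illustrative).
RewriteEngine On
RewriteRule ^styles/([a-zA-Z0-9_-]+)\.css$ css-generator.php?sheet=$1 [L]
```

Note that the generated output still travels to the browser in full, so this removes the static file but not the ability to save the response.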

adwhite

3:50 pm on Dec 22, 2005 (gmt 0)

10+ Year Member



I believe the problem with that is that the generated code ends up inside the page (as with other generated code) and could therefore be stripped out just by viewing the source.

I may well have come up with a solution, which I will post here if anyone is interested, so that any flaws in the logic can be checked.

jdMorgan

4:16 pm on Dec 22, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Since CSS and JavaScript are both handled on the client (browser) side, they are, by definition, already downloaded before your page is even rendered by the browser.

Jim

Sosthenes

9:51 pm on Dec 22, 2005 (gmt 0)

10+ Year Member



What if the generated CSS/JS is included in the head of the page rather than the body? Would it still be viewable then?

Sosthenes

10:19 pm on Dec 22, 2005 (gmt 0)

10+ Year Member



It's an interesting problem, and I'm sure it can be solved.

What if access to the CSS/JS files were restricted to the server-side HTML file on localhost, which included the files via a script or link tag?

Direct download access from anywhere other than localhost could then be denied. I'm not sure how, though.
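The localhost restriction could be sketched with Apache host-based access control (Apache 2.2 syntax; a sketch of the idea, not a working solution):

```apache
# Hypothetical sketch: only allow requests originating from the
# server itself (127.0.0.1) to fetch the CSS/JS files directly.
<FilesMatch "\.(css|js)$">
    Order Deny,Allow
    Deny from all
    Allow from 127.0.0.1
</FilesMatch>
```

The catch is that this tests the requesting client's IP address, not the embedding site's: ordinary visitors' browsers fetch the CSS/JS from their own addresses, so they would be denied too and the page would render unstyled.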

jdMorgan

10:32 pm on Dec 22, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



If you want the client to run the JS or render the page using your CSS, then it will be visible to the client and "downloadable."

Jim

Key_Master

10:41 pm on Dec 22, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



adwhite, I don't understand what you need info on. Please be more specific.

Are you trying to deny hotlinking to css/js files or are you trying to prevent browser caching of these files, or something else entirely?

adwhite

11:33 pm on Dec 22, 2005 (gmt 0)

10+ Year Member



It's one of those Christmas problems you get.

It's a straightforward fact: if it gets served in a web page, then you must be able to download it. But I started to wonder whether there is a way to serve it to the page while stopping people from downloading it. This isn't about stopping hotlinking, but about actually stopping someone downloading it.

I've had some success today but it's still causing me some problems.

What I've got is two domains on the same server (they must be on the same server in case the server goes down).

Domain A has hardcoded into the header

<link rel="stylesheet" type="text/css" href="http://www.domainb.co.uk/downloads/#*$!x.css">
<script src="http://www.domainb.co.uk/downloads/menu.js" type="text/javascript"></script>

So domain A is getting the css and js files from domain B which works fine (both on the same server so no problem if the server goes down)

Then in domain B you put an .htaccess file in the downloads directory that says something like this:

Options -Indexes
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^http://(www\.)?domaina\.co\.uk/ [NC]
RewriteRule \.(js|css)$ - [F]

which only allows access from domain A, thereby stopping someone from typing the file's URL on domain B directly.

It works, although I'm sometimes seeing some strange results; I'm sure that if there is a flaw in my logic, the people in this forum will find it.

jdMorgan

12:05 am on Dec 23, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



The problem is that you're basing the access control on HTTP_REFERER. It is up to the client and any caching proxies between the client and your server (such as AOL's proxies or corporate or other ISP proxies) to pass that header to your server. Also, client-side software, such as Norton Security, may block the HTTP_REFERER header. In many cases, the referrer is blocked without any knowledge of your visitor, so if you block them, they'll assume your site is broken and browse/shop elsewhere. And if the visitor is truly malicious, the referrer can in fact be easily spoofed, and there is no way for the server to detect it.
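If one were to keep a referrer check despite these caveats, a common mitigation (a sketch only, using the thread's domain names) is to let requests with a blank referrer through, so visitors behind referrer-stripping proxies or privacy software aren't served a 403:

```apache
RewriteEngine On
# Let blank referrers through (stripped by proxies or privacy software);
# only forbid requests that carry a referrer from some other site.
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://(www\.)?domaina\.co\.uk/ [NC]
RewriteRule \.(js|css)$ - [F]
```

This trades protection for compatibility: anyone who sends no referrer, or spoofs one, gets the files, which is exactly the weakness described above.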

If the CSS or JS is to affect the client browser, then in fact, it is already 'downloaded', and available in the client's cache. It has to be, since CSS and JS are processed by the client, not the server. So, it's a simple matter to save it to disk.

The server 'cannot tell' any difference between a 'view' and a 'download' request, unless you do client-state tracking on the session using cookies and a script to manage the current state of the visitor's session. Simple referrer-based access control is incapable of reliably handling this problem.

If you have CSS that's so cool or JS that is so clever that you don't want it copied, then the only foolproof solution is to take it off the Web, or retain an attorney to actively enforce your copyright under the DMCA. Referrer-based access control simply won't be sufficient. We've obviously failed to convince you of this fact, but a search of WebmasterWorld for 'stop download' or 'hotlinking' or 'referrer unreliable' will turn up hundreds of threads, all with the same conclusion... Sorry.

Jim

adwhite

8:35 am on Dec 23, 2005 (gmt 0)

10+ Year Member



Hi Jim,

It's purely a hypothetical problem; this would be far too much work in the normal scheme of things, although I can see people whose sites are constantly ripped being interested.

If there is no viable solution, then that's that, but in a search of this forum I did not find a thread containing this idea.
The HTTP referrer is possibly the sticking point, but are there other ways to identify the referring site (IP, for instance)?

People on this site have much more understanding of Apache than I do so someone may be inspired.