Forum Moderators: open
I was wondering if there is a way to protect JavaScript code?
I took a long time to code some nice tools for my visitors, but I'm afraid they will be copied easily without anyone even asking for permission.
It's not that I don't want to share; I'd just like people to ask first, as a matter of respect :)
Thanks
As a bonus, obfuscated code should be smaller and therefore quicker to load.
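As a rough illustration of that size claim (all names here are made up), a minifier shortens identifiers and strips whitespace without changing behaviour:

```javascript
// Original, readable version of a hypothetical helper.
function addSalesTax(netPrice, taxRate) {
  var tax = netPrice * taxRate;
  return netPrice + tax;
}

// The same function after minification: smaller to download and
// harder to read, but functionally identical.
function a(n, t) { return n + n * t; }

console.log(addSalesTax(100, 0.5)); // 150
console.log(a(100, 0.5));           // 150
```

The saving is real but modest; obfuscation only deters casual copying, since the browser still receives working source.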
I've never used such tools, but if your code uses the eval statement, I would test that the obfuscator handles it correctly.
Kaled.
I've just had a go with the PHP code provided. It does seem to work.
I saved as .mht, but the part that referred to the script was blank. So it does seem to bar all convenient methods of grabbing an external script.
i.e.
Add the .JS extension to your httpd.conf or .htaccess file:
AddType application/x-httpd-php .js
In EVERY .JS file, add the appropriate header info with PHP as the FIRST line:
<?php
header("Content-type: text/javascript");
?>
... rest of scripts here ...
In your HTML pages, include the .JS file of choice:
<script type="text/javascript" src="somescripts.js"></script>
That's it. When the .JS file is parsed by PHP on request, it is not stored in the visitor's cache, AND visitors cannot see its contents when viewing the source.
Apparently. I haven't tried it, myself. :)
I don't care about the cache - unless I'm missing something it's simply not relevant.
Kaled.
As I understand it, browser behaviour depends on the content-type: this has been set to text/javascript. When I enter the URL of a javascript file on my website, IE offers to download it. PHP or no PHP, I see no reason for IE to behave differently, but even if the content-type changes, I could still use an HTTP header checker that delivers raw page contents.
Is there a test site out there? I'll happily add a correction if I'm wrong.
Kaled.
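To illustrate that last point (a toy sketch, not a real header-checker tool): any HTTP client receives the body as plain text, whatever the Content-Type header says. A minimal parser that splits a raw response into headers and body makes it obvious why the script source is always recoverable:

```javascript
// Split a raw HTTP response string into header fields and body.
// The body comes back as-is regardless of Content-Type, which is
// why a header-checker tool can always show the script source.
function splitResponse(raw) {
  const sep = raw.indexOf("\r\n\r\n");
  const headerBlock = raw.slice(0, sep);
  const body = raw.slice(sep + 4);
  const headers = {};
  for (const line of headerBlock.split("\r\n").slice(1)) {
    const i = line.indexOf(":");
    headers[line.slice(0, i).trim()] = line.slice(i + 1).trim();
  }
  return { headers, body };
}

const raw =
  "HTTP/1.1 200 OK\r\n" +
  "Content-Type: text/javascript\r\n" +
  "\r\n" +
  "alert('still perfectly readable');";

const res = splitResponse(raw);
console.log(res.headers["Content-Type"]); // text/javascript
console.log(res.body);                    // alert('still perfectly readable');
```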
[quote]I could still use an HTTP header checker that delivers raw page contents[/quote]
Yes indeed. I think we're just looking for a hindrance to 'normal' or 'convenient' methods of retrieving the code. Nothing more. How those terms are defined is, of course, another matter. It does seem that you can't just cut'n'paste the src of the script tag, then download the script. That would put most people off bothering.
Maybe further response-header tweaks could 'discourage' the browser from caching the script file.
In my view, the balance of ease of implementation v. hindering effect is quite good (if you can be bothered in the first place that is). But the only 100% certainty involved is that it's not 100% safe - by any measure.
I wonder whether the mime-type/content-type mix-up would cause IE (with SP2) to refuse to use the script in a web page.
1) Keep a .JS file from being downloaded separately
2) Keep a .JS file from being cached
(Building on the PHP header idea ... )
At the top of each .JS file:
[b]somepage.html:[/b]
<script type="text/javascript" src="somescript.js"></script>
[b]somescript.js:[/b]
<?php
// Serve the script only when the request was referred by an allowed
// page; a direct request (empty or foreign referer) is redirected away.
// Two fixes to the snippet as first posted: eregi() is long deprecated,
// and the check must look at HTTP_REFERER, not REQUEST_URI (the request
// URI of the script is always /somescript.js, never "somepage").
// Note: the referer header can be spoofed or stripped, so this is a
// hindrance, not real protection.
$referer = isset($_SERVER["HTTP_REFERER"]) ? $_SERVER["HTTP_REFERER"] : "";
if (stripos($referer, "somepage") === false) {
    header("Location: http://www.google.com");
    exit;
}
header("Content-type: text/javascript");
?>
... script contents ...
Of course, you would need to use an appropriate page name in each instance, or you could set up an array of all of the page names that use the .JS file and walk that array.
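The "walk an array of page names" idea can be sketched in plain JavaScript so the logic is easy to see and test (the array contents and function name here are made up for illustration):

```javascript
// Allowed pages that are permitted to pull in the script.
const allowedPages = ["somepage.html", "otherpage.html", "index.html"];

// Returns true if the referring URL mentions any allowed page name.
// An empty referer means a direct request for the .js file itself.
function refererAllowed(referer, pages) {
  if (!referer) return false;
  return pages.some((page) => referer.toLowerCase().indexOf(page) !== -1);
}

console.log(refererAllowed("http://www.domain.com/somepage.html", allowedPages)); // true
console.log(refererAllowed("", allowedPages));                                    // false
```

The PHP version would loop over the same array and fall through to the redirect when no entry matches.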
Anyway ... if the user requests the page normally, and the server is set up to parse .JS files with PHP, the page behaves normally, and the .JS file is not cached.
http://www.domain.com/somepage.html
If the user tries to get the file by direct download, after seeing its name in the source, then the "somepage" string is missing, and they're redirected to Google (or wherever).
http://www.domain.com/somescript.js
Where else is left for the user to get the .JS file data? Not in the cache, not in the source, not by downloading it ... am I missing a method?
header("Location: http://www.mydomain.com/my_bad_robot_trap.php");
If you are using something like the PHP Bad Bot Script [webmasterworld.com] posted here at WW, snoopers get banned just like a bad bot... :D
The alert only fires when the page is first loaded in a session (on Mozilla, at least). I imagine this may have to do with how the browser caches different types of files. This doesn't necessarily mean functions or variables would be unavailable; it simply suggests that the file is not re-read when the page reloads.
It is also possible that manipulating the Expires header could modify this behaviour.
It's all so alpha! :)
Now that does have me somewhat baffled. The following headers are all used to switch off caching:
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Pragma: no-cache
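In code, that trio of cache-busting headers could be assembled like this before being written to a response (a toy sketch; the header names and values are the real ones quoted above, the helper function is made up):

```javascript
// Build the standard trio of cache-busting response headers.
// The Expires value is a date in the past, so any cached copy
// is treated as already stale.
function noCacheHeaders() {
  return {
    "Cache-Control": "no-store, no-cache, must-revalidate, post-check=0, pre-check=0",
    "Expires": "Thu, 19 Nov 1981 08:52:00 GMT",
    "Pragma": "no-cache",
  };
}

const headers = noCacheHeaders();
console.log(headers["Pragma"]); // no-cache
```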
Kaled.
I do wish people would read threads before posting on them.
The former is outright delusional, the latter pathetically vain. Why would you want to hide your code?
Surely, if their script is all that, then they'd know...
On a different tack..
There are some sites (some even very good ones) that offer scripted systems for menus, and the like - all fully documented and supported. These scripts do get stolen, by which I mean used without licence. I know that Garrett Smith of DHTML kitchen has complained bitterly about this in the past. These scripts aren't 'hidden', and all the instructions are available. Not much can be done in this respect.
But how about scripts that aren't meant to be taken? What is the risk of one's oh-so-ingenious script being lifted off the web and used by someone else, for their own evil ends and gains? Is it fair to say that the more "stealworthy" a script is, the harder it is to port to another page without instruction? (perhaps not).
Does it actually happen much?
So anything you write that runs on the client side is at risk.
That's a simple fact of web life.
Now look at all the insanely successful websites: Google, eBay, Amazon....
Even if you steal their JS and their HTML you've got maybe 1/100th of a percent of anything of value about their site.
Their depth, their value, their success comes from focusing on the server side processes.
Server side intelligence can be hidden. Usually is. And is the way to wealth.
Discussions about hiding client-side stuff are a distraction from making the best of the web.
I'd say the prevailing reason I enjoy coding is for the challenge of tackling something new to me. Any new ideas on protecting code are worth exploring for the same reason -- a problem to solve.
Recognizing the patched together nature of the code I write, I realize that for someone to take it and use it, they would need to be smart enough to reverse engineer my junk to modify for their purposes but not smart enough to write it themselves.
I know that a determined car thief will take a crowbar to my window yet I still lock my doors. Why wouldn't I "lock the doors" on my website? ;)