Is there still no way to stop people saving a web page?
zeus




msg:577490
 12:00 am on Jan 6, 2003 (gmt 0)

I know about the right-click script, but what about people who save the whole site to their hard disk? Is there still no protection against that?

zeus

 

Lots0




msg:577491
 12:07 am on Jan 6, 2003 (gmt 0)

Not that I have heard of.

Once your page is DL'ed onto someone's hard disk, they can pretty much do what they want with it.

txbakers




msg:577492
 1:24 am on Jan 6, 2003 (gmt 0)

Your HTML and images need to be downloaded to the client computer so the client browser can render them.

There is no way to stop that.

nvision




msg:577493
 1:27 am on Jan 6, 2003 (gmt 0)

Sadly I don't remember any examples, but I have come across some sites/pages that won't save completely - it might have something to do with the encoding. So there is something out there that, if it doesn't block saving completely, at least does so partially, in such a way that you don't get the bits you usually want :)

Key_Master




msg:577494
 1:41 am on Jan 6, 2003 (gmt 0)

nvision is right - you can render about 80% of a downloaded Web page useless. It's too complicated to explain how to do it in one thread though.

andreasfriedrich




msg:577495
 1:48 am on Jan 6, 2003 (gmt 0)

Could you post a pointer to an article explaining how it is done?

Saying something is possible but too complicated to explain is not really helpful.

So far I have sided with txbakers: only what is downloaded can be rendered, and what is downloaded can be saved - simply by copying it from the browser's cache or by talking to the server directly.
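
To make that second route concrete, here is a minimal sketch (a Node.js-style HTTP client chosen purely for illustration - any HTTP client in any language will do): whatever the server sends the browser, it sends a program like this just as readily.

const http = require("http");

// Ask the server for the page directly, just as a browser would.
http.get("http://www.example.com/page.html", function (res) {
  var body = "";
  res.on("data", function (chunk) { body += chunk; });
  res.on("end", function () {
    // The raw HTML is now in `body` and can be written to disk,
    // regardless of any right-click or no-save tricks on the page.
    console.log(body);
  });
});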

Andreas

SethCall




msg:577496
 2:02 am on Jan 6, 2003 (gmt 0)

Key_Master:

but you are only talking about stopping a browser, right? If someone made their own HTTP program that merely saved all the info it received, would your mysterious technique stop that too?

Key_Master




msg:577497
 2:15 am on Jan 6, 2003 (gmt 0)

andreasfriedrich,

AFAIK, there are no articles that explain how to do this. So many people think it's impossible that it's made me think there might be a market for this type of information. I will, however, sticky you an example URL if you are interested, and maybe you can extract something useful from it.

One thing I've learned over the years is that just because somebody says something is impossible doesn't necessarily make it so. That piece of advice should be about as useful as the comment "There is no way to stop that".

If someone made their own HTTP program that merely saved all the info it received, would your mysterious technique stop that too?

Yes, it could be done. With a good spider trap the downloader wouldn't get that far though.
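
To give a feel for the simplest form of such a trap (a hypothetical Node.js handler for illustration only, not Key_Master's actual method, which he hasn't described here): requests whose User-Agent matches a known offline downloader get refused before they see any content.

const http = require("http");

// Hypothetical blocklist of offline-downloader User-Agent strings.
const blocked = /wget|httrack|webzip|teleport/i;

http.createServer(function (req, res) {
  var ua = req.headers["user-agent"] || "";
  if (blocked.test(ua)) {
    // Looks like a site ripper: refuse the request.
    res.writeHead(403);
    res.end();
    return;
  }
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end("<p>Real page content</p>");
}).listen(8080);

Of course, a downloader that lies about its User-Agent sails straight past a check like this - which is exactly the weakness the rest of the thread keeps circling back to.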

andreasfriedrich




msg:577498
 2:21 am on Jan 6, 2003 (gmt 0)

I will however sticky you an example URL if you are interested

That would be great Key_Master.

Andreas

txbakers




msg:577499
 2:43 am on Jan 6, 2003 (gmt 0)

I would appreciate that sticky as well.

I don't believe in impossibilities either.

Impossible means someone hasn't figured it out yet.

ggrot




msg:577500
 2:47 am on Jan 6, 2003 (gmt 0)

I'd be interested in seeing that too. I'm guessing that you might be detecting the user agent - perhaps certain browsers send different user agents when downloading vs. when browsing? I don't know if I'm off base there.

Unfortunately, I do believe it is impossible to completely prevent someone from downloading and saving your content if they are smart enough to work through it. As was said above, to let people browse the content, you have to provide a way for it to be downloaded in the first place. You can do some things such as offering encrypted content that is decrypted through some Java applet or something, but even then, it isn't impossible to access and save the content; it's just another layer of difficulty.

chiyo




msg:577501
 2:57 am on Jan 6, 2003 (gmt 0)

If you use server-side scripting like CFM or PHP, you can hide a lot of code from people who may want to copy it, if that is your concern.
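
As a rough sketch of what that means (Node.js here rather than the CFM/PHP mentioned above, and the pricing logic is made up): only the HTML the server script produces ever reaches the visitor, so saving the page reveals nothing about the logic behind it.

const http = require("http");

http.createServer(function (req, res) {
  // This calculation runs on the server and is never sent to the client.
  var price = (10 * 0.9).toFixed(2);
  res.writeHead(200, { "Content-Type": "text/html" });
  // Only the finished HTML below goes over the wire.
  res.end("<p>Today's price: $" + price + "</p>");
}).listen(8080);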

Once any page is downloaded to someone's hard disk (as happens every time they view it), your copy, images, and code are still your copyright.

Having said that, publishing something publicly on the Web means you make it available for any personal use to those who download it.

andreasfriedrich




msg:577502
 3:17 am on Jan 6, 2003 (gmt 0)

I admit it does make it quite a bit harder. But carefully crafting a bot to retrieve the site is still possible and successful - then again, that is almost always the case given enough effort and resources.

All in all an impressive solution.

Andreas

GeorgeGG




msg:577503
 3:18 am on Jan 6, 2003 (gmt 0)

I would appreciate that sticky also

I have a JavaScript page that uses the 'SRC' attribute to send client info to a server script, which returns info back to the client as JS variables.

How and where the script is requested from determines what info the client receives, if any.

Adding a URL to my database will allow another site to use the page/script.

GeorgeGG

dingman




msg:577504
 11:58 pm on Jan 6, 2003 (gmt 0)

But carefully crafting a bot to retrieve a site is still possible and successful.

Indeed. If you're after a particular site, it's not even terribly hard, since the routines to get around the protections don't have to be sophisticated. I've done it before, when I was after the content of a site hosted by a large commercial entity with lots of roadblocks in place. (The desirable content had all been written by either myself or one of a small group of friends, without relinquishing our copyright.)

hurlimann




msg:577505
 12:10 am on Jan 7, 2003 (gmt 0)

Yes, if you mean the source - but search engines will not see the page, as it is encrypted.

I can SM an example for those that wish to see it.

Key_Master




msg:577506
 12:40 am on Jan 7, 2003 (gmt 0)

I could build a spider-friendly Web page that (assuming you were able to copy the code to begin with) would absolutely fall apart in a browser and wouldn't display properly without extensive hand editing.

What good is the ability to download the page if the rules for its proper display changed the moment it was downloaded?

Syren_Song




msg:577507
 3:42 am on Jan 7, 2003 (gmt 0)

Key_Master -

I'd appreciate a copy of that coding via SM too, if you don't mind.

GeorgeGG -

Same goes for your javascript coding, if it's too long to post here (or if you'd just prefer not to post it).

I know several folks who'd be more interested in having websites done for them if they had better protection available to prevent easy downloads of their imagery.

GeorgeGG




msg:577508
 4:58 am on Jan 7, 2003 (gmt 0)

Syren_Song

It would be kind of hard to post about 400k of server code :)
(Don't have any HTML/JS code to speak of - it's all server scripts.)

But I will snip a little webpage JS stuff:
<SNIP>
<script type="text/javascript">
// GG_sk is a fallback value, presumably defined elsewhere on the page.
// Load the remote script, passing this page's URL (or the fallback) as the ?A= parameter.
var GG_du = document.URL ? document.URL : GG_sk;
document.write("<script type=\"text/javascript\" src=\"http://www.#####.###/~georgegg/G/_.js?A=" +
  escape(GG_du) + "\"></" + "script>");
</script>
<SNIP>

By sending the JS 'document.URL', the server script checks a database for an allowed webpage/site.

The server returns JS variables... and the lines:
// Bust out of any framing page.
if (window != top) top.location.href = location.href;
// If the calling page isn't on the allowed site, redirect to it.
if (document.URL.indexOf('http://www.#####.###/~georgegg/') == -1)
  window.location = "http://www.#####.###/~georgegg/";

This checks whether the sent document.URL really is the calling webpage/site. If not, it redirects to 'my' site.

You can send any client-side JS variable to the server this way, like the JS 'document.referrer'.

Back in the late 90s I used something like this to prevent people from copying my MIDI pages and linking to the files.

GeorgeGG

pissant




msg:577509
 7:56 am on Jan 13, 2003 (gmt 0)

I have a question... why on earth do you need to hide your code?
I'm sure everyone knows that the web became popular at least in part because of the view-source aspect, with everybody finding new ways to do stuff and sharing techniques.
This is certainly the way that I learnt! Why hide it? To me that is just against the spirit of the whole thing.

Key_Master




msg:577511
 8:28 am on Jan 13, 2003 (gmt 0)

I'm sure everyone knows that the web became popular at least in part because of the view source aspect.

I sincerely doubt that most surfers even know what view source is, much less what it means. Easy access to information and pure greed spurred Internet growth.

People hide their code because of the lazy thieves out there who want to copy your stuff. I know - I've had it happen to me several times. A lot of people (myself included) won't post source code or media to their sites because of this concern.

Orange_XL




msg:577512
 10:43 am on Jan 13, 2003 (gmt 0)

In Internet Explorer, this little snippet will stop it from saving web pages (actually it will start saving, but when it comes to this snippet it will abort and delete everything it has already saved).


<frame><noframes></frame></noframes>

If you use (i)frames, it will still work, but you'll have to put it in the frame HTML (not the frameset page).

A different way is to put an illegal drive letter (like: z:\my documents\...) in a javascript or stylesheet src attribute.
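
For what it's worth, that second trick looks roughly like this (a sketch only - the paths are made up, and nothing here is guaranteed to trip up any particular browser's save routine):

<script type="text/javascript" src="z:\my documents\nothing.js"></script>
<link rel="stylesheet" type="text/css" href="z:\my documents\nothing.css">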

phaze3k




msg:577513
 10:52 am on Jan 13, 2003 (gmt 0)

There is absolutely no way to stop people from viewing your HTML source. You can use JavaScript obfuscation to make it harder to view the source, but as you have to put the decryption routine in the source too, that's not going to stop someone with a clue from viewing your source.
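
To illustrate what that obfuscation usually amounts to, here is a deliberately trivial sketch (real versions just use fancier encodings, but the structural weakness is the same): the markup ships in an escaped form, and so does the routine that decodes it.

<script type="text/javascript">
// "%3Cp%3EHello%2C%20world%3C%2Fp%3E" is just escape()'d HTML for <p>Hello, world</p>.
var encoded = "%3Cp%3EHello%2C%20world%3C%2Fp%3E";
// The decoder is right here in the source, so anyone can run it themselves.
document.write(unescape(encoded));
</script>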

Why exactly are you so concerned with hiding your HTML source anyway? If you're using insecure hidden values in HTML forms then hiding the source isn't going to stop anyone, they can easily see what their browser is posting. If you're worried about someone copying your site and setting up something which looks the same (maybe to capture credit card numbers) then there's nothing to stop someone re-creating a site that looks very similar anyway, even if you could stop people viewing your HTML.

Mikkel Svendsen




msg:577514
 11:37 am on Jan 13, 2003 (gmt 0)

If you want your pages crawled by search engines (and I guess that's why you are here, right?) then you have to let them download your page - otherwise they can't parse it.

I am running some advanced spiders (doing different stuff). Please sticky me the example of a "non-downloadable" page or site. If I can crawl it using my spiders, I can save everything I get, on my HD, to a DB, in XML - or anywhere you'd like :)

Brett_Tabke




msg:577515
 11:40 am on Jan 13, 2003 (gmt 0)
If you are going to go to all that hassle with JS, why not just dump it into a GIF or Flash it?

Any page that contains HTML, I will spit back at you in perfect working order. Sure, you can encode stuff via JS, but it can be decoded just as easily.

Mikkel Svendsen




msg:577516
 12:00 pm on Jan 13, 2003 (gmt 0)

The GIF or Flash won't stop either me or you, Brett. You know we can both parse that as well :)

Brett_Tabke




msg:577517
 12:19 pm on Jan 13, 2003 (gmt 0)

Well, I thought about that - I guess the question is about protecting their code. If their code is so complex and proprietary that they think they have to protect it, I guarantee you they are 50% less successful than they could be with simple standard code. Overblown code and site design is the #1 cause of site failure. Focus on your mission to provide unique content on your subject and you can put up some pretty ugly pages and still be mega successful. Page and graphic design are the two biggest fantasy sales jobs on the net. They talk about marketers overselling the "build it, they will come" factor, but we can't hold a candle to the snow job the page design community has done on webmasters.

Mikkel Svendsen




msg:577518
 12:40 pm on Jan 13, 2003 (gmt 0)

You are so right, Brett. I would love to see proof that this sort of "messy" code increases sales - I have only seen proof of the opposite :)

volatilegx




msg:577519
 9:08 pm on Jan 13, 2003 (gmt 0)

I purchased a product that double-encrypts the source code, and it's extremely difficult even to get the source code decrypted once... I couldn't figure out the second encryption scheme. As far as I can tell, unless you're an expert at decryption, the software is pretty bullet-proof. Problem is, the search engine spiders can't read it either.

Sticky-mail for a URL.

<added>Oops can't find the URL and I must've installed this program in my other office (in another state). Sorry</added>
