
HTML Forum

    
Getting large images to appear faster?
Does this simple Javascript need optimization?
R1chard · msg:588040 · 4:43 pm on Sep 2, 2004 (gmt 0)

OK, I've been optimizing some code for a friend to make the display faster. I've replaced what he originally had with code about half the length, and although it's roughly 50% faster, it still feels sluggish on a 700MHz machine. So I'd like to ask: what is the limiting factor in the following?

function Over(x, y, n) {
  document.getElementById('selection').style.left = x;
  document.getElementById('selection').style.top = y;
  document.playerportrait.src = portrait[n].src;
  document.playername.src = name[n].src;
  if (n == 20) { document.getElementById('gillselect').style.visibility = ''; }
  else { document.getElementById('gillselect').style.visibility = 'hidden'; }
}

<a href="sean/sean.html" onMouseover="Over(422,232,5);"
style="position:absolute; left:422px; top:232px; width:82px; height:56px; z-index:4">
&nbsp;</a>

The &nbsp; is in a very large font and is a hack to fix IE6 (it works fine without it in IE5, Opera 5+, Moz, etc.).

There is a kind of grid image in the background that provides the locations and visual references for the links (it's essentially a page that just serves sets of images, so we're really not too bothered about accessibility).

The link above is one of about 20 on the page. When mouseover occurs, a small animated circle ('selection') is moved to the given location (x,y), and highlights the target area.

playerportrait and playername are then swapped to an image from an array (index n). The latter is a small image, and the former is medium-sized: a 485x246-pixel PNG, which the browser resizes to 844x492 (it's about 20kb, and is one of the centerpieces of the page).

Although most of the function is carried out pretty instantly, it's this slightly larger portrait image that takes all the time (well over a second, even after it's loaded). There are other images behind and in front of it, and it's all a complex layout.

So is there anything I can add/delete/change in order to speed things up a little? Rollovers like this have been around for donkey's years, so is there something simple I've overlooked? I've thought about using one huge image with tiles and then moving it into place, but don't know how much it would help...

Thanks for any advice.

 

vkaryl · msg:588041 · 8:38 pm on Sep 2, 2004 (gmt 0)

When you say the nbsp is in a very large font, do you mean you have a piece of css which gives "&nbsp;" the font-size value of 300% or something? I don't know whether it would affect your situation, but I might want to try using <p>&nbsp;</p> instead, even in multiples if necessary, rather than forcing the browser to parse a font size change....

[Added: I think I've seen mention here somewhere that using browser resizing of graphics isn't ideal, either....]

tedster · msg:588042 · 8:52 pm on Sep 2, 2004 (gmt 0)

browser resizing of graphics isn't ideal, either

I'd say that depends on the graphic. If the image is a gif/png with only vertical and horizontal elements, and no curves or diagonals, then a resize is clean and not pixelated.

About the rollover script -- I don't see any preload for the images there. That creates a time lag when the user hovers, because the swap requires another trip to the server. On the other hand, if there are 20 images being preloaded in another chunk of script somewhere, that could also slow things down. I would not include a large preload script in the head section -- I would put it at the very end of the HTML document.
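A preload loop along these lines can be sketched as follows. The file paths, the count, and the split into a testable helper are assumptions for illustration, not the site's actual code:

```javascript
// Hedged sketch of a preload loop, run from onLoad so it doesn't
// delay the rest of the page. Path pattern and count are assumptions.
var PORTRAIT_W = 844, PORTRAIT_H = 492;

function portraitUrls(basePath, count) {
  // build the list of URLs to fetch (pure, so it can be tested)
  var urls = [];
  for (var n = 0; n < count; n++) {
    urls.push(basePath + '/portrait' + n + '.png');
  }
  return urls;
}

// Browser-only part: assign each URL to an Image created at the
// final display size, which warms the cache before any hover:
//   var portrait = [];
//   var urls = portraitUrls('images', 20);
//   for (var n = 0; n < urls.length; n++) {
//     portrait[n] = new Image(PORTRAIT_W, PORTRAIT_H);
//     portrait[n].src = urls[n];
//   }
```

Per the advice above, this block would go at the very end of the HTML document rather than in the head.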

vkaryl · msg:588043 · 10:29 pm on Sep 2, 2004 (gmt 0)

ted: wasn't meaning the quality, just the load-time....

tedster · msg:588044 · 11:48 pm on Sep 2, 2004 (gmt 0)

I get you -- yes, resizing down from a large image to a smaller dimension does create unnecessarily long download times. In the example in the first post, the browser is resizing the image UP -- from 485x246 to 844x492 -- and that should save time compared to downloading a full-sized source file.

As far as images go, there is also time added after the download for decompressing a jpg image. Depending on the degree of compression and the speed of the processor, it can be noticeable -- 5 seconds or more after the packets are received, and the image still isn't rendered.

vkaryl · msg:588045 · 12:12 am on Sep 3, 2004 (gmt 0)

AUGH! Totally missed the "up-size"! Esh.... sorry!

R1chard · msg:588046 · 10:07 am on Sep 3, 2004 (gmt 0)

Thanks for the responses...

vkaryl- yes, there is a line of CSS for making the link font-size: 56px; and to remove the underline. Each link is essentially a rectangular area, and I want the text to be invisible (it's all over a grid-like image behind, and that's what the user points at). The nbsp isn't even needed in Opera/Mozilla, since they allow the empty link to be displayed, and draw it at the size specified in the <a>. I think if I use extra <p>s then I would have problems with overflow or whatever. And I figured that after the links are drawn, nothing else ever happens to them (they stay the same size on hover, and are not processed in any way)

As for the large images, yes, they are highly detailed in both directions (drawn human faces). They are all loaded small (via a preload loop script onLoad, which I'm pretty sure is triggered when everything else is finished?) and then stretched to double their size (i.e. almost to fill the screen). I'm quite happy with the way they look -- the only problem is that even after they have all loaded (about 450k), there is a very noticeable delay (e.g. if you keep switching back and forth between two images, they both take a while).

'time added after the download for de-compressing a jpg image'. Yeah, this is the sort of thing that I'd forgotten about... Hidden overheads with image display. Maybe the processing for the resizing doesn't help, either (I guess it's quicker if it's a neat factor of two?). I suppose there's no way around this. In general, PNG is quicker to decode than GIF and JPG, right?

Would the idea work of having one large tiled image that gets moved around to display one part of it? I think that's how skinned GUI toolbar buttons work in Mozilla...

Or would it help to display them all at once in the same place, and then just change visibility/z-index?
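That last idea -- pre-placing every image and just toggling visibility -- can be sketched as pure bookkeeping plus a one-line DOM application. The element ids of the form 'portraitN' are hypothetical:

```javascript
// Sketch of the "pre-place all images, just toggle visibility" idea.
// Ids of the form 'portraitN' are an assumption for illustration.
function swapVisible(current, next) {
  // return the style changes to apply, so the logic stays testable
  return [
    { id: 'portrait' + current, visibility: 'hidden' },
    { id: 'portrait' + next, visibility: 'visible' }
  ];
}

// Browser usage: track the visible index and apply the changes:
//   var changes = swapVisible(visibleIndex, n);
//   for (var i = 0; i < changes.length; i++) {
//     document.getElementById(changes[i].id).style.visibility =
//       changes[i].visibility;
//   }
//   visibleIndex = n;
```

Because every image is already decoded and laid out at final size, the swap avoids re-decoding and re-scaling on each hover -- at the cost of holding all 20 images in the page at once.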

tedster · msg:588047 · 6:00 pm on Sep 3, 2004 (gmt 0)

I've found it can speed up the final display when pre-loading images into the cache if I use the final dimensions in the preload script --

image01=new Image(844,492)

vkaryl · msg:588048 · 8:21 pm on Sep 6, 2004 (gmt 0)

I hate to ask the obvious, because of course I don't want anyone to think that I think he's too sideways to have thought of it himself....

but these ARE compressed with a decent algorithm, right? I mean something besides the crap one that comes with PSP etc....

R1chard · msg:588049 · 3:26 pm on Sep 7, 2004 (gmt 0)

Um, I'm not sure. The large images are gif (others are png), but I'm not sure of the algorithm used. He's a games hacker/graphics guy, so I'd guess it was all top stuff...

Thanks for bringing it up though. I'll ask him.

StupidScript · msg:588050 · 8:04 pm on Sep 8, 2004 (gmt 0)

one large tiled image that gets moved around

How large would that large image be, filesize-wise? It's obvious there is little concern for users who don't have JavaScript enabled, so this might actually be an option for you.

If the total aggregate size of the images you are displaying is less than 20% (off the top of my head) of the combined 'large' image, you may find the download time of the large image isn't worth the benefit. However, if the filesizes are close, then one image, loaded and clipped, is definitely faster at displaying a portion at the proper size. There's no resize rendering overhead, at least.
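The clipping approach can be sketched like this. The tile dimensions, the vertical-strip layout, the strip filename, and the window element id are all assumptions:

```javascript
// Sketch of the single-image approach: all 20 portraits stacked in
// one vertical strip, viewed through a fixed-size window element by
// shifting background-position. Tile size is an assumption.
var TILE_W = 844, TILE_H = 492;

function spriteOffset(n) {
  // shift the strip up by n tiles so that tile n shows in the window
  return '0px -' + (n * TILE_H) + 'px';
}

// Browser usage, with a hypothetical window element whose CSS is
// width:844px; height:492px; background-image:url(strip.png):
//   document.getElementById('portraitwindow').style.backgroundPosition =
//     spriteOffset(n);
```

One download, one decode, and the hover only moves an already-rendered background -- which is the "no resize rendering overhead" point above, provided the strip is saved at final display resolution.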

R1chard · msg:588051 · 9:01 pm on Sep 8, 2004 (gmt 0)

^ well, yeah, we're assuming that people have both JavaScript and images turned on (it's a noncommercial/fun site about the graphics and animation in a highly-technically-demanding videogame, with hardly any textual content). I suppose with Lynx the links would still work, so they could download all the stuff for opening with other software...

I suppose the large image would be somewhere around 400kb, maybe a little less (I haven't made it; I assume filesize is roughly linear with dimensions? They're quite low color depth, so maybe a lot less). There are 20 images, and most of them are about 20kb, so I suppose some of the overheads would be reduced with only one file. However, the large tiled image would still need to be resized, unless it was saved at double resolution. I guess then you have a tradeoff between download time and render time...

PS: I just checked with Linux Mozilla, and it's slightly different to Win/Moz: the image is not displayed all at once; as they change, they appear in two or three parts from top to bottom. Nothing too drastic, but something I could do without...

tedster · msg:588052 · 9:41 pm on Sep 8, 2004 (gmt 0)

I just checked with Linux Mozilla, and it's slightly different to Win/Moz: the image is not displayed all at once- As they change, they appear in two or three parts from top to bottom.

Are these jpg files? It sounds like a "progressive" jpg - some browsers (including IE) display them all at once, instead of progressively as was the original intent of the file format. I haven't been paying attention to this issue lately -- since I know IE has it wrong, I started saving all my jpg files as standard format so that something renders early on for dial-up users.

However, I assumed that Moz/Win had it right. From what you say, it sounds like only Moz/Linux does -- even though you don't like it.

frogg · msg:588053 · 12:48 am on Sep 9, 2004 (gmt 0)

I don't know if this will make any kind of noticeable performance increase (it's looking like the delay is really in the browser's image handling code), but yes, your original function can be further optimised. Your original code:

function Over(x, y, n) {
  document.getElementById('selection').style.left = x;
  document.getElementById('selection').style.top = y;
  document.playerportrait.src = portrait[n].src;
  document.playername.src = name[n].src;
  if (n == 20) { document.getElementById('gillselect').style.visibility = ''; }
  else { document.getElementById('gillselect').style.visibility = 'hidden'; }
}

You can remove the repeated calls to document.getElementById(...), and replace with a one-time initialised reference to the same object:

// global vars
var el_sel = document.getElementById('selection');
var el_gillsel = document.getElementById('gillselect');

function Over(x, y, n) {
  el_sel.style.left = x;
  el_sel.style.top = y;
  document.playerportrait.src = portrait[n].src;
  document.playername.src = name[n].src;
  if (n == 20) { el_gillsel.style.visibility = ''; }
  else { el_gillsel.style.visibility = 'hidden'; }
}

This will execute marginally faster -- the cost of doing an object reference (document.*), calling a function (.foobar(...)), and performing a hashed name lookup of another object (which is what getElementById(...) is likely doing) every time is obviously slower than just doing an object reference (el_sel.*).

Whether this is a useful speed gain remains to be seen, but the change does also make the code slightly easier to maintain and read. (Although global variables are frowned upon by some people.)

You can make similar optimisations throughout your code, replacing any instance of getElementById with a reference to a global object that is looked up once at page load. (This obviously doesn't apply if you are passing a variable into getElementById -- if the id is dynamic in that way, you can't keep a wide-scoped reference to it so easily.)

Furthermore, you may also gain some performance (again, it may be trivial) by passing fewer parameters into functions. Each parameter passed costs one push onto the stack before entering the function and one pop once inside it; an index lookup into an array will likely cost less. i.e. you could do something like:

// global vars
var el_sel = document.getElementById('selection');
var el_gillsel = document.getElementById('gillselect');
var x_coords = [10, 20, 30, 40, 50, 60, 70, 80];
var y_coords = [10, 20, 30, 40, 50, 60, 70, 80];

function Over(n) {
  el_sel.style.left = x_coords[n];
  el_sel.style.top = y_coords[n];
  document.playerportrait.src = portrait[n].src;
  document.playername.src = name[n].src;
  if (n == 20) { el_gillsel.style.visibility = ''; }
  else { el_gillsel.style.visibility = 'hidden'; }
}

...and modify your HTML accordingly, so that the x and y coords are not passed into the function, only the index is -- and you must ensure that the arrays of x_coords and y_coords are initialised properly (mine aren't; it's only dummy data).

Now, obviously, while both of these optimisations may appear sound in theory, I've not made any tests of them. To ensure that any optimisations you make actually make the code faster (and not slower due to some strange interpreter quirk), you should always perform some real-world benchmarks to confirm your changes are for the better.

The hassles of properly benchmarking JavaScript are beyond the scope of this post (luckily for me).
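For what it's worth, a minimal, era-appropriate timing harness (using Date rather than any modern timing API) might look like this; the iteration count and the wrapped call are illustrative:

```javascript
// Minimal benchmark sketch: run fn many times and return elapsed ms.
// Date has coarse (~10-15ms) resolution, so use a high iteration
// count and compare only relative numbers between two versions.
function benchmark(fn, iterations) {
  var start = new Date().getTime();
  for (var i = 0; i < iterations; i++) {
    fn();
  }
  return new Date().getTime() - start;
}

// Usage: time each version of Over() under identical input, e.g.
//   benchmark(function () { Over(422, 232, 5); }, 10000);
```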

btw, regarding the rest of the thread, I think using a single image is probably the way to go.

Hope some of this helps?

[In case you can't guess, I'm a programmer / software engineer! fwiw, I learnt to optimise code when programming games (10+ years experience), and since then I spent 3yrs at a company who produced a server-side scripting language (which taught me about parsing, execution of interpreted code, VMs and all that kind of stuff). Probably not everyone's cup of tea.]

Actually, you can probably go one step further with the globally cached object references, by doing something like this instead:

var elSelStyle = document.getElementById('selection').style;
...
elSelStyle.left = x;
elSelStyle.top = y;
...

..which again, will be marginally quicker (but might not be as 'useful' elsewhere in the code).

One would probably need to do some benchmarks to see the difference -- and it might not be very much of a gain.

frogg · msg:588054 · 12:57 am on Sep 9, 2004 (gmt 0)

tedster said:
I've found it can speed up the final display when pre-loading images into the cache if I use the final dimensions in the preload script --

image01=new Image(844,492)


I think doing it that way will cause the browser to pre-scale the image at page load time, as opposed to loading the image at one size and then rescaling when it needs to draw the image. Definitely a good thing to do.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
Terms of Service ¦ Privacy Policy ¦ Report Problem ¦ About
© Webmaster World 1996-2014 all rights reserved