How to reduce the load time of a page?
I am trying to reduce the load time of a web page <snip>
<snip> 18.16 seconds load time on a T1 line. I have tried optimizing the images, but it is still around 18 seconds. Can anybody help me reduce the load time of this page?
[edited by: trillianjedi at 7:53 am (utc) on June 15, 2007]
[edit reason] No specifics as per TOS, please [/edit]
Is the problem with the page or the server?
How many K in total and how many HTTP objects in the page?
I don't think there is any problem with the server, because I have other sites on the same server and they are not facing load time problems.
File size: 59.61K
HTTP objects: 83
----Total images: 78 (61 CSS images, 17 content images)
----Script imports: 2
----CSS: 1
I have placed the header and footer in SSI files, but I am not sure whether that will help. Please assist me.
Far too many objects. Cut down to about 10-12 maximum. I saw your numbers and I almost choked! Reduce the count; they can't all be downloaded at once. Strip out the fancy graphics, break long pages into multiple pages, and intelligently reuse graphics.
If you have images used as buttons with text on them, use a single background image and put the text on top with HTML, etc.
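To sketch that idea (class names and the file name are hypothetical): one shared background image serves every button, and the label is ordinary HTML text, so each extra button costs zero extra HTTP requests.

```html
<!-- Hypothetical sketch: one background image reused for all buttons,
     with the label as real HTML text instead of baked into the image. -->
<style type="text/css">
  .btn {
    width: 120px; height: 30px;
    background: url("/img/button-bg.gif") no-repeat; /* one image, reused */
    text-align: center; line-height: 30px;
    font: bold 12px Arial, sans-serif; color: #fff;
  }
</style>
<div class="btn">Home</div>
<div class="btn">Products</div>
<div class="btn">Contact</div>
```

The browser fetches button-bg.gif once and caches it; three buttons (or thirty) still cost a single image download.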
Absolutely - 83 HTTP Objects is the problem.
You either need to cut the 70-odd images in the page, or redesign by splitting it into multiple pages.
Here's your challenge: make it < 10.
Look for opportunities to combine adjacent and/or overlayed images. Don't use the browser as a compositing engine!
Does the 59K include the size of the images? If not, what is the total size of the images?
As people previously said - reduce number of pics.
Since most browsers (IIRC) by default allow 4 simultaneous connections per host, you can try to "trick" them by setting up additional cnames and serving some of the pics/objects from there.
But I would first try to reduce the number of objects, if possible....
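A rough sketch of the cname trick (the img1/img2 hostnames are hypothetical; they would be cnames pointing at the same server). Because the per-host connection limit applies per hostname, splitting images across two hostnames lets the browser fetch more of them in parallel:

```html
<!-- Hypothetical hostnames: img1 and img2 are cnames for the same server.
     The browser treats each as a separate host, so it can open its
     connection limit (4 at the time) to each one in parallel. -->
<img src="http://img1.example.com/images/header.gif" width="400" height="60" alt="">
<img src="http://img2.example.com/images/photo.jpg" width="200" height="150" alt="">
```

Note the trade-off: each extra hostname also costs an extra DNS lookup, which is one reason to keep the number of such hostnames small.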
Oh, and of course you're using height/width attributes on each image, right?
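For reference, this is all that's meant (file name hypothetical). With the dimensions declared, the browser can lay out the page before the image bytes arrive, instead of reflowing everything as each image downloads:

```html
<!-- Declaring width/height lets the browser reserve space immediately,
     so the page renders without waiting for the image to load. -->
<img src="logo.gif" width="150" height="40" alt="Site logo">
```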
I'm very skeptical of these free online tools we all seem to run across.
What would it say about these sites?
The number 1 bookseller on the web:
Documents (6 files): 39 KB (154 KB uncompressed)
Images (67 files): 167 KB
Objects (3 files): 47 KB
Scripts (5 files): 120 KB
Style sheets (1 file): 6 KB
Total: 379 KB (494 KB uncompressed)
Or how about the auction giant:
Documents (9 files): 19 KB (79 KB uncompressed)
Images (40 files): 142 KB
Scripts (13 files): 139 KB (474 KB uncompressed)
Style sheets (3 files): 2 KB (10 KB uncompressed)
Total: 302 KB (705 KB uncompressed)
I checked a few more and my unscientific survey tells me the BIG guys have BIG homepages.
If you are running PHP on Apache, you can add the following line into your .htaccess file:
php_value output_handler ob_gzhandler
As others have also said, kill any extraneous images and make sure the remaining images have been optimized as far as possible to reduce their file size.
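If the pages are static HTML with no PHP, the ob_gzhandler line won't apply, but the same compression can be done at the server level. A sketch, assuming Apache 2.x with mod_deflate loaded (if the module isn't available, this block is simply skipped):

```apache
# Sketch, assuming Apache 2.x with mod_deflate enabled.
# Compresses text content on the fly; images are left alone,
# since GIF/JPEG are already compressed formats.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css application/x-javascript
</IfModule>
```

Text compresses very well with gzip, so for an HTML-heavy page this can cut the document's transfer size substantially at a small CPU cost.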
Many of the big guys do, indeed, have bloated pages.
On the other hand, they presumably know every trick in the book. As well, many of them have deployed their sites to the edge, either using Akamai, etc., or their own networks.
For example, Yahoo apparently has deployed servers directly on the Cox Communications network (or else has direct links to it), and, presumably those of other major ISPs.
These "edge" tricks, however, don't really help users on slower Internet connections, as oftentimes the bottleneck for many users is the "final mile".
One thing to keep in mind is that "the big guys" are ubiquitous and it's hard to avoid using them. Users will put up with a bit more. Additionally, as long as they don't change the bulk of their images often, they will be found in cache, reducing load time. The ubiquity helps to keep their images in cache.
For the most part, the rest of us don't have that advantage. Our users are more likely to walk away in disgust if our site is slow the first time. And our images are more likely to expire in cache, as well.
Unfortunately, that's the reality. The rest of us have to try harder than Yahoo, etc.
This won't help for the initial page load, and I don't know the exact set of circumstances, but in my experience IE won't cache background images. Place them in a hidden DIV at the end of your page, which will make IE cache them for subsequent page loads.
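A sketch of the trick described above (file names hypothetical): list each CSS background image once as a plain img inside a hidden div at the bottom of the page, so the browser fetches and caches the files on the first visit.

```html
<!-- Hypothetical file names. Each img matches a CSS background image
     used elsewhere on the site; hiding the div keeps them out of the
     layout while still triggering a download into the browser cache. -->
<div style="display: none;">
  <img src="/img/nav-bg.gif" alt="">
  <img src="/img/footer-bg.gif" alt="">
</div>
```

Since this adds requests to the current page, it's best placed at the very end of the document so it doesn't delay the visible content.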
Try using some free online tools that will help you analyze your page load times better. <snip>
As people said earlier - reduce the number of pics, flashy graphics and so on.
[edited by: trillianjedi at 1:07 pm (utc) on June 20, 2007]
[edit reason] Please see our TOS re links. [/edit]
Thanks, all, for your input; sorry, I was away, so I was not able to check the forum.
jtara - if I combine adjacent images, how will that help? I guess the size will be the same? Please assist me.
Tastatura - you mean setting up subdomains for the images? Do FF and IE support 4 simultaneous connections?
LifeinAsia - Yes, I am using height and width attributes. How do they affect page load? Can you please explain this and how to avoid it? And the 59.61K does not include images; the images total 48K.
KenB - I am using static pages, no server-side scripting. Can I still use gzip compression on HTML pages?
I have tried to reduce the image sizes and also used CSS to handle some images, but it still did not affect the load time. :(
|if I combine adjacent images, how will that help? I guess the size will be the same? |
The size, actually, will be somewhat smaller.
The big advantage, though, is in reducing the number of images that have to be downloaded.
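To illustrate the combining idea (the file name and offsets are hypothetical): two small icons are merged into one image stacked vertically, and background-position picks out the right slice, so one HTTP request replaces two.

```css
/* icons.gif is a single hypothetical image containing both 16x16 icons
   stacked vertically; background-position selects the visible slice. */
.icon-home { background: url("icons.gif") no-repeat 0 0;     width: 16px; height: 16px; }
.icon-mail { background: url("icons.gif") no-repeat 0 -16px; width: 16px; height: 16px; }
```

Beyond saving requests, the combined file is usually a bit smaller than the separate files, since per-file format overhead (headers, palettes) is paid only once.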