I am trying to reduce the load time of a web page <snip>
<snip> 18.16 seconds load time on a T1. I have tried to optimize the images, but it is still around 18 seconds. Can anybody help me reduce the load time of this page?
[edited by: trillianjedi at 7:53 am (utc) on June 15, 2007]
[edit reason] No specifics as per TOS, please [/edit]
I don't think there is any problem with the server, because I have some other sites on the same server and they are not facing load time problems.
File size: 59.61K
HTTP objects: 83
----Total images: 78 (CSS images 61, images 17)
----Script imports: 2
----CSS: 1
I have placed the header and footer in SSI files, but I'm not sure whether that will help or not. Please assist me.
If you have images used as buttons with text on the image, use a single background image and put the text on top with HTML, etc.
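As a rough sketch of that approach (the file name, pixel offsets, and class names here are made up for illustration): all button graphics go into one combined image, and each button just shifts the background position, so the browser makes one request instead of many.

```css
/* one combined "sprite" image for all buttons --
   sprite.png and the offsets below are hypothetical */
.btn {
    background-image: url(/img/sprite.png);
    background-repeat: no-repeat;
    width: 100px;
    height: 30px;
}
.btn-home    { background-position: 0 0; }
.btn-contact { background-position: 0 -30px; }
```

Then in the markup the label is plain HTML text on top of the background, e.g. `<a href="/" class="btn btn-home">Home</a>`, so changing the wording doesn't require re-cutting any images.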
Since most browsers (IIRC) by default allow 4 simultaneous connections per hostname, you can try to "trick" them by setting up additional cnames and serving some of the pics/objects from there. So, for example, set up new cnames for your images.
But I would first try to reduce the number of objects if possible...
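To illustrate the cname idea above (the hostnames are hypothetical): once the extra cnames point at the same server, you simply split the image references across them so the browser opens a separate pool of connections to each.

```html
<!-- images split across two hostnames so the browser can
     fetch from both in parallel (hostnames are made up) -->
<img src="http://img1.example.com/logo.gif" width="120" height="40" alt="Logo">
<img src="http://img2.example.com/photo.jpg" width="200" height="150" alt="Photo">
```

Note the trade-off: each new hostname costs an extra DNS lookup, so two or three image hosts is usually the practical limit.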
What would it say about these sites?
The number 1 bookseller on the web:
Documents (6 files): 39 KB (154 KB uncompressed)
Images (67 files): 167 KB
Objects (3 files): 47 KB
Scripts (5 files): 120 KB
Style sheets (1 file): 6 KB
Total: 379 KB (494 KB uncompressed)
Or how about the auction giant:
Documents (9 files): 19 KB (79 KB uncompressed)
Images (40 files): 142 KB
Scripts (13 files): 139 KB (474 KB uncompressed)
Style sheets (3 files): 2 KB (10 KB uncompressed)
Total: 302 KB (705 KB uncompressed)
I checked a few more and my unscientific survey tells me the BIG guys have BIG homepages.
If you are running PHP on Apache, you can add the following line into your .htaccess file:
php_value output_handler ob_gzhandler
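If mod_deflate is available (Apache 2.x), static files can be compressed the same way from .htaccess — a sketch, assuming the module is loaded on the server:

```apache
# compress text-based responses on the fly (requires mod_deflate)
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript
</IfModule>
```

Compressing images this way is pointless (GIF/JPEG are already compressed), which is why only the text types are listed.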
As others have also said, kill any extraneous images and make sure the remaining images have been optimized as far as possible to reduce their file size.
On the other hand, they presumably know every trick in the book. As well, many of them have deployed their sites to the edge, either using Akamai, etc., or their own networks.
For example, Yahoo has apparently deployed servers directly on the Cox Communications network (or else has direct links to it), and presumably on those of other major ISPs.
One thing to keep in mind is that "the big guys" are ubiquitous and it's hard to avoid using them. Users will put up with a bit more. Additionally, as long as they don't change the bulk of their images often, they will be found in cache, reducing load time. The ubiquity helps to keep their images in cache.
For the most part, the rest of us don't have that advantage. Our users are more likely to walk away in disgust if our site is slow the first time. And our images are more likely to expire in cache, as well.
Unfortunately, that's the reality. The rest of us have to try harder than Yahoo, etc.
Thanks all for your input; sorry, I was away so I was not able to check the forum.
jtara - if I combine adjacent images, how will that help? I guess the total size will be the same? Please assist me.
Tastatura - you mean I should set up subdomains for the images? Do FF and IE support 4 simultaneous connections?
LifeinAsia - Yes, I am using height and width attributes. How do they affect page load? Can you please explain this and how to avoid it? Also, the 59.61K does not include images; the images are another 48K.
KenB - I am using static pages, no server-side script. Can I still use gzip compression on HTML pages?
I have tried to reduce the image sizes and also used CSS to handle some images, but it still did not affect load time. :(