Forum Moderators: open

keeping it lean


potbelly

12:47 am on Feb 7, 2004 (gmt 0)

10+ Year Member



What are the general rules when trying to reduce the size of your pages?
I have seen tools that wipe out all spaces and leave the code in a solid block. Are these a good idea?
What are the common mistakes that people make?
Are there any good resources out there?

brdwlsh

3:48 am on Feb 7, 2004 (gmt 0)

10+ Year Member



keep image files small--i shoot for less than 20kb for the bigger ones--it all depends on what the dimensions are, though.
search this site for image editing software for ideas on what to use.

choose wisely between .gif, .jpg, and .png

use CSS instead of all that cumbersome HTML stuff--external style sheets are even better.
CSS Crash Course at WebmasterWorld [webmasterworld.com]

i don't use flash--personal preference.

don't double-up on images--this is relevant for buttons and other elements that are repeated. you need only one instance of an image even if it is displayed several times.

don't use audio--unless it is absolutely necessary. :p

write your code; don't mock the page up in a program like photoshop/imageready and then transfer it to html as all images.

that's all i could muster up for now.

pshea

4:58 am on Feb 7, 2004 (gmt 0)

10+ Year Member



hey potbelly, two of my favorite techniques are:

1) After reducing a jpg to the smallest weight possible for the look you wish to achieve, run the image through a 'cleaner' to reduce it even further. A 'cleaner' strips out all the unnecessary information that photo-processing programs attach to a file, such as the camera type used to create it.

2) If you use tables for layout, look at your page and figure out which portion of the table fills roughly the top half of the screen. Make that top portion a separate table entirely, and create the rest of the page as a second table. Because a table won't render until its entire contents have arrived, this gives your visitor something to look at quickly while the below-the-screen content loads.

keyplyr

6:39 am on Feb 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I have seen tools that wipe out all spaces and leave the code in a solid
block. Are these a good idea?

Yes, HTML compression can remove a couple of kb per page. However, removing whitespace from JavaScript may cause it to break, so watch out.
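For example, a rough Python sketch of what those "solid block" tools do -- and why scripts are at risk:

```python
import re

html = """<html>
  <head>
    <title>Widgets</title>
  </head>
  <body>
    <p>Hello, world.</p>
  </body>
</html>
"""

# Collapse runs of whitespace that sit between tags. Note: applying the
# same trick inside a <script> block joins lines, so anything following
# a // comment in the JavaScript would get swallowed and break the code.
minified = re.sub(r">\s+<", "><", html).strip()

print(len(html), "->", len(minified))
```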

twist

7:03 am on Feb 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Here are a few tips. These ideas aren't for everybody; it all depends on what kind of site you have, what kind of server you're hosted on, and whether you cater to visitors using 56k modems.

My pages are written completely in php. I use gzip compression on my pages before sending them. My home page dropped from 11,000 to around 2,000 bytes using compression.

I liked compression so much I got rid of my external .js and .css files and added them directly into my php script. I learned that each time a browser has to call an external file, such as an image or script, it adds a latency of about .2 seconds for each call. I removed all my images (I had 8 small ones) and my .js and .css files from my homepage. That removed the .2-second latency for each object, making my page load 2 seconds faster on a 56k modem--not to mention the 4 seconds I saved by removing the images.

Something else to watch for is the favicon. It doesn't matter in IE, but if your visitors are using Moz, NN, or Opera then they may be downloading your favicon image as the page loads. Favicons are an uncompressed format. I went from a 32x32 256-color favicon over 2,000 bytes in size down to a 16x16 16-color image of only 318 bytes. That knocks about .3 seconds off the download time for 56k'ers.
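To sanity-check that figure: a 56k modem tops out around 7,000 bytes per second (less in practice once protocol overhead is counted), so the saving works out like this rough Python sketch:

```python
# 56,000 bits/s is roughly 7,000 bytes/s at best.
old_favicon = 2000  # bytes: 32x32, 256 colors
new_favicon = 318   # bytes: 16x16, 16 colors
seconds_saved = (old_favicon - new_favicon) / 7000
print(round(seconds_saved, 2))  # -> 0.24
```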

Since I moved my css into my main script I lose the browser's caching of the .css file. To make up for this I split my css up and only use what is needed for each page, sharing the main settings on every page. Something like this,

body,td.s_content,td.c_content,td.s_sp { background-color: #666666; }
body { padding: 0; }
table,td,img { background-color: #FFFFFF; }
body,table,td { width: 100%; }
table,img { border: none; }
img { display: block; }
body,h1 { margin: 0; }
font,td,table { color: #000000; font-family: Verdana, sans-serif; text-decoration: none; font-size: 11px; }
a:link,a:active,a:visited,a:hover { text-decoration: none; }
a:link,a:active,a:visited { color: #333333; }
a:hover { color: #336600; }

And then each individual page only tacks on its own css. It has also made editing very easy, because I can mess around with the css for one page without screwing up another. I use my css for both my regular site and my forums. If I used an external css file it would be huge, and since an external css file has to load before your page starts to render, it could cause quite a delay on your homepage.

Last tip: define width and height on images, tables, rows, or whatever you can. If you don't tell the browser how big things are, it has to work that out itself, and a visitor using an older computer may have to sit and wait while the browser draws your page.
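For example (made-up values, just for illustration), right in the markup:

```html
<!-- width/height let the browser reserve space before the image arrives -->
<img src="logo.gif" width="100" height="40" alt="Example Widgets">
<table width="100%">
  <tr><td width="150">menu</td><td>content</td></tr>
</table>
```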

Most of these things only knock a fraction of a second off here and there, but when you put them all together you can really speed up your site. My homepage is currently 2,847 bytes and loads in about a second. It may seem like overkill, but where it really pays off is my gallery: most gallery pages now load in under 8 seconds where they used to take about 20. That saves the 56k visitor about 12 seconds per gallery page.

Good luck, hope I could be of some help.

I have seen tools that wipe out all spaces and leave the code in a solid
block.

P.S. The compression I'm talking about isn't the same as the compression you're talking about.

tombola

7:33 am on Feb 7, 2004 (gmt 0)

10+ Year Member



I do exactly the same as twist :-)

twist

10:08 am on Feb 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I do exactly the same as twist :-)

Really? lol

I picked up almost all these tips from hanging around here the last few weeks.

Oh yeah, one more thing. Don't completely get rid of tables but try using divs here and there. They can be very handy for keeping down the clutter.

Another way to keep yourself honest is by trying to convert your site over to xhtml 1.1. I learned all sorts of things doing the conversion. Couldn't be happier now.

tombola

10:42 am on Feb 7, 2004 (gmt 0)

10+ Year Member



Another way to keep yourself honest is by trying to convert your site over to xhtml 1.1

twist, I've the feeling that you read my advice in other forums ;-)

encyclo

2:27 pm on Feb 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Twist, you can trim off a few bytes of your CSS - something like this:

body, td.s_content, td.c_content, td.s_sp {background:#666;} 
body {padding:0;}
table, td, img {background:#fff;border:none;}
body, table, td {width:100%;}
img {display:block;}
body, h1 {margin:0;}
font, td, table {color:#000;font:11px Verdana, sans-serif;}
a {color:#333;text-decoration:none;}
a:hover {color:#360;}

Every little helps ;)

Other things to look at - if your site is in one language, using one charset, you can define them both in .htaccess or httpd.conf (Apache only!) and lose the meta tags. For example, if your site is in English and uses iso-8859-1, use this:

AddDefaultCharset ISO-8859-1 
DefaultLanguage en

After that, optimize all your images in Photoshop (you can save a huge amount here), use HTML 4.01 Strict and CSS, and simplify your design to reduce code bloat. Your users will thank you for it, too: not only will your site be easier to use, it will load quicker.

For HTML pages, GZIP compression is the way to go, of course. Everything else is window dressing. Removing the white-space does save a few bytes, but makes editing a nightmare.

twist

6:45 pm on Feb 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Man this place is great :)

Cleaned up my css but I have a question about the charset.

Does,

AddDefaultCharset ISO-8859-1

replace this entire meta tag or just the charset portion?

<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1" />

P.S. If you do achieve some type of validation, whether HTML 4.01 or XHTML, consider using a text link to confirm it instead of an image. After I converted the xhtml 1.1 badge I got from w3c to gif format, keeping all 191 original colors, it was 2,233 bytes. My entire homepage is only 2,847 bytes. If I had kept the css validation image too, I would have almost tripled the size of my homepage.

tombola

10:39 pm on Feb 7, 2004 (gmt 0)

10+ Year Member



You can also use content negotiation (Apache), so you can omit filename extensions.
For example: http://example.com/faq.html -> http://example.com/faq

If your pages have the filename extension .html, you can save 5 bytes per link ;-)
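Assuming Apache with mod_negotiation available (check with your host), one common way to switch this on is MultiViews:

```apache
# .htaccess -- with MultiViews, a request for /faq will match faq.html
Options +MultiViews
```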

[edited by: tedster at 3:57 am (utc) on Feb. 8, 2004]
[edit reason] switch to example.com [/edit]

encyclo

10:51 pm on Feb 7, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



AddDefaultCharset
replaces the whole meta tag - another 72 bytes saved (70 if HTML) per page ;) Moves the content another bit closer to the top of the document, too. You can use http headers for stuff like PICS labels as well. The only meta tag you should have left is description (and maybe keywords, although I don't bother with that one).

twist

3:03 am on Feb 8, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I just started doing the Apache RewriteRule thing yesterday. It took a little while to figure it out. I only have a few pages left to go now.

One section of my website takes 4 vars,

h*tp://example.com/page.php?name=var&cat=1&type=2&id=3

Now it looks like this,

h*tp://example.com/page/var/1/2/3
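A rule along these lines does that kind of mapping -- a hypothetical sketch of the general idea, the exact pattern depends on the site:

```apache
RewriteEngine On
# /page/var/1/2/3  ->  /page.php?name=var&cat=1&type=2&id=3
RewriteRule ^page/([^/]+)/([^/]+)/([^/]+)/([^/]+)$ page.php?name=$1&cat=$2&type=$3&id=$4 [L]
```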

I keep learning more and more here everyday. I haven't even tapped into the SEO side of things yet.

tedster

4:11 am on Feb 8, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



One savings in CSS and HTML file size comes from the ability to assign more than one class in a single attribute. Many people don't know they can do this, and I've found that it really pays off as a site grows.

For example, <div class="c s t"> will apply the rules for each of the classes .c, .s and .t to the div. So I build a set of "utility" classes for the rules I use frequently, such as --

.c {text-align:center;}
.s {font-size:.7em;}
.t {margin-top:0;padding-top:0;}

Then I can mix and match these classes throughout the site, instead of defining too many large sets of detailed rules that I can only apply once in a rare while.

I also use short class names (one or two letters that are easily remembered) for these "utility" CSS classes, to keep the attributes short and sweet inside the HTML.

g1smd

11:22 pm on Feb 8, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



>> a:link,a:active,a:visited,a:hover

Shouldn't the correct order be:

a:link,a:visited,a:hover,a:active instead?

(Yes, I do know that since all four are defined the same, it can be shortened to just "a" in this case.)

twist

12:59 am on Feb 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Shouldn't the correct order be:

a:link,a:visited,a:hover,a:active instead?

Yeah, I only had it that way because all four were defined the same way. I just cut and pasted from my css; I should have looked it over a little more before posting. :)

ergophobe

1:19 am on Feb 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month




each time a browser has to call an external file, such as an image or script, it adds a latency of about .2 seconds for each call.

Isn't this true for HTTP 1.0, but not for 1.1 which tries to keep the connection open until all objects have been acquired?

I recently got a site looking the way I wanted, then went hunting for extra fluff in the CSS. Much of it was whitespace and indentation (even some long strings of spaces appended to the ends of lines). There was also a lot of redundancy (because I didn't want to optimize before the final format was set). Anyway, the final product is between 50% and 60% of the original size. A significant saving.

- combine selectors
- use multiple selectors
- use shorthand forms
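A small before-and-after of the shorthand idea (invented selectors, just for illustration):

```css
/* long form */
.box { margin-top: 0; margin-right: 10px; margin-bottom: 0; margin-left: 10px; }
.box { border-width: 1px; border-style: solid; border-color: #333333; }

/* shorthand: same effect, far fewer bytes */
.box { margin: 0 10px; border: 1px solid #333; }
```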

CSS done right will save bandwidth. Done poorly it will not.

Tom

Krapulator

3:23 am on Feb 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



HTTP compression can also be enabled on your web server (assuming your host allows it), which sends your html files in compressed form (much like zip files). This reduces html files to well under half their original size.
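To get a feel for the ratio, here's a rough Python sketch compressing some deliberately repetitive, table-heavy markup. (On the server, a module like mod_gzip or mod_deflate does the real work.)

```python
import gzip

# Repetitive markup, as table-based layouts tend to be.
row = '<tr><td class="c">widget</td><td class="c">9.99</td></tr>\n'
html = "<table>\n" + row * 50 + "</table>\n"

compressed = gzip.compress(html.encode("iso-8859-1"))
print(len(html), "->", len(compressed))
```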

twist

7:16 am on Feb 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Isn't this true for HTTP 1.0, but not for 1.1 which tries to keep the connection open until all objects have been acquired?

I read about the latency on a website, but I don't know it for sure. I think I asked here once but nobody answered. If you, or anybody, could find out for sure, that would be great, because I would really like to know.

I took tedster's advice about mixing classes i.e. class="a b c"

It took my homepage from 2,847 down to 2,726 bytes. It's not easy to shave over 100 bytes off an already-compressed page, so I am liking it.

P.S. whatever happened to potbelly?

ergophobe

5:53 pm on Feb 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Twist,

In brief, HTTP 1.0 opens a new TCP connection for each object, but HTTP 1.1 will create persistent connections and send as many objects as it can through that connection.

In not so brief....

This from Key Differences between HTTP 1.0 and HTTP 1.1 [research.att.com]


HTTP almost always uses TCP as its transport protocol. TCP works best for long-lived connections, but the original HTTP design used a new TCP connection for each request, so each request incurred the cost of setting up a new TCP connection (at least one round-trip time across the network, plus several overhead packets). Since most Web interactions are short (the median response message size is about 4 Kbytes [MDFK97]), the TCP connections seldom get past the ``slow-start'' region [Jac88] and therefore fail to maximize their use of the available bandwidth.

Web pages frequently have embedded images, sometimes many of them, and each image is retrieved via a separate HTTP request. The use of a new TCP connection for each image retrieval serializes the display of the entire page on the connection-setup latencies for all of the requests. Netscape introduced the use of parallel TCP connections to compensate for this serialization, but the possibility of increased congestion limits the utility of this approach.

To resolve these problems, Padmanabhan and Mogul [PM95] recommended the use of persistent connections and the pipelining of requests on a persistent connection.

....

Persistent Connections
HTTP/1.0, in its documented form, made no provision for persistent connections. Some HTTP/1.0 implementations, however, use a Keep-Alive header (described in [Fie95]) to request that a connection persist. This design did not interoperate with intermediate proxies (see Section 19.6.2 of [FGM+98]); HTTP/1.1 specifies a more general solution.

In recognition of their desirable properties, HTTP/1.1 makes persistent connections the default. HTTP/1.1 clients, servers, and proxies assume that a connection will be kept open after the transmission of a request and its response. The protocol does allow an implementation to close a connection at any time, in order to manage its resources, although it is best to do so only after the end of a response.

Pipelining

Although HTTP/1.1 encourages the transmission of multiple requests over a single TCP connection, each request must still be sent in one contiguous message, and a server must send responses (on a given connection) in the order that it received the corresponding requests. However, a client need not wait to receive the response for one request before sending another request on the same connection. In fact, a client could send an arbitrarily large number of requests over a TCP connection before receiving any of the responses. This practice, known as pipelining, can greatly improve performance [NGBS+97]. It avoids the need to wait for network round-trips, and it makes the best possible use of the TCP protocol.

You may also be interested in the results of some testing as given on the W3C site
[w3.org...]

Cheers,

Tom

twist

7:54 pm on Feb 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Thanks for the response ergophobe.

Not saying I understand it 100%, but it does look like I will be moving back to an external style sheet.

twist

9:35 pm on Feb 9, 2004 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Okay, here is some stuff I've been looking into.

I opened my compressed page in NN with the whitespace left in my css. It was 2,729 bytes.

Then I removed all the whitespace in my css, like so,

<?
// php commenting
echo 'table,img{border:none;}';
echo 'img{display:block;}';
echo 'a{text-decoration:none;}';
and so on...
?>

I tested the size of my page again. It only shrank by 3 bytes, to 2,726. The compression process must be absorbing the whitespace overhead. So if I keep my css in my html and skip the external style sheet (which I don't know how to compress), I can keep the formatting easy to read without adding bytes. I can also leave in my php comments with no penalty.
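That 3-byte result makes sense: gzip encodes runs of whitespace very cheaply, so pretty-printed and minified CSS compress to nearly the same size. A rough Python check (made-up rules):

```python
import gzip

pretty = (
    "table, img { border: none; }\n"
    "img        { display: block; }\n"
    "a          { text-decoration: none; }\n"
)
minified = "table,img{border:none;}img{display:block;}a{text-decoration:none;}"

raw_diff = len(pretty) - len(minified)  # bytes saved before compression
gz_diff = len(gzip.compress(pretty.encode())) - len(gzip.compress(minified.encode()))
print(raw_diff, gz_diff)  # the gap typically shrinks sharply under gzip
```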

I tried this also,

I removed my css from my html, which reduced the page by 607 bytes to 2,122 bytes. I added an external css file with the same css minus the php comments; that file is 1,573 bytes. So the first load costs 966 extra bytes and I lose my comments, but every other page drops 607 bytes because the file is cached.

I can't decide whether to put my css into an external file or just leave it in the php. My other concern is the code-to-text ratio if I leave the css in my html.

potbelly

10:51 pm on Feb 9, 2004 (gmt 0)

10+ Year Member



P.S. whatever happened to potbelly?

I'm still here. Sorry, didn't mean to be rude.

To be perfectly honest, parts of this thread have gone so far over my head that I feel I need to go back to CSS school.

I built a site with a basic knowledge of css and html, then hired someone in to add the backend (in cf), and I am now left with a site that is far too heavy--in short, a monster.
Picture this:
a greasy-spoon cafe with bikes leaning against the windows and a door hanging off its hinges, serving gourmet food fit for a king.
That's what my site is like.

Thanks for all of the replies, I'm off to read a bit more.
Potbelly.