
Download speed - did the whole web forget the 40kb sweet spot?

a modest rant


tedster

9:44 pm on Mar 28, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



What's going on with the web lately? Did everyone forget about the research that shows 30-40kb total page weight is the optimum?

It's a rare page that comes in under 100kb these days. Throw in slow server calls to back-office databases, 3rd party ad servers and other little frills and curlicues - and a dial-up user can wait 45 seconds to a minute for a page to load.

One of the reasons I don't do more news reading online these days is because of this nonsense.

Ah well, it just makes fast pages stand out more from the crowd.

Funny thing is, with css, better image compression algos in Photoshop and other technology progress, it's easier than ever to hit that 40kb sweet spot.

Brett_Tabke

9:47 pm on Mar 28, 2002 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I ditto 100% everything you said.

The only thing I could add is that there is one very dim but bright spot: buy a PDA and discover the joys of the hidden 'pda' web. Small pages [kevxml2a.infospace.com], fast loading [wired.com]

pcguru333

9:49 pm on Mar 28, 2002 (gmt 0)

10+ Year Member



I don't know what the logic is behind the higher page weights.

Personally I like to sell the point that my client's pages will load fast AND still look good. The last site I did had a 43K weight.

I would venture to say that those same 'heavy' sites don't come close to having valid code. It shows a lack of consideration that the Web is 'World Wide' and not just high-speed, IE6, latest-hardware users.

rcjordan

9:54 pm on Mar 28, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>40kb sweet spot

Have you tried to convince a site owner of that lately? Once they grasp what you're telling them (WHAT!!! 40k! You mean everything on the page? You're kidding, right?), they are ready to battle you tooth and nail. I can't imagine many developers willing to put in the effort to convince them. And that's assuming the developer cares, "fat" is easier.

tedster

10:07 pm on Mar 28, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



This is a big reason that people get so frustrated with pop-ups and pop-unders. They add extra bandwidth load to an already frustrating experience.

If the page (and the pop-up) just came roaring down the pipe, I doubt that I'd be as bothered. I close most pop-ups before they ever load, because I'm HOPING to see the page I asked for sometime before dinner.

AlbinoRhyno

10:19 pm on Mar 28, 2002 (gmt 0)

10+ Year Member



I'm on a cable modem, and someone would have to pry it away from my hands if they wanted me to go back to dialup! I would not surf the internet at 56k anymore. But I think a large majority (70%+?) is still on 56k, so designers are missing the boat on this one!

I managed to design an attractive website at approx 10k. It's php, has a style sheet, two header graphics (255x50 and 90x110), divider graphics (a 1x1 resized), and more. If I, a big ole newbie with nary a site to his name, can do that, I'm sure the experienced designers can as well. I think it comes down to the same problem the software industry experienced - sloppiness. Why make the best product you possibly can when you can whip out a decent product for the same price and make extra money fixing it later?

It reminds me of the Dilbert cartoon from a couple days ago:

We're the least expensive vendor unless your requirements change mid-project.

txbakers

10:23 pm on Mar 28, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



How do we measure "weight"? Is it the file size, or the file size after rendering?

I'd like to check mine.

txbakers

10:25 pm on Mar 28, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> If the popup just came roaring down the pipe, I doubt that I'd be as bothered

I'm still bothered by pop-ups and unders. What a pain in the %%#$@%#%!

I've never been to Orbitz, and due to their excessive pop-unders I never really want to.

tedster

11:41 pm on Mar 28, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



> How do we measure "weight"?

HTML + graphics + any external files (css, js, whatever)

It's a rough measure, because 15 kb of HTML will transmit faster than 15 kb of highly compressed jpg files, due to native compression in the modems.

I often think differently - how fast I can get readable content on the screen. I've read some tests that show 3 seconds is the optimum. After 4 seconds of blank screen, the Back Button phenomenon starts to kick in with a vengeance. With CSS, 3 seconds for at least some content on screen is very doable -- IF the developer cares, that is.

This 3 second goal for rendering something is an important mark. Even after the entire download is finished, nested tables can take a while for the browser to calculate and render, and the browser also takes extra time to decompress jpg files.

So rendering time is really the issue - but total page weight is one "rule of thumb" way to address it.

luma

7:49 am on Mar 29, 2002 (gmt 0)

10+ Year Member



And I thought the limit was 3 KB. ;) Jakob Nielsen's Why This Site Has Almost No Graphics [useit.com] states this.

Judging from my own surfing behavior, I prefer the no-fat pages. Thank god they still exist and are hugely successful over here. As always, Opera's no-images/cached-images-only mode is your friend, too.

chiyo

8:06 am on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



"...I don't know what the logic is behind the higher page weights..."

There is a logic, but it's flawed. It assumes that 1. broadband makes page weights superfluous (ridiculous) and 2. all people that "matter" will be on cable/DSL broadband soon (debatable at least).

Agree with Brett that PDA delivery is where some of the excitement is for fast delivery. Leave the heavy stuff on the Web and desktop computers, but for really delivering content, gear up for the PDA revolution!

sean

12:01 pm on Mar 29, 2002 (gmt 0)

10+ Year Member



Chiyo, from where I sit broadband does make page weights superfluous, which is precisely what makes it so dangerous, like swallowing a handful of pills before operating heavy machinery.

glengara

1:09 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Odds are stacked against "lean'n'mean", hardly anybody in its corner.

The client usually looks for bang for buck, the back-end boys will probably insist on a database, and the designers can't have their genius constrained by KB.

In some ways we seem to be going backwards.

pcguru333

1:37 pm on Mar 29, 2002 (gmt 0)

10+ Year Member



I think it is still worth trying to sell faster downloads. Granted there will always be clients that insist on doing things their way. But most of my clients think it is great when I point out the fact that their page is more accessible to their clients.

I personally will click the 'Stop' button and type in a different URL if the page I want is slow.

High Speed Internet access is great, but I don't know ANYONE that has it. I used to, but I can't afford the cost. Market downturns can affect this. Oh, and BTW, I don't live in rural Arizona where you would expect fewer high-speed access users. I live in the Phoenix metro area.

One thing to possibly look at is Internet access while on the job versus home use. You can get an idea of this number by comparing page requests during and after business hours.
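That comparison is a quick log-file exercise. A sketch, assuming Apache common-log-format lines and a 9-to-5 business window - both assumptions you'd adjust for your own server and time zone:

```javascript
// Bucket page requests into business vs. after hours from Apache
// common-log lines. The 9-17 window is an assumption; tune to taste.
function requestHour(logLine) {
  // Common-log timestamps look like: [29/Mar/2002:14:07:31 -0700]
  const m = logLine.match(/\[\d{2}\/\w{3}\/\d{4}:(\d{2}):/);
  return m ? parseInt(m[1], 10) : null;
}

function bucketByHours(logLines) {
  const counts = { business: 0, after: 0 };
  for (const line of logLines) {
    const h = requestHour(line);
    if (h === null) continue; // skip unparseable lines
    if (h >= 9 && h < 17) counts.business++;
    else counts.after++;
  }
  return counts;
}
```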

If we want lean pages on the net then it starts with the designers (i.e., US). So I say push the issue to our clients and point out the benefits to them.

tedster

3:12 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I sell my clients on the leaner download concept as their "secret weapon". If someone gives me art work or page ideas that won't work without mega bandwidth, I can be a very obstinate man!

However, I've found that a little education, particularly with show-and-tell rather than mere abstract concepts, can win the day.

brotherhood of LAN

3:14 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Great thread,

Looking at my stats, each page per unique user uses 6 kilobytes of bandwidth. I suppose this is way under the 40k mark :)

As a modem user, I totally agree with 40k or less. Modems are here to stay for a long time. Maybe it's worth mentioning ONDigital in the UK, who have severely set back the Govt.'s aims of getting everyone online and using digital TV. All in all, modems are mainstream for at least another 5 years.

Text will always be your best friend on a web page, and it uses a single byte for every character...great!! :) If you are putting this text in a table, using H tags etc., then it will be something slightly above 1 byte per visible character you use.

Graphics, Flash, or basically anything else takes up MUCH more space, and as said, if these are poorly optimised, then it's gonna be a slow-loading / soon-to-be-annoying site.

99% of things I see on the web can be optimised. I'm from the school that gets annoyed about these un-optimised objects because, en masse, they slow the whole web down!

If 40K is the "true" benchmark, I hope I am not penalised for having deep content pages at a fraction of that file size (about 20k) total.

P.S. Would the "weight" of a 3rd party banner on each of your pages affect any optimizing via file size? (since they are pretty large beasts when it comes to k's)

tedster

4:55 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>> Would the "weight" of a 3rd party banner on each of your pages affect any optimizing via file size?

You betcha - they can be very big offenders, and particularly aggravating when the 3rd party server is sluggish. Most of the time, everything else just waits for the ad to load.

There are wide variations in allowable file size for banners. I created some that had to be under 7kb, but others were as high as 15kb. I've seen some "rich media" that are even more. And forget it on skyscrapers, interstitials, etc.

ggrot

5:22 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Has anybody tried multiple content delivery (cloaking) based on factors that might indicate user bandwidth? For example, certain ip ranges are very likely to be cable or dsl users, certain tld's (edu, gov) are likely to be using shared t3s, etc.

pageoneresults

5:26 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Okay, dumb question. How is everyone determining the weight of their pages?

In IE, if I go to...
File > Properties > Size
Is that the proper weight of the page? (no)

In NN, if I go to...
View > Page Info > Content Length
Is that the proper weight of the page? (no)

If that is the case, whew, all my pages are under 30k! As of this post, this thread now shows a content length of 39092 bytes.

<edit>The above methods are strictly for content and do not take into consideration the images or any external files that are being called!</edit>

I answered my own question and forgot about Brett's Webpage Size Checker [searchengineworld.com]!

Webmaster World passes with flying colors!
Grand Total: Images + HTML = 25184 bytes

(edited by: pageoneresults at 5:44 pm (utc) on Mar. 29, 2002)

ggrot

5:32 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I use Dreamweaver, which gives me a very good estimate at the bottom of the screen (minus the minor cookie-passing my server throws in dynamically for tracking). The browsers won't sum up external file sizes like images or css.

Also remember to consider caching. If there is a common image (logo) that is loaded on the first page, you can count that out of the calculations for the other pages.

I've also seen some webmasters go crazy with caching images and write javascript functions, called from the page's onload, which preload a couple of images that are on the next level of pages while the user is reading the currently loaded page. I have to admit that I've never done this, but have considered it.
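That preload trick can be sketched like this. The image URLs are made up, and the injectable constructor parameter is only there so the function can run outside a browser (where it falls back to the built-in Image):

```javascript
// Preload images for the next likely page into the browser cache while
// the visitor reads the current one. Returns the Image objects so they
// aren't garbage-collected before the downloads finish.
function preloadImages(urls, imageCtor) {
  const Ctor = imageCtor || (typeof Image !== 'undefined' ? Image : null);
  if (!Ctor) return []; // no Image available and no stub supplied
  return urls.map(function (url) {
    const img = new Ctor();
    img.src = url; // assigning src triggers the download
    return img;
  });
}

// Typically wired to the page's onload so preloading never delays the
// current page (URLs here are hypothetical):
// window.onload = function () {
//   preloadImages(['page2-photo.jpg', 'page2-chart.gif']);
// };
```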

tedster

6:38 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I use Homesite, which also gives a good page weight total. Any good editor should give you this total rather easily. It's essential info, in my book.

I often pre-load some images for the next most probable page. Studying server logs shows me that this has been very effective in getting people to read through to page 7 of an article - and not going elsewhere after page 3. As a friend of mine said about pages that come in fast, it's impressive. It makes you feel like this company has its act together.

A common scheme for a page uses thumbnails and links for pop-up enlargements. I almost always preload those enlargement images, unless there are many many many. I also see that this helps people feel comfortable in exploring the site more in depth.

brotherhood of LAN

6:43 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



pageoneresults

Getting exact file sizes could prove to be pretty difficult :(

If you are using external files, chances are that after the first page view they are cached, altering the average page size for the end user - which is what matters here.

Just as a reference, I preview my pages in a browser and save them offline. The size of the folder containing the external files plus the .htm file should be a good indicator of how fast your pages are.

rcjordan

7:02 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>> Would the "weight" of a 3rd party banner on each of your pages affect any optimizing via file size?

>You betcha - they can be very big offenders, and particularly aggravating when the 3rd party server is sluggish. Most of the time, everything else just waits for the ad to load.

Very, very big offenders. It's one of the main reasons that I moved my 468x60 banner inventory to house ads, 15K was getting to be on the light side of what advertisers want to deliver. Affiliate programs give you some choice and added control, but they often have tracking calls added to the burden.

Also, if you're using ssi to deliver third party ads you should be aware that their code may not validate under the W3 type you've chosen. Perhaps not a big point (yet), but something that bears watching.

tedster

10:02 pm on Mar 29, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Oh no! The boundaries of my world are shattered once again.

I just went to a Jakob Nielsen Alertbox page about WebTV usability - and Opera clocked it at 158 kb - before I got a "transmission interrupted" message. Are there no idols without feet of clay?

adamxcl

11:57 pm on Mar 29, 2002 (gmt 0)

10+ Year Member



I'm big on small pages - minimal, but they look decent. I just looked at my catalog of sites...over 23,000 html pages done, average size of 12K, with the absolute largest of them all at 120K.

I'm amazed when I look at competitors' pages. My site loads in 9 seconds at 28.8. Most of the competition is closer to 100 seconds or more. It's easy to find tiny graphics that are 100k, never optimized for the web.

I just had a new client whose menu images were tiny on screen but HUGE in file size. The page took 280 seconds on 28.8. I made the images the way they should be and it went down to 11 seconds. Gotta laugh out loud.

brotherhood of LAN

12:02 am on Mar 30, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Adam, I like it :)

Minimalism on pages is quite a good thing. Information overload is bad.

Slow loading images VERY BAD!! I hate seeing small time sites that have images any bigger than 50k

Any bigger than that it has to be a very VERY important image ;)

tedster

1:36 am on Mar 30, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Ever done this?

You're loading a page and notice a gif that's coming in pretty slow. The naked eye says it's made of a small number of colors on a white background. So you open it up in PhotoShop or whatever and you see a 256-color palette. Maybe 200 of those colors occur in just one pixel somewhere.

Where can you find such a lazy, negligent website? Could be CNN, could be MSN, could be almost anywhere. Sometimes it's even a website selling software for compressing graphic images!

Maybe the time is approaching for some kind of accreditation for web design.

chiyo

6:35 am on Mar 30, 2002 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



sean said "...Chiyo, from where I sit broadband does make page weights superfluous..."

Sean, my reasoning is that a page (especially one optimised for broadband) is made up of various elements. Despite someone accessing via broadband, loading time is influenced by retrieving all those elements, often from different servers (such as 3rd party banner deliverers) and in different formats. Your page loading is often governed by the weakest link. It is also influenced by the delivery path and the power of the receiving computer (it does take memory to load plug-ins and the like, and many computers don't have all the plug-ins that broadband page designers feel they should have!). Finally, a 100kb page WILL load slower than a 10kb page, no matter whether the user is on broadband or dial-up. That may well make the difference between an "instantaneous" browsing experience and a less-than-smooth one. That's why I said the assumption that broadband makes page weight superfluous is ridiculous!

Rhys

11:28 am on Mar 31, 2002 (gmt 0)

10+ Year Member



Hi -
Being paranoid about having small, fast web pages, I often analyse slow pages to see where the drag is coming from. Mostly the guilty parties are un-optimised graphics (i.e. just reducing the width and height attributes doesn't make the file smaller), photos posted as gifs, lengthy javascripts for rollovers, and proprietary page-writing programs like Dreamweaver, etc., used instead of coding the page in HTML. E.g. I just tried writing a medium-sized page with Dreamweaver and it weighs in at 31.6kb for just the page code, without the graphics or style sheet. For instance, it has put a width, a font, and a div tag in every table cell.
I use an "inline" rollover code that adds only 65 bytes of code to each <a href> tag, so the massive script rollover code is just extra weight!!
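For reference, an inline rollover along these lines might look like this - the file names and dimensions are placeholders, and the only JavaScript is the two inline handlers on the link itself, no script block required:

```html
<a href="products.html"
   onmouseover="document.images['nav1'].src='nav1_on.gif'"
   onmouseout="document.images['nav1'].src='nav1_off.gif'"
><img src="nav1_off.gif" name="nav1" width="90" height="24"
      border="0" alt="Products"></a>
```

Preloading the "on" image (as discussed earlier in the thread) keeps the swap instant on the first mouseover.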

sean

2:53 pm on Mar 31, 2002 (gmt 0)

10+ Year Member



>Finally a 100kb page WILL load slower than a 10kb page, no matter whether the user is on broadband or dial up.

Yes, but it takes a lot of extra page weight to notice the difference. Here I am talking only about pure weight in terms of K, which is what I thought we were talking about. All of the other factors in page load are important, and I think they will actually gain in importance relative to pure weight as broadband increases.

This 54-message thread spans 2 pages.