The load times I see in my office with upmarket ISDN connections and state-of-the-art communications are obviously better than I see at home with a 56k modem on a domestic telephone line. Yet those home office results are a lot better than a couple of years ago because the technology has improved.
So what is the "typical user" profile you design for and what benchmarks do you use? For example I'd suggest a typical web user profile is:
Uses Windows 98 on a Pentium II box
Has a 56k modem on a dial-up domestic telephone line
Considers 8-10 secs an acceptable page load time ... will accept longer for a major corporation or top quality site.
Is that about right... are there any other considerations?
I've come to the conclusion from my log files that you should save the heavy stuff for deeper into the site, when you have them hooked. The index should be light, say 25K max, with an easy navigation bar to the real content. I don't think you can go wrong this way.
Yeah, that's a very technical term, BH. Hated to use it, scares the lurkers.
>The index should be light say, 25K max
I used to think that way, but with big sites particularly, who can say they'll enter the site through the index page? If I can direct their sequence through the site, I can incrementally load their cache and by the 4th or 5th page they're opening pages that are 125k like they were 25k. BUT, I pity the poor guy that surfs in on a 3 word key and hits page 5 first.
That said, I honestly can't imagine load times are as huge a problem these days as people say they are. I'm used to waiting. If I'm at a site I really want to see, I'll wait over a minute for it to load.
I think a lot of home dial-up users are increasingly used to waiting... as businesses and big cities have joined the broadband revolution and forgotten about the rest of us, we've *had* to get used to waiting.
However, when I come across a good-looking, fast-loading site at home, I am super-duper *extra* impressed.
That's smart, rc. I get a lot of hits right smack dab into the middle of the java applet, flash-enhanced, picture-spinning virtual reality tours on 3 to 6 word keys and I get some "back button bippity boppin" but hey... cain't win em all.
That's the main point. Keep the page weight down and you've given your site an advantage.
Many of the competitors for my clients ignore this, and they're giving our sites an advantage that they don't need to. I'm not complaining! In fact, I hope they continue to offer pages that take 30 seconds or more to load.
There are many factors which can slow down page loading besides file size: calling cookie info from a slow database, net congestion, server overload (especially a problem when a page calls from various servers -- it seems like one of them is almost guaranteed to be slow). Given this, it's very important to optimize the total page weight. Every kb you save may help to bring in many more prospects over time.
Even if people are growing more willing to wait, that's still no license for the developer and designer to take great liberties with file size. A lot of the slow pages look to me like no effort was made to optimize at all.
The web is a medium with bandwidth limitations, and these will be with us for a while. A designer who doesn't take the time to learn about the medium is not a good designer, they are self-indulgent. It's rare that the "artistic effect" is worth the wait.
New Tool over at SEW: Webpage Size Checker [searchengineworld.com]
WebPage Size and Speed
I am a huge believer in keeping pages small. All the studies show that users are very sensitive to page size and/or download time.
Speed is Life
Page load speed. I am convinced it is everything. It is the difference between a successful site and an unsuccessful one. It is not an easy task to reduce page size, but I try to keep all pages under 20k of HTML, and less than 30k total with graphics.
For an example of how to do it: see Google and Yahoo. For an example of how not to do it, see CNN and ESPN.
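That 20k/30k budget is easy to check mechanically. Here's a minimal Python sketch; the function name and the exact byte caps are my own illustration, not any tool named in this thread:

```python
HTML_BUDGET = 20 * 1024   # suggested cap for the HTML itself
TOTAL_BUDGET = 30 * 1024  # suggested cap including graphics

def check_page_weight(html, asset_sizes):
    """Return (html_bytes, total_bytes, within_budget) for a page.

    html        -- the page's HTML source as a string
    asset_sizes -- byte sizes of the images/objects the page pulls in
    """
    html_bytes = len(html.encode("utf-8"))
    total_bytes = html_bytes + sum(asset_sizes)
    within = html_bytes <= HTML_BUDGET and total_bytes <= TOTAL_BUDGET
    return html_bytes, total_bytes, within
```

Run it over your templates now and then and you'll notice creeping weight before your visitors do.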
It needs to work in 100% of the browsers on the net. That includes agents such as Scooter, IE, Netscape, Lynx, Opera, Slurp, and Googlebot. That doesn't mean it has to look the same in each, just that all of those agents can get to all the content. Obviously differences will exist in things such as graphics support.
Leading edge or nonstandard technology. There is a reason they call it the bleeding edge. Stay far away from anything nonstandard or anything that requires your user to do extra work (that includes Shockwave, Java, and other attempts at embedded tech such as ActiveX or VBScript). You can't afford to lose (or slight) 10% of your audience. Granted, if you are running something such as a WAP site, WML would be appropriate.
Design for who?
Who is the typical user? Much of the common wisdom is that the average user is using around a P2 at 333MHz with 32 to 64MB of RAM and 56k dialup. If that is the "typical" web user, that means there are a bunch of users running less than that. In order to be inclusive, you have to design for a whole lot less than that.
Tips from a Speed Freak:
Because of modem compression, HTML will download about twice as fast for most people as graphics, which are already compressed (e.g., 20k of HTML will download about as fast as a 10k JPEG).
If your server supports it, you might try experimenting with Apache Mod_GZIP. It can reduce your html bandwidth and download times by 50%.
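For those who want to try it, an httpd.conf fragment along these lines is a reasonable starting point. This is a hedged sketch: directive details vary between mod_gzip builds (and Apache 2 uses mod_deflate instead), so check your module's docs before copying it:

```apache
# Enable mod_gzip (Apache 1.3) and compress HTML only
mod_gzip_on            Yes
mod_gzip_item_include  mime  ^text/html$
mod_gzip_item_include  file  \.html$
# Skip images -- they are already compressed, so gzip buys nothing
mod_gzip_item_exclude  mime  ^image/
```

Text compresses extremely well, which is where the roughly 50% bandwidth savings comes from.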
Try simulating a walk through your site at 28.8k. Assume 20% of your users are on 28.8k modems or on systems that perform like them.
Take your logs and resolve the IP addresses. Throw out known cable users (home, rr) and any ISP domain name with the word "cable" or "dsl" in it. Assume the users that are left are using 28.8k-56k modems; that is how many users are connecting to your site at lower speeds. Then compare the time of the first request for the HTML to the last "object" request on that page (a graphic). That is a good indicator of how long the page is taking to download for your users.
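The filtering step above can be sketched in a few lines of Python. The hostname patterns below are only the ones this post names (cable, dsl, home, rr), so treat the list as a starting point for your own logs:

```python
# Hostname fragments that suggest a broadband connection,
# per the heuristic described above.
BROADBAND_HINTS = ("cable", "dsl", ".home.", ".rr.")

def probable_dialup(hostnames):
    """Keep only resolved hostnames that suggest neither cable nor DSL."""
    keep = []
    for host in hostnames:
        h = host.lower()
        if not any(hint in h for hint in BROADBAND_HINTS):
            keep.append(h)
    return keep
```

The count of what's left, divided by total visitors, is your rough share of low-speed users.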
Watch the number of page reloads you are getting; those can also point to a server problem. Make sure your logging or counter software doesn't automatically throw out duplicate requests - that is important data.
Try a big page and then try an ultra-small page. Notice the difference in page views per user. Between a 10-15k page and a 40k page, the difference will be dramatic (it may be good to have medical personnel handy).
Last year's entries and winners really give one something to think about...
If Not Response.IsClientConnected Then
    ' Log a page that the visitor didn't wait for here
End If
The rendering time wait for nested tables is worse in Netscape (4.7 and earlier) than it is for IE, Opera or NN6.
That said, if you can get a header tag and critical text positioned outside your tables (or better still by using CSS for positioning) then the text flows on the page very fast, giving your visitors something to do while the graphics load.
Yet another display time factor I've stumbled on recently is the way IE renders a progressive jpeg on a PC. It does NOT render the image progressively, but waits until the entire file is downloaded and then displays the image all at once.
This means, paradoxically, that the larger file (a standard JPEG) will begin showing on screen sooner than a progressive one! Netscape handles progressives "correctly", but since the majority uses IE...
I'm sure that fast, informative pages are better for business than beautiful or tricky pages.
This may sound like heresy, but why not use "noindex" on a heavy graphics page? That would cut down the chances of surfers coming directly in to it. You could then take all the keywords from that page, put them on another page, preload from there, and then lead in to the graphics page.
Maybe this is different in different browsers, but I have played with it and I think it's worth a mention.
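For anyone trying the "noindex" idea above: it is normally done with a standard robots meta tag in the heavy page's head. How strictly each engine honors it is up to the engine:

```html
<!-- In the <head> of the heavy graphics page -->
<meta name="robots" content="noindex, follow">
```

The "follow" value lets spiders still pass through the page's links, so the rest of the site stays reachable while the heavy page itself stays out of the index.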