| 10:36 am on Dec 2, 2002 (gmt 0)|
I try to stick to 40k for the entire page... this lets me average download time at 10-20 seconds. I don't always succeed in levelling off at 40k, but it gives me a benchmark to aim for on every page in order to optimise the user's experience.
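As a rough sanity check on that budget, a 56k modem delivers about 2-5 KB per second in practice (after protocol overhead and line noise), so a 40k page does land right in the 10-20 second window. A quick sketch of the arithmetic:

```python
# Page-weight budget check. The 40 KB figure and the 2-5 KB/s
# effective 56k throughput range are the assumptions here.

def download_seconds(page_kb: float, throughput_kbps: float) -> float:
    """Seconds to transfer page_kb kilobytes at throughput_kbps KB/s."""
    return page_kb / throughput_kbps

PAGE_KB = 40
print(download_seconds(PAGE_KB, 4.0))  # decent 56k line: 10 s
print(download_seconds(PAGE_KB, 2.0))  # poor line: 20 s
```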
I hate Flash-based sites from a usability and accessibility point of view - but when Flash is needed, I make sure that it's of the streaming variety!
And a hearty welcome to WebmasterWorld!
[edited by: BlobFisk at 11:24 am (utc) on Dec. 2, 2002]
| 10:49 am on Dec 2, 2002 (gmt 0)|
Welcome [webmasterworld.com] to WebmasterWorld papercut.
|every site should load with a modem in less than 40-55 seconds. |
You've got the patience of a saint. :)
If a good amount of relevant content in the area "above the fold" doesn't load within 15 seconds on a 56k connection I'm gone. (along with thousands of others)
| 11:26 am on Dec 2, 2002 (gmt 0)|
|most surfers are still using modems. 56k with a subpar connection at that. |
The broadband industry really pushed some disinformation in this area last year - it made many developers feel they could ignore the needs of the 56K connection.
The milestone that was widely touted in the press was the moment when over 50% of the US connection time used broadband access. But, since so much broadband is "always on" -- whether anyone is active on the connection or not -- that is a deceptive statistic. It doesn't mean that 50% of actual Internet USE happens over a broadband connection - not even close.
|If a good amount of relevant content in the area "above the fold" doesn't load within 15 seconds on a 56k connection I'm gone. (along with thousands of others) |
I couldn't agree more. I aim for 5 seconds to have usable content rendered on-screen.
It's going to be this way (56k limitations) for a good while to come. The broadband infrastructure build-out has slowed to a crawl. All the easy locations have already been wired, and the rest of the geography has a long wait, especially in the wake of funding limitations since the dot.com bubble burst.
EVEN SLOWER DIAL-UPS
Here's another tidbit from my niece who is in the US Navy and gets to a lot of countries in a year. Inexpensive connections at 56K can be hard to find, even in parts of Europe. If you can make it easy for a return visitor to transact their business with you quickly, you'll get more business.
She highly values one site that sells consumables. They give her a page with the items she recently purchased. She can click on the boxes she wants and send the order, all within a minute of connection time. Her eyes sparkle when she talks about this vendor!
[edited by: tedster at 11:38 am (utc) on Dec. 2, 2002]
| 11:29 am on Dec 2, 2002 (gmt 0)|
There is an interesting thread here [webmasterworld.com] on what people find most annoying on sites and causes them to hit the back button and go elsewhere.
One of the reasons that crops up repeatedly is long load times (especially on those awful Flash splash pages... especially without the skip-intro link... bah!).
15 seconds is considered a good level to aim for. Most users will wait that long - but not much longer. Dante is right when he says that 40 - 55 seconds is too long and that most users will be gone. It's part of the challenge - make your images small, use CSS and keep your code clean!
You're right though - we should always try to consider the user with the most obstacles... although CSS and fluid page design allow us to cater for different resolutions with one design these days, which is a definite improvement.
| 11:36 am on Dec 2, 2002 (gmt 0)|
|that's a deceptive statistic. |
...and the above statement is redundant. <grin>
Three statisticians go deer hunting with bows and arrows.
They spot a big buck and take aim.
One shoots and his arrow flies off ten feet to the left.
The second shoots and his arrow goes ten feet to the right.
The third statistician jumps up and down yelling, "We got him! We got him!" ;)
| 2:04 pm on Dec 2, 2002 (gmt 0)|
Since using the additional bandwidth will allow for an enhanced site, and presumably give the user a richer experience, can a site test for it?
Are there available demographics for broadband users so that separate 'broadband site design' tactics might be employed?
| 2:33 pm on Dec 2, 2002 (gmt 0)|
Keep in mind, too, that providers are now set on providing ISDN, DSL, cable, etc. and it seems like they've forsaken the poor folks on dialup connections sometimes.
My ISP is via dialup, supposedly 56K, and I've gotten connection speeds as low as 28000 bps! It really sucks surfing the web at such slow speeds. Then to end up at a site where they've got every bell ringing and every whistle blowing! I leave asap, simply because I don't have time to wait 2 minutes for the site to download - no matter how interested I am in the content. I tell myself I'll return when I've got a better connection, but it just doesn't always happen.
| 3:03 pm on Dec 2, 2002 (gmt 0)|
IMHO, download time is the most important thing. Without a fast download, the best site in the world may get ignored by most. Since "thin" sites tend to contain lots of text, they often do well in the engines as well.
Any site selling anything should aim to get each page to come up almost instantly on a 56k connection. If the Flash is the content and the reason people are coming to the site, then download speed doesn't matter as much - but there is still no reason to keep someone waiting if it's not necessary.
| 3:24 pm on Dec 2, 2002 (gmt 0)|
Following from what Tedster was saying;
The challenge is to build a site that downloads quickly - say 0-15 seconds maximum (assuming all goes well, Internet-bottleneck wise) on a 56k modem. At roughly 2-5KB per second, that time frame gives you a possible budget of up to 75KB. Even within that you should be looking at the lower quartile, because the bottlenecks are big; only at bizarre hours, when would-be nerds are awake (I include myself) and the majority of the online world is asleep, do the bottlenecks ease.
So around 5-20KB per page is the most desirable.
The internet a.k.a. the biggest collision domain known to man.
The next step is usability, or more to the point, user comprehension of what exactly the site is about and how to use it in the quickest amount of time - another download, but of the human mind. This has to be a close second to page size, assuming that all connected individuals on all mediums around the world are to be considered. Both should permeate the entire site.
Obviously there are many other factors to take into account alongside page size: download speeds, site/page durability, customer/visitor loyalty, etc. Though that would be taking this thread wildly off topic.
| 3:48 pm on Dec 2, 2002 (gmt 0)|
One more thing to keep in mind. Even when I'm surfing at ludicrously slow speeds, if I get hooked on the first page where I enter, I'm more likely to endure slower download times on other pages of the site. So I think the real factor to consider is the download time of that very first page.
| 7:00 pm on Dec 2, 2002 (gmt 0)|
|So I think the real factor to consider is the download time of that very first page. |
What about SE referrals? On my sites, only 15% of the users come through the front door from SE's. That's why all my pages load quickly. :)
| 7:28 pm on Dec 2, 2002 (gmt 0)|
I'm not sure my question is proper here, but I think so. My site is image heavy, basically it's just a large photo album.
Anyone have an idea how this affects users willingness to wait for a page to load? How much longer is the average surfer willing to wait, if at all?
I assume if a person is looking for pictures they must be willing to wait longer. But I don't want to stretch it too far.
I've tried to keep load times to around 15 seconds @ 56K, which is easy enough to do with a single image, but gets a lot harder with multi image pages. And multiple images add a lot of value to the page.
| 7:46 pm on Dec 2, 2002 (gmt 0)|
This is where image preload scripts and css come into play. The goal here would be to preload your images and then make sure that your main readable content is the first to load while users are waiting for the graphics.
This is difficult to do in a table based environment, but it can be done. With css, it's a breeze! ;)
I shoot for an average of 15 seconds over a clean 56k. If I can make pages with graphics load under 10 seconds, that is the ultimate goal. Another thing to consider is optimizing your graphics to their fullest. Get all those .gifs down to 2, 4, 8, 16, and 32 color palettes. Take a second look at those jpgs and see if you can reduce the quality another notch or so. All this adds up and makes for a quicker loading page.
On a side note, I've noticed sites built with css loading much quicker over a modem connection than those based on tables. Less code for the browser to render and immediate presentation of main content while other things are still loading.
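A minimal sketch of the content-first-plus-preload idea described above (file names and ids are made up): the readable text sits first in the markup, and a small script warms the browser cache with the decorative images once the page is usable.

```html
<!-- Hypothetical page skeleton: readable content first in the source,
     decorative graphics preloaded after the page has rendered. -->
<html>
<head>
  <link rel="stylesheet" href="site.css">
  <script type="text/javascript">
    // Preload decorative images into the browser cache.
    var imgs = ["nav-on.gif", "nav-off.gif", "banner.jpg"];
    function preload() {
      for (var i = 0; i < imgs.length; i++) {
        var im = new Image();
        im.src = imgs[i];
      }
    }
  </script>
</head>
<body onload="preload()">
  <div id="content">Main readable text comes first in the markup...</div>
  <div id="nav"><!-- graphic-heavy navigation fills in afterwards --></div>
</body>
</html>
```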
| 7:52 pm on Dec 2, 2002 (gmt 0)|
The usual way round that problem is to provide thumbnail images on one page (small version of the pictures) which link to the big brother version. It would also be nice to indicate the file size and download time @ 56k next to each one.
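A sketch of that thumbnail approach (file names, sizes, and alt text are invented for illustration): each small image links to the full-size photo, with the weight and approximate 56k download time noted beside it.

```html
<!-- Hypothetical gallery entry: thumbnail links to the full photo. -->
<a href="photo1-full.jpg">
  <img src="photo1-thumb.jpg" width="120" height="90"
       alt="Harbour at dawn" border="0">
</a>
<br>photo1-full.jpg (85 KB, approx. 20 sec @ 56k)
```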
I disagree on the 15 secs thing. I think 10 seconds is pushing it. Research suggests:
0-1 seconds, the user hardly notices.
1-10 seconds, you still have the user's attention.
11+ seconds, the user's attention has gone elsewhere, probably to another site.
I try to keep file size of each page around 5-10k (max 20). There's an overhead when a user views the first page because of downloading the style sheet(s) so initial download is around 15-20k (max 30). That's not including images but if all images have height and width set the page can be displayed while downloading them.
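The height/width point is worth showing concretely (file name is illustrative): with both attributes declared, the browser can reserve the image's space and flow the text immediately, rather than reflowing the page as each image arrives.

```html
<!-- Declared dimensions let the browser lay out the page before the
     image bytes arrive; the text below renders straight away. -->
<img src="header.gif" width="400" height="60" alt="Widgets Inc.">
<p>This text displays before the image has finished downloading.</p>
```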
| 7:59 pm on Dec 2, 2002 (gmt 0)|
|I disagree on the 15 secs thing. I think 10 seconds is pushing it. Research suggests: |
I would too if it took 15 seconds to see any content. But, if you can get the readable content to load within a couple of seconds and then while the user is reading the rest of the page is loading, then you've accomplished your goal. Typically the visitor is there for information.
In a graphics environment, like the one mentioned above, the use of thumbnails is excellent advice.
Note: The browser renders the code from top to bottom. If you've got a bunch of markup sitting there right after your <body> tag then it needs to load all of that first. Again, the goal is to position your content as close as possible to the opening <body> tag to avoid delays.
|I try to keep file size of each page around 5-10k. |
Very few graphically appealing pages out there in the 5-10k range. ;)
| 8:16 pm on Dec 2, 2002 (gmt 0)|
I agree with 99% of the comments - yes, design for the lowest common denominator (and then add a bit extra :) no, no, stop it). It all depends upon your Target Market. However, the Flash backlash is starting to get to me. The product is great; it's just the planning and implementation that, in the main, are badly done. Use preloaders? No, not if it's a splash screen... It is perfectly possible to stream and have an effective 15-20K splash page that's far more effective at setting "the scene" than a standard HTML page... It just takes a bit of planning. And as for preloaders, well, that's another story...
As for page size, personally I try to stay under 30K - that gives even the AOL midday bunch @56K a chance. The most difficult decision to make is based upon the TM. Ecom sites? Yep, fast to load and in the main (sorry folks) boring. Arty sites have the potential to spin things out and give the viewer an experience. Which has got to be one of the most exciting things about the web - there isn't a standard approach, yet (and long may that be, IMHO). Or am I just a sad idealist!?
But at the end of the day it is the content that we all come back for (who cares about the download time?) - that SURELY is why we are here @WebmasterWorld!
| 8:28 pm on Dec 2, 2002 (gmt 0)|
Thanks. I do have at least some text above the first image. OK, I use tables, but then I can barely spell HTML let alone CSS, so.... :)
I also use height/width specs. So any text in the image table pops up very fast.
And I use multiple tables as I go down the page. I know this adds code, but I did it because I thought putting more than one large image in a single table might be slower - maybe I'm wrong on that.
Thumbnails present a problem I don't know how to get around. But I think that's another thread, as it's unrelated to the size, speed issues.
| 11:35 pm on Dec 2, 2002 (gmt 0)|
So what's the vote here: 10, 15, 30?
Can we all add our best-guess poll?
I say under 30 is OK - what say you all? Let's try to keep these poll posts short...
| 11:44 pm on Dec 2, 2002 (gmt 0)|
|So whats the vote here 10, 15, 30? |
The smaller the better. The magic number that has been discussed here in the past is 40k. To get an accurate size of your web pages, try using Brett's Web Page Size Checker [searchengineworld.com], great little tool!
If you are truly concerned about the load time of your pages, then keep them small. If you position your html content properly, then keeping them real small is not an issue. Again, if you can render content first while other items are loading in the background, you are that much further ahead.
Based on almost 7 years of web design, I'll admit that keeping pages under 30k is a challenge when you are working with graphically rich navigation elements. If you have one image and the rest of it is text, keeping pages under 20k should be no problem. The 5-10k range is probably not a viable solution for most of us.
| 11:58 pm on Dec 2, 2002 (gmt 0)|
|Again, if you can render content first while other items are loading in the background, you are that much further ahead. |
I agree with this. I have a slow connection but I know it's slow. So I'm used to the idea that stuff won't appear instantly. But at the same time I don't want to sit looking at a blank page so having something appear pretty fast is important. Having text appear first is a good way to do this.
I've even been known to sit waiting for a Flash animation, as long as I've chosen to do so (i.e. it's not an intro page forced upon me with no text link) and as long as there's something to tell me it's loading and how long it will take. Then I know something is happening and I can look at another window or something while it loads.
But then if it's a site I want to read and interact with a lot (like this one), I don't want to be waiting ages for each page to download. Content is the most important thing here and I'm not prepared to wait while the page sorts itself out before I can begin reading.
So I don't think there are hard and fast rules exactly, it varies depending on the site. But it's certainly something that needs to be considered.
| 12:12 am on Dec 3, 2002 (gmt 0)|
Another thing to keep in mind is that a lot of 56k surfers turn images off because they are tired of waiting for pages to load. If they find a site they like, they might enable images to see what is there.
Somewhere in this thread someone mentioned splitting their content up into separate tables. Smart move! If you have everything in one all encompassing table, the browser needs to read everything before rendering it. If you have 4 tables going from top to bottom, the browser will render the tables in their order; table 1, table 2, table 3 and table 4.
Now, if you were using css...
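The table-splitting trick described above, sketched with placeholder content: because a browser waits for a table's closing tag before painting it, four small closed tables render band by band as they arrive, where one giant table would leave the screen blank until the end.

```html
<!-- One table per content band: each paints as soon as its
     closing tag arrives, top to bottom. -->
<table width="100%"><tr><td><!-- header / logo --></td></tr></table>
<table width="100%"><tr><td><!-- intro text --></td></tr></table>
<table width="100%"><tr><td><!-- photo block --></td></tr></table>
<table width="100%"><tr><td><!-- footer --></td></tr></table>
```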
| 12:50 am on Dec 3, 2002 (gmt 0)|
I've found that sometimes the simplest option is the correct one. I refused to use HTML compressors for some time, but one month ago I finally installed one, and I think it's the best idea I've ever had related to my web site. Our bandwidth usage has been reduced to 40% of what it was (15% if we only consider text pages), and surfing within our site with a 56K modem is a gratifying experience: it's fast even when you load a 150KB page like our forum (with compression it's reduced to about 20KB).
One last thing: the CPU load related to the compression is imperceptible.
|[...] compresses the data (using the GZip compression format) right before it leaves the server. The data is then transfered to the client in its compressed form. The client then decompresses it. Most recent browsers handle this by default and for the ones that don't, XCompress will seamlessly downgrade to a non-compressed data transmission mode |
If someone wants to know the software company please send me a sticky mail, although I think most SEO's and webmasters already know the software I'm talking about.
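Compression ratios like the 150KB-to-20KB figure above are easy to reproduce with any gzip implementation; a small Python sketch (the sample markup is invented, and table-heavy HTML is unusually repetitive, so it compresses especially well):

```python
import gzip

# Invented sample page: repeated table-row markup, the kind of
# HTML that gzip shrinks dramatically.
page = ("<tr><td class='post'>Lorem ipsum dolor sit amet, "
        "consectetur adipiscing elit.</td></tr>\n" * 500).encode("ascii")

compressed = gzip.compress(page)
print(len(page), "bytes raw,", len(compressed), "bytes gzipped")
```

Real forum pages are less uniform than this sample, so expect smaller (but still large) savings in practice.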
| 3:26 am on Dec 3, 2002 (gmt 0)|
>>the software company<<
Use mod_gzip for Apache!
It's open-source & it's great. :)
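For anyone trying mod_gzip, a minimal httpd.conf sketch (directive values are illustrative; check the mod_gzip documentation for your build): compress text responses, and skip images, which are already compressed.

```apache
<IfModule mod_gzip.c>
    mod_gzip_on                 Yes
    # Compress HTML and other text responses
    mod_gzip_item_include       mime  ^text/.*
    # Leave images alone - they are already compressed
    mod_gzip_item_exclude       file  \.(gif|jpe?g|png)$
    # Don't bother with tiny responses
    mod_gzip_minimum_file_size  500
</IfModule>
```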
| 3:57 am on Dec 3, 2002 (gmt 0)|
My home page is 23.5K, but most of my pages are in the 35K range, and I've had quite a few compliments from users (including European users with slow connections) about how quickly my pages load. So I'd guess that 35K is quite acceptable, at least for an editorial site where people aren't constantly clicking from page to page. It might be too much for things like online catalogs where users do a lot of clicking.
Filesize is just one of the things that determine how fast a page displays, of course. An overloaded server or inadequate bandwidth at the server or Web-hosting service can be a much bigger problem than page size.
| 7:00 am on Dec 3, 2002 (gmt 0)|
This broadband article offers an interesting insight into (home) user requirements vs page sizes, load time et al.
| 9:10 am on Dec 3, 2002 (gmt 0)|
Unfortunately we use IIS, so we have to use proprietary software. Nevertheless I'm willing to pay for it: it's fast even on an old machine, it compresses a lot, it's not expensive, and we haven't had any complaints from our users (about 150K a month).
| 1:11 pm on Dec 3, 2002 (gmt 0)|
I have to say I am with richardb - there are lots of people who claim they know how to use Flash, but they don't. They really don't! Friends of mine make the strong claim too, and when I watch them use it, it's clear they can get the shape tweens and the scripts working; but when it comes to optimisation it's all self-taught, so they don't know the simple steps that build those "high impact vector based websites" the box claims.
That's the price of tutorials that 'give you the goods' in 5 simple lessons: people get complacent, thinking they know it all, and then badly developed Flash sites become industry benchmarks.
Flash is a great product.
As for page sizes I tend to go for about 14K.
| 1:21 pm on Dec 3, 2002 (gmt 0)|
pageoneresults is correct - trying to get an average page below 10k is a very difficult challenge. Not impossible, but when you are up against a deadline it's difficult to justify spending so much time on it.
10-15 seconds seems like the maximum range of download time... although this is a difficult thing to measure, given network traffic and ISP bandwidth. As caine said: "The internet a.k.a. the biggest collision domain known to man."
Flash is a very powerful tool, unfortunately it is oftentimes wielded by the inexperienced, which is when we get these large intro pages and long preload waits.
There are ways to streamline sites: use of external CSS and JS files, trimming down graphics, etc. Also, as pageoneresults mentioned, good use of layers can mean that your content text gets displayed while the bells and whistles load. It keeps people's attention while the frills come down.
| 8:47 pm on Dec 3, 2002 (gmt 0)|
Me, I'm gone in 5 secs if nothing appears. I'm gone quite often ;) (I'm talking of personal surfing as opposed to research)
I read, I think on Jakob Nielsen's site, that the average surfer gets p****d waiting after 10 secs these days, where in the past it was 5 secs. I'm with the old school.
Question - does preloading mean images load immediately, with no effect on the download of other stuff at the same time? Or does it simply mean you run slow waiting for stuff you may never see to download? Or am I being silly (please don't answer the last bit :)