| 1:23 am on Mar 29, 2001 (gmt 0)|
That's about the same as my typical user profile. If I have titles and descriptions in the SERP that are dead-on target, then I think I can risk expanding the "time before bailing" some -maybe to 25 or 30 seconds.
| 1:38 am on Mar 29, 2001 (gmt 0)|
>>>>time before bailing
I've come to the conclusion from my log files that you should save the heavy stuff for deeper in the site, when you have them hooked. The index should be light, say 25K max, with an easy navigation bar to the real content. I don't think you can go wrong this way.
| 1:53 am on Mar 29, 2001 (gmt 0)|
>>>>time before bailing
Yeah, that's a very technical term, BH. Hated to use it, scares the lurkers.
>The index should be light say, 25K max
I used to think that way, but with big sites particularly, who can say they'll enter the site through the index page? If I can direct their sequence through the site, I can incrementally load their cache, and by the 4th or 5th page they're opening 125k pages as if they were 25k. BUT I pity the poor guy who surfs in on a 3-word key and hits page 5 first.
| 1:57 am on Mar 29, 2001 (gmt 0)|
I've got a 33.6 dial up at home... and it doesn't even come close to connecting at full speed. It's a darn good day when we connect at 28.8.
That said, I honestly can't imagine load times are as huge a problem these days as people say they are. I'm used to waiting. If I'm at a site I really want to see, I'll wait over a minute for it to load.
I think a lot of home dial-up users are increasingly used to waiting... as businesses and big cities have joined the broadband revolution and forgotten about the rest of us, we've *had* to get used to waiting.
However, when I come across a good-looking, fast-loading site at home, I am super-duper *extra* impressed.
| 2:08 am on Mar 29, 2001 (gmt 0)|
>It's a darn good day when we connect at 28.8.
Until the last 4 or 5 months, it was the same for the mid-Atlantic states, too. It's a little better, but not much. You make a good point.
| 2:09 am on Mar 29, 2001 (gmt 0)|
>>>>load their cache and by the 4th or 5th page
That's smart, rc. I get a lot of hits right smack dab into the middle of the java applet, flash-enhanced, picture-spinning virtual reality tours on 3 to 6 word keys, and I get some "back button bippity boppin" but hey....cain't win em all.
| 2:21 am on Mar 29, 2001 (gmt 0)|
Translation (or rencke will be kickin' sand in mod forums tomorrow):
"back button bippity boppin" = "do not wait for page to load"
| 2:33 am on Mar 29, 2001 (gmt 0)|
Thanks rc...sometimes I forget and get too technical.
| 12:05 pm on Mar 29, 2001 (gmt 0)|
I generally don't completely abandon a site because of size, but I do visit much less frequently if I know it's going to be more than about 20 seconds to load the page that receives the updates/articles/whatever.
| 3:35 pm on Mar 29, 2001 (gmt 0)|
>> when I come across a good-looking, fast-loading site at home, I am super-duper *extra* immpressed.
That's the main point. Keep the page weight down and you've given your site an advantage.
Many of the competitors for my clients ignore this, and they're giving our sites an advantage that they don't need to. I'm not complaining! In fact, I hope they continue to offer pages that take 30 seconds or more to load.
There are many factors which can slow down page loading besides file size: calling cookie info from a slow database, net congestion, server overload (especially a problem when a page calls from various servers -- it seems like one of them is almost guaranteed to be slow). Given this, it's very important to optimize the total page weight. Every kb you save may help to bring in many more prospects over time.
Even if people are growing more willing to wait, that's still no license for the developer and designer to take great liberties with file size. A lot of the slow pages look to me like no effort was made to optimize at all.
The web is a medium with bandwidth limitations, and these will be with us for a while. A designer who doesn't take the time to learn about the medium is not a good designer, they are self-indulgent. It's rare that the "artistic effect" is worth the wait.
Edited by: tedster
| 6:39 pm on Mar 29, 2001 (gmt 0)|
calling [...] info from a slow database
OK... that's one that will get me to leave a site... if the d@mn ad network or tracking server is running SO slow that the entire page hangs over one lousy banner, I'll leave.
| 9:27 pm on Mar 29, 2001 (gmt 0)|
I got to thinking about this a great deal this week (thanks for the inspiration).
New Tool over at SEW: Webpage Size Checker [searchengineworld.com]
WebPage Size and Speed
I am a huge believer in keeping pages small. All the studies show that users are very sensitive to page size and/or download time.
Speed is Life
Page load speed. I am convinced it is everything. It is the difference between a successful site and an unsuccessful one. It is not an easy task to reduce page size, but I try to keep all pages under 20k of HTML, and less than 30k total with graphics.
For an example of how to do it: see Google and Yahoo. For an example of how not to do it, see CNN and ESPN.
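As a rough sketch of how one might audit that kind of 20k/30k budget, here's a helper that totals a page's HTML bytes and lists the external objects it pulls in (the function names and the 20k limit are my own choices, not a standard tool, and asset sizes would still need fetching separately):

```python
from html.parser import HTMLParser

class AssetCollector(HTMLParser):
    """Collect URLs of external objects: images, scripts, stylesheets."""
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "img" and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "script" and "src" in attrs:
            self.assets.append(attrs["src"])
        elif tag == "link" and attrs.get("rel") == "stylesheet" and "href" in attrs:
            self.assets.append(attrs["href"])

def page_budget(html, html_limit=20_000):
    """Return (html_bytes, asset_urls, within_html_budget)."""
    size = len(html.encode("utf-8"))
    collector = AssetCollector()
    collector.feed(html)
    return size, collector.assets, size <= html_limit
```

Run it over your templates and you get a quick list of every extra request the page will trigger, which is often more revealing than the byte count alone.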
It needs to work in 100% of the browsers on the net. That includes user agents such as Scooter, IE, Netscape, Lynx, Opera, Slurp, and Googlebot. That doesn't mean it has to look the same in all of them, just that those agents can get to all the content. Obviously differences will exist in things such as graphic support.
Leading edge or nonstandard technology. There is a reason they call it the bleeding edge. Stay far away from anything nonstandard, or anything that requires your user to do extra work (that includes Shockwave, Java, and other embedded-tech attempts such as ActiveX or VBScript). You can't afford to lose (or slight) 10% of your audience. Granted, if you are running something such as a WAP site, WML would be appropriate.
Design for who?
Who is the typical user? Much of the common wisdom is that the average user is on around a P2 at 333MHz with 32 to 64MB of RAM and 56k dialup. If that is the "typical" web user, that means there are a bunch of users running less than that. In order to be inclusive, you have to design for a whole lot less than that.
Tips from a Speed Freak:
Because of modem compression, HTML will download about twice as fast as graphics for most people (e.g. 20k of HTML will download about as fast as a 10k JPEG).
If your server supports it, you might try experimenting with Apache Mod_GZIP. It can reduce your html bandwidth and download times by 50%.
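The reason compression pays off so much more for HTML than for graphics is that markup is repetitive while JPEG data is already compressed. A minimal sketch using Python's gzip module makes the gap visible (the sample markup is invented, and random bytes stand in for JPEG data):

```python
import gzip
import os

# Repetitive markup, typical of a table-heavy 2001-era page
html = b"<tr><td>widget</td><td>in stock</td></tr>\n" * 200

# JPEG data is already compressed, so it looks statistically random
jpeg_like = os.urandom(len(html))

html_ratio = len(gzip.compress(html)) / len(html)
binary_ratio = len(gzip.compress(jpeg_like)) / len(jpeg_like)

print(f"HTML compresses to {html_ratio:.0%} of original size")
print(f"Random (JPEG-like) bytes compress to {binary_ratio:.0%}")
```

The markup shrinks dramatically while the random bytes barely budge, which is why mod_gzip helps your HTML but does nothing for your images.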
Try simulating a walk through your site at 28.8k. Assume 20% of your users are on 28.8k modems or on systems that perform like one.
Take your logs and resolve the IP addresses. Throw out known cable users (home, rr) and any ISP domain name with the word "cable" or "dsl" in it. Assume the users that are left are on 28.8k-56k modems; that is how many users are connecting to your site at lower speeds. Then compare the time from the first request for the HTML to the last "object" request on that page (a graphic). That is a good indicator of how long the page is taking to download for your users.
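Those log-mining steps could be sketched roughly like this, assuming combined-format log lines with hostnames already resolved (the sample format, the broadband hostname hints, and the function name are my own assumptions; real logs will need tuning):

```python
import re
from datetime import datetime

# host, timestamp, and requested path from a combined-format log line
LOG_LINE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "GET (\S+)')

# hostname fragments suggesting broadband; throw these visitors out
BROADBAND_HINTS = ("cable", "dsl", "home.com", "rr.com")

def dialup_page_times(lines):
    """Per presumed-dialup host: seconds from first request to last object."""
    times = {}
    for line in lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        host, stamp, path = m.groups()
        if any(hint in host for hint in BROADBAND_HINTS):
            continue  # likely cable/DSL, skip
        t = datetime.strptime(stamp, "%d/%b/%Y:%H:%M:%S %z")
        times.setdefault(host, []).append(t)
    return {h: (max(ts) - min(ts)).total_seconds() for h, ts in times.items()}
```

Feed it one visitor's page view (the HTML request plus its graphics) and the number you get back is roughly how long that page took to arrive on a slow connection.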
Watch the number of page reloads you are getting; those can also point to a server problem. Make sure your logging or counter software doesn't automatically throw out duplicate requests - that is important data.
Try a big page and then try an ultra-small page, and notice the difference in page views per user. Between a 10-15k page and a 40k page the difference will be dramatic (it may be good to have medical personnel handy).
| 9:40 pm on Mar 29, 2001 (gmt 0)|
Last year's entries and winners really give one something to think about...
| 10:28 pm on Mar 29, 2001 (gmt 0)|
Wow! Good info. I, too, was thinking about how to measure pages that users balked at. In ASP, there is something you can put into your scripts at the bottom of the page to see if your customer is still there. I haven't really tried it, but it should work:
If Not Response.IsClientConnected Then
    ' Log a page that the visitor didn't wait for here
End If
| 2:31 am on Apr 14, 2001 (gmt 0)|
Another thing I look at in relation to download times is display time. If you have nested tables, the browser has to wait for the complete page to download before it renders anything. Also, since we are using large quantities of text for SEO purposes and non-nested tables, this text will display fast, catching the person's attention (hopefully) so they will wait for the larger graphics. As much as possible I like to get a tag line in a heading tag, for two purposes: 1) SEs like the heading tag w/ keywords; 2) if you do a search for "widgets" and the first thing you see is a bold "We have your Widgets", you'll stay to see a nice picture of said widget.
| 4:12 pm on Apr 14, 2001 (gmt 0)|
Ah, yes, display time. You're right, this is what really matters, not download time/file size.
The rendering time wait for nested tables is worse in Netscape (4.7 and earlier) than it is for IE, Opera or NN6.
That said, if you can get a header tag and critical text positioned outside your tables (or better still by using CSS for positioning) then the text flows on the page very fast, giving your visitors something to do while the graphics load.
Yet another display time factor I've stumbled on recently is the way IE renders a progressive jpeg on a PC. It does NOT render the image progressively, but waits until the entire file is downloaded and then displays the image all at once.
This means, paradoxically, that the larger file (a standard JPEG) will begin showing on screen quicker than a progressive one! Netscape handles progressives "correctly", but since the majority uses IE...
I'm sure that fast, informative pages are better for business than beautiful or tricky pages.
| 8:04 pm on Apr 14, 2001 (gmt 0)|
>I pity the poor guy that surfs in on a 3 word key and hits page 5 first.
This may sound like heresy, but why not use "noindex" on a heavy graphics page. That would cut down the chances of surfers coming directly in to it. You could then take all the keywords from that page, put them on another page, preload from there and then lead in to the graphics page.
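A minimal sketch of that approach, assuming the engines you care about honor the robots meta tag:

```html
<!-- in the <head> of the graphics-heavy page -->
<meta name="robots" content="noindex, follow">
```

The "follow" keeps the page's links crawlable even though the page itself stays out of the index, so the lighter keyword page remains the entry point.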
| 10:53 pm on Apr 15, 2001 (gmt 0)|
If you have to display lots of text information on a page, I believe that if you use lots of separate tables stacked on top of each other, as opposed to one table with loads of cells, they are rendered more quickly.
Maybe this is different on different browsers
but I have played with it and I think it's worth a mention.
| 5:34 pm on Apr 16, 2001 (gmt 0)|
Right, NiceBloke. It is the "nested" that kills you. The way around it, if you need nesting, is to use absolute widths on the nested tables while letting the main table "float" in size.
| 9:26 pm on May 11, 2001 (gmt 0)|
www.the5k.org just announced their 2001 winners here [the5k.org]
| 10:03 pm on May 11, 2001 (gmt 0)|
I'm getting 404's for all the entries :(