Welcome to WebmasterWorld
Forum Moderators: phranque
joined:July 2, 2000
I just did a month-long experiment going from a low-bandwidth page to a busy high-bandwidth page to see if there really is that big of a difference between the two. On the high-bandwidth page I added a bunch of related links to other sites that provide the same products, to increase traffic to those sites. The low-bandwidth page basically just had product information and a "purchase here" type of button.
Results: The busy page takes twice as many clicks to generate a sale as the clean & simple page.
Perhaps this is why the SEs with less extraneous information are performing better traffic-wise (Av being the exception) than their counterparts.
I took it to heart on almost all of my sites in the last three months, and the results couldn't be more dramatic: a 30-40% increase in page views across the board, and that is after accounting for normal growth in users.
Those sites that had the largest homepages to begin with are seeing the biggest increases. One site went from a 50k homepage to a 15k homepage, and hits have tripled on that site with no appreciable difference in rankings or promotion activity.
I think it is the biggest lesson anyone within earshot can learn from these forums. There are way too many huge pages on the net. You have to be a pretty large, well-known site with tons of bulletproof promotion programs to sustain a 50k+ home page. Sure, there are sites where large pages are required (graphics sites, etc.), but for the most part, if your index page is over 30k (with graphics), you are losing users big time.
But reducing page sizes is only part of the answer. My experience is that the typical user will leave a site after 10 or 15 seconds - especially if all he or she sees is a blank page. I'm pretty dumb when it comes to writing code, but I know there are sites out there that leave users staring at a blank page until all of the graphics and ads have loaded. If you can give users something to read while the other 'stuff' is loading, they can evaluate your content and decide whether or not to stay. (If the text content is relevant and enticing, they probably will stay.)
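For what it's worth, the "something to read first" idea is mostly a matter of source order: real text at the top, heavy stuff at the bottom. A minimal sketch (site name, file names, and content are invented for illustration, not from any poster's site):

```html
<html>
<head><title>Acme Widgets</title></head>
<body>
  <!-- Real, readable content first: it renders while the rest downloads -->
  <h1>Acme Widgets</h1>
  <p>Hand-forged widgets, shipped same day. Browse the catalog
     below or jump straight to the order form.</p>

  <!-- Heavier items come after the text -->
  <img src="widget.jpg" width="200" height="150" alt="Photo of a widget">

  <!-- Third-party ad/counter code last, so a slow ad server
       can't hold the whole page hostage -->
</body>
</html>
```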
So many sites host with the cheapest ISP (and never think about how slowly that ISP will deliver pages during peak hours). Or they sign up for banner ads without checking the capacity and performance of the third-party server delivering them, or without checking whether the banner-ad code requires the ad to load fully before any other page content can be viewed. Or they paste in time stickers, weather stickers, external tracking icons, or external counters without investigating how those servers perform at peak usage times.
My sites are on an independent ISP (not a national one - a local/regional one). They're not the cheapest around. But, over the past 3 years, they've been very reliable and have added capacity in proportion to their business growth and the growth (and demands) of the internet. I have no external stuff on my site - everything on my site rests at the ISP's server. My 23k index page always loads fast. Single source - single responsibility. It works for me.
Added 30 minutes later ... PS - Brett, you've inspired me. I just cut my index page size to 17k.
>which leave users with a blank page until all of the graphics and ads are loaded
A little off topic, and I'm not overly clear on the whys, but the most common cause of this is not including height and width attributes on images. If memory serves, browsers render in a couple of passes, text first; with no dimensions available on the first pass to "reserve space" for images, the browser must calculate the space needed during the second pass, so nothing is rendered until the second pass is complete.
Whatever the cause, the phenomenon does happen and, on a commercial site, will surely drive websurfers away.
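The fix being described is simply explicit dimensions on every image. A quick illustration (file name invented):

```html
<!-- With width and height, the browser can reserve the image's box on
     its first pass and render the surrounding text immediately -->
<img src="logo.gif" width="120" height="60" alt="Site logo">

<!-- Without them, the text around the image may wait for a second
     layout pass before anything appears -->
<img src="logo.gif" alt="Site logo">
```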
With the major banner servers, websites can set the maximum ad size (in K) they'll accept, but the lower you set the limit, the more you cut into the available pool of ads.
Now about this "pales and cowers" crap: I changed phenomenon to symptoms in the above paragraph 'cause I'm not sure how to spell phenomenon. We all know a little here; some of us just have more experience at looking like we know a lot! Truth is we're always asking each other to help out answering the tough ones... and learning from it... Remember, we don't expect ya to get more than a little pale - a gulp or two is good, but cowering is considered bad form ;)
I still keep a 28.8 modem to view the sites I work with. When I use that modem to surf the web in general, I get nailed with this observation: the trend seems to be toward slower loading pages. Yes, the culprit is often multiple servers as much as it is file size. I also run into server-side scripts that do Lord-knows-what -- and hold up page content until they're done.
Another factor is the way different browsers work to render a screen, particularly with nested tables. If a table at least starts to fill in, the viewer usually gets some text along with the slowly uncovering picture show.
MSIE is better at these partial fill-ins than Netscape. When tables are deeply nested, Netscape sometimes needs extra time to "think" about them, even after the download is complete -- I've seen as much as 15-20 seconds on earlier Pentiums. And that's even with width and height specified on all the contained images. The page can give the browser too big a job, even if it is valid code!
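A toy version of the difference (markup is mine and exaggerated for effect): the nested layout gives the browser one big sizing job, while the flat version lets each table paint as it arrives.

```html
<!-- Deeply nested: some browsers wait until the outermost
     table's dimensions can be resolved before painting anything -->
<table><tr><td>
  <table><tr><td>
    <table><tr><td>Page content</td></tr></table>
  </td></tr></table>
</td></tr></table>

<!-- Flatter: stacked sibling tables can render one at a time -->
<table><tr><td>Masthead</td></tr></table>
<table><tr><td>Page content</td></tr></table>
```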
Another slow-load culprit can be JPEG images, which must be decompressed client-side after the file is downloaded. Depending on the level of compression, this can add 5-10 seconds between the download and the screen rendering.
My goal as a designer is not any particular file size but giving the visitor a quick screenful, however that is accomplished. There are a lot of tools for doing this, intelligent use of the cache being a big one: even if the page has a total weight of 70kb, if 50kb is already cached by the browser, things are pretty good.
But the trick is to care about it in the first place. As Brett noted, it does matter, and the rewards can be very significant.
We may have some 60-80k pages around here, but since most are 80% pure ASCII text, they get killer 4:1 compression coming across the net. That is another thing to be cognizant of: a 50k jpeg easily takes three times as long to download as 50k of html for most people.
Let's start with the gold standard on usability:
Usableweb.com is a directory of articles related to site usage and design. Without a doubt, one of the most important design sites on the net for webmasters. Simple, fast, usable - no fuss, with high-quality hand-picked links.
Take the dead-center link on SPEED under ISSUES for some great articles about page loading speed.
I think the general consensus is that 30k is the top end. Most importantly, the site must present something to the user's screen within 3 seconds and be fully loaded in under 10. That puts a 56K modem at around 30k of pure html (less with graphics) -- at a real-world throughput of a few KB per second, 10 seconds doesn't buy much more than that.
Some important sites that use right-side navigation:
Wdvl is a fairly major site to be using right-side menus.
So is Intelliforum in this article on Clustered Navigation:
As is EnterpriseDev [enterprisedev.com].
A recent study (Oct 18) on page load speed at NUA (though they quote a study by Zona Research that is two years old, and I find their figures to be too high):
The classic article on Banner Blindness:
15-30% of the web surfs with graphics off:
That kind of tells you why those graphic-based counters are so useless and generate questionable data.
From that same article on page size:
o The 0-10K range qualifies as exemplary
o Pages between 10-20K rate as well-optimized
o The 20-40K range is merely adequate
o 40-60K pages earn a dubious designator
o Anything over 60K is unacceptable
Best quote found:
...e-commerce sites, where "image compels, text sells."
(from Wait for It at wired:
Also avoid the classic Mystery Meat Navigation:
(rc, read that)
And finish it with the old Death to download Ratio [hotwired.lycos.com].
Every page has a link to the home page, the contact page (with emails etc.), the search page, and a text menu at the top a la Brett's SEW that shows how deep you are and lets you click back several levels.
The result: 200% more page views, better rankings, 500% savings on our disk space, and 30% savings on total throughput.
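That kind of every-page text menu might look something like this (the paths and labels here are invented, not from the poster's site):

```html
<!-- Breadcrumb-style text menu showing depth, each level clickable -->
<p>
  <a href="/">Home</a> &gt;
  <a href="/forum/">Forum</a> &gt;
  <a href="/forum/speed/">Page Speed</a>
</p>
<!-- Standard links repeated on every page -->
<p>
  <a href="/contact.html">Contact</a> |
  <a href="/search.html">Search</a>
</p>
```

Being plain text, it costs almost nothing in bytes and renders before anything else on the page.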
I'm a great Jakob fan; we keep our corporate site Jakobised, though our ezine is more graphics-rich with larger pages. But then again, once you are in a magazine and the front page has loaded quickly and you are convinced there may be something useful, you are a bit more patient. On so many sites now I just get bored waiting for the first page to load with all the "impressive" graphics and doo-dahs. Best to give them a really simple, minimalist, and elegant front page that loads in the blink of an eye.