Forum Library, Charter, Moderators: ergophobe

Accessibility and Usability Forum

This 46 message thread spans 2 pages (this is page 2).
Acceptable site speed ?
How many seconds is acceptable to wait?

 1:29 am on Feb 11, 2010 (gmt 0)

This was discussed before but most threads were old.

What is an acceptable amount of time to wait for a page to load?

These can be two different things: the time before PHP starts printing the HTML code, and the time before all images and page content finish loading.

In PHP, I start the counter at the very beginning of the code and print the timer in the footer after everything else, so it counts the MySQL queries and the printing of all the page's HTML too.
The time is between 1 and 3 (sometimes 4) seconds.
Is this acceptable?

If it matters, the site is a file directory with many users printing lists of around 10 listings (files) per page (title, description, rating, thumbs).
Edit: site is on a dedicated server with 1 other similar site.
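The PHP start-to-footer timer pattern can be sketched like this (a self-contained shell sketch, not the poster's actual code; the sleep just stands in for the MySQL queries and HTML generation):

```shell
# Same idea as the PHP timer: note the clock before any work starts,
# do the work, then report elapsed milliseconds in the "footer".
start=$(date +%s%N)          # nanoseconds since the epoch (GNU date)
sleep 0.2                    # stand-in for the MySQL queries + HTML output
end=$(date +%s%N)
elapsed_ms=$(( (end - start) / 1000000 ))
echo "page generated in ${elapsed_ms} ms"
```

Note this only measures server-side generation time; it says nothing about how long images and other page content take to arrive in the browser.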



 3:22 pm on Feb 15, 2010 (gmt 0)

Analytics and AdSense ARE gzipped if the user's browser supports it. Google employees have confirmed this in the Page Speed Google Group several times. It's just that the WMT Site Performance tool doesn't see this, because Google isn't feeding these files gzipped to its own bots.
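This is easy to verify yourself: request the file with the same Accept-Encoding header a gzip-capable browser sends and inspect the response headers (illustrative command; the URL is the ga.js location as of this thread):

```shell
# Ask for ga.js the way a gzip-capable browser would, discard the body,
# and dump the response headers; a "Content-Encoding: gzip" line means
# the file really is being served compressed.
curl -s -o /dev/null -D - \
  -H 'Accept-Encoding: gzip' \
  http://www.google-analytics.com/ga.js | grep -i 'content-encoding'
```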


 4:04 pm on Feb 15, 2010 (gmt 0)

johnnie & KenB - there are some other areas where Google code is being reported as heftier than necessary: the Google Maps embed, for instance. Further, Google Analytics actually rolled out performance enhancements AFTER the Page Speed and WMT utilities came out, so they had just a bit of egg on their face when their own tools reported some inefficiencies.


 4:16 pm on Feb 15, 2010 (gmt 0)

There are a few different areas where Google's speed diagnostics and documentation have sort of left holes.

We've heard from Google that "they want to make the internet speedier" (my paraphrasing), but they haven't remarked much around the line in the sand they've drawn in the Webmaster Tools graphs. The line is at the roughly 1.4 second mark -- a goal that is very difficult to reach for most sites. Is that merely a nice goal, or do sites not meeting that get some mild penalty?

Further, shouldn't that goal be adjusted a little according to industry? A short page with a few nav links ought to be treated differently from a page full of listing lookups, and maybe airline travel sites, with their realtime lookups across many active databases, should be expected to be slower?

I've used Netmechanic's diagnostic tools for years, and they give estimates of how long a page takes to appear at various internet connection speeds. What connection speed is Google gearing towards?

The Google tools mainly focus on browser rendering factors. But there are a number of other components as well, for instance the huge amount of network pipe that site pages must traverse to reach end users' browsers before they even begin rendering. Does Google plan to assist with that part of the equation in some way? Large companies contract with Akamai for distributed hosting/networking to reach users at the edge more quickly, but the majority of websites don't have this capability. (Admittedly, Amazon now offers a lower-cost, pay-as-you-go model of Content Delivery Network, or CDN.)

Or, is Google planning to roll out their own CDN to enable the faster internet they're espousing? It's maybe putting the cart before the horse just a bit to place all the focus on the UI rendering factors.


 5:36 pm on Feb 15, 2010 (gmt 0)

my major speed hogs are google's products

Same here. Having Analytics and AdSense (one ad unit and one link unit) on a page, for me, results in seven additional javascript files downloaded from Google. This is horrible, but seemingly unavoidable. They've got to find a way to combine a lot of that code into fewer files.


 5:54 pm on Feb 15, 2010 (gmt 0)

I've done a speed test on my pages for the first time today. Never goes over 0.1s per page. I'm estimating about 0.25s to load the page on my screen. How can Google come up with 8 seconds?


 8:54 pm on Feb 15, 2010 (gmt 0)

Are your datapoints between 0 and 100? If they are, the load time that Google shows may not be very accurate.


 10:04 pm on Feb 15, 2010 (gmt 0)


Your numbers don't sound realistic even on the fastest computer using the Opera web browser. How do you calculate your page speeds?

Page speed is measured from the moment the user clicks a link, submits a URL from the address bar, etc., until the page has completed rendering. There is going to be latency for every file needed to render a web page, simply because the browser must send a request for each file to the server. Then there is the processing time the server needs to call up or generate the requested file, then the time to transmit the file from the server to the browser, and finally the time the browser spends rendering the page.
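Most of those phases can be measured separately; curl's --write-out variables map onto them almost one-to-one (illustrative command; www.example.com is just a placeholder):

```shell
# Break a single fetch into the phases above: DNS lookup, TCP connect,
# time to first byte (request latency + server processing), and total
# transfer time. Rendering time is the one piece curl cannot see.
curl -o /dev/null -s -w 'dns:        %{time_namelookup}s
connect:    %{time_connect}s
first byte: %{time_starttransfer}s
total:      %{time_total}s
' http://www.example.com/
```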


 11:05 pm on Feb 15, 2010 (gmt 0)

Same here. Having Analytics and AdSense (one ad unit and one link unit) on a page, for me, results in seven additional javascript files downloaded from Google. This is horrible, but seemingly unavoidable. They've got to find a way to combine a lot of that code into fewer files.

Hah. Try adding Google Ad Manager to the mix. Pandemonium will ensue.


 4:21 am on Feb 16, 2010 (gmt 0)

speedup to be gained by optimising the regex patterns in the rules

G1smd is referring to threads like these:


Can't find the one in supporters on the topic, which was the best of them all. Anyone have a link?


 5:44 am on Feb 16, 2010 (gmt 0)

I don't see why they can't selectively gzip analytics and adsense.

As far as I know, it's gzipped. I just checked it on webpagetest.org.


 4:55 pm on Feb 16, 2010 (gmt 0)

they haven't remarked much around the line in the sand they've drawn in the Webmaster Tools graphs. The line is at the roughly 1.4 second mark

Silvery - interesting observations. Anyone else have something to add there? I think that's a topic worth exploring a bit more.

From a usability standpoint, 1.4 seconds is still rather long. Jakob Nielsen would have us use these numbers [useit.com]:
Over 0.1 seconds, the user notices the latency.
Over 1 second, the user's train of thought is interrupted.
Over 10 seconds, you go past the user's span of attention.

Those numbers are based on studies going back to the 1960s, but I suspect the internet has more variability than, say, a word processing program.

1. Internet surfing is commonly not task based, but a diversion
2. Except for services that you might be locked into (hotmail, your bank), in most cases if one site has great information but slow response times, and another has almost-great information and great response times, the user can just jump ship, in a way that was probably not possible in the enterprise-level apps that were likely the basis for the usability testing of the 1960s through early 1990s.

So to bring it back to Silvery's comments: how do people take the "line in the sand" from Google?


 7:05 am on Feb 17, 2010 (gmt 0)

If you're on Windows, Mac, Linux or Solaris you can use the Apache Friends XAMPP package (more complete - includes PERL, Sendmail, FTP and more).

Sorry to jump so far back in the thread, but if you are using Linux it is easier to install via the package manager, and doing it that way means you will get automatic updates. You may only need to install a single package (Ubuntu and Mandriva have LAMP-stack meta packages; Mandriva also has a GUI for configuring Apache, but only for simple stuff).
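On a Debian-based distro, for example, that can look like the following (package names vary by release; around the time of this thread the PHP package was php5):

```shell
# Install the LAMP stack through the package manager so it gets
# security updates along with the rest of the system.
sudo apt-get update
sudo apt-get install apache2 mysql-server php5 libapache2-mod-php5
# Ubuntu's meta-package shortcut for the same stack:
# sudo tasksel install lamp-server
```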


 12:14 am on Mar 6, 2010 (gmt 0)

Personally, I never let my page load times exceed 3 seconds, and I always try to keep them as near to 1 second as possible. Sites that take over 3 seconds to load tend to have a big drop-off.

For measuring, I use Firebug + the YSlow plugin.


 11:20 pm on Mar 13, 2010 (gmt 0)

I mentioned earlier that a page with static text and a PHP counter at the end took some time to load. In the latest tests it took 350ms to load, while on another site the exact same STATIC page took 1ms or less.

By disabling SSL support in Plesk for that domain I gained 350ms :D

EDIT: I probably spoke too soon. The speedup came from enabling gzip compression, but for some reason the changes only applied when I disabled the SSH support (which I didn't use anyway).

[edited by: adrianTNT at 11:57 pm (utc) on Mar 13, 2010]
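The kind of saving gzip gives on typical HTML is easy to demonstrate locally, since markup is highly repetitive (a self-contained sketch, not the poster's actual pages):

```shell
# Generate 200 lines of repetitive listing markup, compress a copy of it,
# and compare the byte counts; repetitive HTML compresses extremely well.
yes '<div class="listing"><span>widget title</span></div>' | head -n 200 > page.html
gzip -k -f page.html          # -k keeps the original alongside page.html.gz
orig=$(wc -c < page.html)
gz=$(wc -c < page.html.gz)
echo "original: ${orig} bytes, gzipped: ${gz} bytes"
```

A saving of that order on every HTML response is why enabling compression can shave hundreds of milliseconds off a page.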


 11:48 pm on Mar 13, 2010 (gmt 0)

IMHO, as soon as you start measuring in seconds your site is way too slow.

A page should load in a fraction of a second.


 11:59 pm on Mar 13, 2010 (gmt 0)

I was so shocked by the speed improvement that I didn't write properly: I meant SSL, not SSH. Doh! Either way, the gzip made the difference.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved