
Home / Forums Index / WebmasterWorld / Accessibility and Usability
Forum Library, Charter, Moderators: ergophobe

Accessibility and Usability Forum

Acceptable site speed?
How many seconds is acceptable to wait?

 1:29 am on Feb 11, 2010 (gmt 0)

This has been discussed before, but most of the threads are old.

What is an acceptable amount of time to wait for page to load?

These can be two different things: the time before PHP starts printing the HTML code, and the time before all images and page content have finished loading.

In PHP, if I start a timer at the very beginning of the script and print it in the footer after everything else, it counts the MySQL queries and the printing of all the page's HTML too.
The time is between 1 and 3 (sometimes 4) seconds.
Is this acceptable?

If it matters, the site is a file directory with many users, printing lists of around 10 listings (files) per page (title, description, rating, thumbnails).
Edit: the site is on a dedicated server with one other similar site.
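For reference, a minimal version of the timer described above looks like this (a sketch; the placeholder work stands in for the real queries and HTML generation):

```php
<?php
// Start the timer before anything else runs.
$start = microtime(true);

// ... MySQL queries and HTML output would happen here ...
usleep(1000); // placeholder standing in for real work

// Print in the footer: this measures server-side generation only,
// not the time to download images, CSS, or JavaScript.
$elapsed = microtime(true) - $start;
printf("<!-- page generated in %.3f s -->\n", $elapsed);
```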



 1:51 am on Feb 11, 2010 (gmt 0)

Use Google Webmaster Tools' Site Performance page to see how your site stacks up against other sites. It tells you what percentage of sites are faster than yours on average. It might not be perfect, but it is a level benchmark, as it uses the same methodology to measure the page speed of all sites.


 2:02 am on Feb 11, 2010 (gmt 0)

Thanks KenB, I actually reached that tool right after posting this. It is useful indeed.

But if others have any other ideas/tips/comments, feel free to share.

Edit: since Google presumably counts complete page load time there (including all images, etc.), it says 6.7 seconds, and it labels the estimate as high accuracy with 1000 data points.


 2:25 am on Feb 11, 2010 (gmt 0)

If you are getting "high accuracy," that is good. This speed benchmark is based on real users who have the Google Toolbar installed.

Another tip would be to install the Google Page Speed extension for Firefox [code.google.com...]

This extension will help you pinpoint some things you can do to speed up your site.


 2:31 am on Feb 11, 2010 (gmt 0)

A page speed faster than some percentage of all websites is interesting, but is it really all that informative?

I'd rather know how my image gallery pages compare, speed-wise, to other image gallery pages.

The same could be said for my event calendar pages or directory-style pages.

Like to like comparisons seem more informative.


 2:34 am on Feb 11, 2010 (gmt 0)

Use the Page Speed extension to make your page as good as you can and leave it at that. At some point you have to look at how much functionality you are losing to speed up the page. There may be a lot of little things you can do before you get to that point.


 2:51 am on Feb 11, 2010 (gmt 0)

@ken_b: you are right about comparing it to similar sites. If I look at my alexa.com speed it says 3.4 seconds, and for a similar site it says 12 seconds. I win :))

Then yahoo.com's speed is 2.4 (says Alexa); my page has a similar amount of content in the layout (same number of images, etc.), so if I compare 2.4 with my 3.4 I am OK, I think.

While testing with Google's speed tools and that Mozilla plug-in I found some things I hadn't thought about before:
If I do not specify IMG width/height, then after each image finishes loading, the browser has to reflow the page to align the layout properly. Specifying IMG width/height can have an impact on perceived speed.


 3:05 am on Feb 11, 2010 (gmt 0)

Another thing to track is your "crawl stats" under "Diagnostics" in Webmaster Tools. This will show you the average download time of your pages for Googlebot. Tracked over time, it lets you see how scripting changes impact page loading. This can be helpful in reducing processing time for pages, which is not only good for users but also nicer to your server.


 3:24 am on Feb 11, 2010 (gmt 0)

The "crawl stats" page says time spent downloading a page is:
1.39 sec (High)
0.68 sec (Average)
0.39 sec (Low)
So a big difference from the other figure (6.7 seconds). Googlebot's speed doesn't include images, right?!

On that page I also see that Googlebot generates over 5GB of monthly traffic.


 3:30 am on Feb 11, 2010 (gmt 0)

Googlebot's time spent downloading a page covers just the HTML files, nothing else. The graph is the most valuable part, as it lets you see trends better. An average of 0.68 seconds is high; ideally you'd get your average times down to less than 400ms (where your low is now).


 4:24 pm on Feb 11, 2010 (gmt 0)

adrian, have you benchmarked your app at all?

The times you're getting for running the PHP script strike me as pretty high. Three seconds is an awfully long time if it's the only request on the server.

You can use Xdebug/CacheGrind to find out where the bottlenecks are.

I did this on one site and found that a mere 6 calls to the PHP function getimagesize() were causing a huge slowdown on my front page. I was only using it to get the image dimensions for the HTML height and width attributes, so I just axed that and cut page-generation time by more than half.

All that just to say I was absolutely shocked to find out how resource-intensive this function was. I would never have guessed it.

Also, if my pages were regularly taking 3 seconds to run the script, I would seriously consider static caching.

Most CMSes have some caching plugin, but I've also rolled my own very easily: look for a cached version, and if it's not found or is beyond a certain age, use PHP output buffering to grab the whole page and cache it.
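A minimal sketch of that hand-rolled approach (the cache directory and 600-second lifetime are my assumptions, not a fixed recipe):

```php
<?php
// Hand-rolled static cache via output buffering.
$cacheDir  = sys_get_temp_dir() . '/pagecache';
$cacheFile = $cacheDir . '/' . md5($_SERVER['REQUEST_URI'] ?? '/') . '.html';

// If a fresh-enough cached copy exists, serve it and stop.
if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < 600) {
    readfile($cacheFile);
    exit;
}

ob_start();                            // capture everything the page echoes

// ... normal page generation (queries, templates) goes here ...
echo '<p>Hello from the expensive page.</p>';

$html = ob_get_contents();
ob_end_flush();                        // still send the page to the visitor

// Store the rendered page for the next request.
if (is_dir($cacheDir) || mkdir($cacheDir, 0777, true)) {
    file_put_contents($cacheFile, $html);
}
```

On the next hit within the lifetime, the script exits after `readfile()` without touching the database at all.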


 5:45 pm on Feb 11, 2010 (gmt 0)

The CMS I use is PHP, written by me from scratch, so I have no static cache yet. But it would not be impossible to add. I will think about it after I optimize everything else I can (like "parallelize downloads across hostnames" for the ~20 JPG images on all my pages).
Inserting MySQL records with INSERT DELAYED also helps me; I just found out about that one too.

Now that you mention getimagesize(): I do have that function on all my "listing details" pages :) So I will consider saving the image size in the database instead of calling the function every time, and I will measure the time that function needs on some larger images. I also use it on SWF files.
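In the meantime, even memoizing within a single request avoids repeat calls for the same file; the longer-term fix is storing width/height in the database at upload time. A sketch (the function name is mine, not from the thread):

```php
<?php
// Memoize getimagesize() so each file is measured at most once per request.
function cached_image_size($path) {
    static $cache = array();
    if (!isset($cache[$path])) {
        $cache[$path] = getimagesize($path); // the expensive call, done once
    }
    return $cache[$path];
}

// Demo with a tiny 1x1 transparent GIF written to a temp file:
$file = tempnam(sys_get_temp_dir(), 'img');
file_put_contents($file, base64_decode(
    'R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7'));
list($w, $h) = cached_image_size($file);
echo "<img src=\"example.gif\" width=\"$w\" height=\"$h\" alt=\"\">\n";
```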

Question: I have some MySQL queries after <html> and <body>; this is bad, right? I haven't timed this one yet (counting milliseconds).

Will look into Xdebug/CacheGrind too.

Either way, the bottom line is that with what I've found in the last two days, I see there is much room for improving performance/speed.


 7:00 pm on Feb 11, 2010 (gmt 0)

Thanks for the suggestion about xDebug/CacheGrind. I'm working on configuring these at this moment. Maybe I'll find some stuff I can focus on cleaning up. :)


 11:11 pm on Feb 11, 2010 (gmt 0)

KenB - you probably don't want to run Xdebug on a production server. It slows the server down a lot because it has to write data on every single function call.

You might need to bump up your script execution timeout limits and the memory available to PHP as well.

Still, you'll get more data than you can imagine, and yet easy to view and understand. Quite enlightening.

adrian - look into using output buffering in PHP to roll your own simple cache. At the very least, I think it's a good thing to have set up for your most popular pages.


 12:17 am on Feb 12, 2010 (gmt 0)

I'm running it on my laptop, which houses a complete mirror of my production server. It isn't a precise match for what the production server experiences, but the profile option is letting me find some bottlenecks and learn a few tricks to make my code more efficient in the future.

xDebug is a very useful tool.


 3:47 am on Feb 12, 2010 (gmt 0)

Yeah, exactly. There are some functions in PHP that seem to behave differently on Windows and Linux because they depend on native *nix functions, but even between very different setups it's still really useful.

If you mirror the OS, server, PHP + MySQL versions, even if the hardware is quite different, it gives a very good relative sense of what's eating up CPU cycles.

Pretty amazing the sheer quantity of data it gives you and the ability to view it in so many different ways. I haven't actually fired it up in a while, but I have a slow Drupal site I've been thinking needs some profiling.


 4:14 am on Feb 12, 2010 (gmt 0)

I'm only mirroring the Apache, PHP and MySQL versions; the server is FreeBSD and my laptop is WinXP. It isn't a perfect mirror, but it is the best I can do. It did help me uncover all kinds of inefficiencies, like putting a trim() or count() in the wrong place so they were called repeatedly when, by moving them, they could be called once. For example:


while ($i < count($menu)) { $i++; }

versus:

$menuCount = count($menu);
while ($i < $menuCount) { $i++; }

Apparently, with the first version the count has to be recalculated on every loop iteration. Setting the count to a variable and then referencing the variable in the while condition lets the count be done only once. It makes sense when you stop and think about it, but it hadn't occurred to me.
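The difference is easy to measure with microtime(). Note that PHP's count() on an array is itself cheap (arrays store their size), so the overhead is mostly the repeated function call and the gap may be small on a fast machine:

```php
<?php
// Compare count() in the loop condition versus hoisting it into a variable.
// Absolute numbers vary by machine; only the relative difference matters.
$menu = range(1, 1000);

$t = microtime(true);
$i = 0;
while ($i < count($menu)) { $i++; }   // count() called on every iteration
$inLoop = microtime(true) - $t;

$t = microtime(true);
$i = 0;
$menuCount = count($menu);            // evaluated once, outside the loop
while ($i < $menuCount) { $i++; }
$hoisted = microtime(true) - $t;

printf("in-loop: %.6f s, hoisted: %.6f s\n", $inLoop, $hoisted);
```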


 6:56 am on Feb 12, 2010 (gmt 0)

>>count has to be recalculated on every loop.

A classic one! I also quit doing that after the first time I saw my Xdebug output. It's shocking how often you'll see that, and worse, in the examples in programming books.

On the flip side, back then all kinds of people were talking about single quotes being more efficient than double quotes, and when I benchmarked that [webmasterworld.com] I realized the differences are mostly minuscule.


 12:50 pm on Feb 12, 2010 (gmt 0)

Another one that can add up to real time consumption if done hundreds of times is $A = $B[$i][$s]. In my while loops I was copying the array values for iteration $i into plain variables to make my code more readable; doing this six times per loop over the course of 160+ iterations actually ate up about 4ms more than just using $B[$i][$s] in my code directly.

The lesson from that one is to watch how you copy values around. Each copy may be fast, but even pennies can add up to real money if you have enough of them.


 1:12 pm on Feb 12, 2010 (gmt 0)

I am stuck on something.
I start the timer right at the beginning; then I have the MySQL queries in header.php, and right before <html> I print the counter and it's only 20-30ms. Then I have static HTML, and when I print the counter again it's 300-400ms.
If I move the counter print up and down within the static HTML content, higher up it's 30ms and lower down it's 400ms.
What kind of code could cause something like that?
This doesn't seem right; is this normal behaviour?

I looked at the debug tools, but I understand I need to install and configure PHP locally. Oops.


 7:51 pm on Feb 12, 2010 (gmt 0)


Installing PHP locally is dead easy. If you're on Windows, you can use the WAMP package. On Windows, Mac, Linux or Solaris you can use the Apache Friends XAMPP package (more complete - it includes Perl, Sendmail, FTP and more). This gets you up and testing in no time.

As for your other question, I'm not sure. When you reach the <html> tag, is your page fully generated, or are you at that point still running it through a template, substituting variables and so on?

A priori, I would guess that you're grabbing part of your page via a database query and that's taking time, but it could be anything. Like I said earlier, a handful of calls to getimagesize() were bringing my server to its knees just to generate a single page.


 10:27 pm on Feb 12, 2010 (gmt 0)

Another benchmarking tool that may help you measure site-speed issues is a waterfall chart on the browser side, showing how long the page's object requests take.
Such charts are available in the Safari Web Inspector and the Opera developer tools.


 10:34 pm on Feb 12, 2010 (gmt 0)

If I cannot fix this I will try installing PHP locally and the debug tools...

About that delay...
The delay happens even with plain text. That is what I find strange; that shouldn't happen, right?
I am moving my last counter before and after some plain <h2> and <p> tags, and there is a 300-400ms difference.
E.g.: 34ms and 400ms.

But OK, it's probably hard for anyone to figure out what is happening here with my code :)


 3:30 am on Feb 13, 2010 (gmt 0)

I ended up testing in an empty PHP file; can someone confirm whether this is normal?

In a blank PHP page, I start a timer with microtime(), then simply echo a <p> tag (over 16KB of text), then I stop the timer and print the time. It shows over 300ms, as if it needed to "download" that text.

I thought it should simply show zero, because PHP doesn't "work" on anything?!
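One possible explanation (my assumption, not confirmed in the thread): without output buffering, a large echo can block while PHP flushes the bytes toward the client, so the timer measures transfer as well as processing. Wrapping the output in a buffer separates the two:

```php
<?php
// Separate generation time from output-flush time.
$big = str_repeat('x', 16 * 1024);   // ~16KB of text, like the <p> in question

ob_start();
$t = microtime(true);
echo '<p>', $big, '</p>';            // written into the buffer: no network I/O
$generate = microtime(true) - $t;

$t = microtime(true);
ob_end_flush();                      // the bytes actually leave PHP here
$flush = microtime(true) - $t;

// $generate should now be near zero; any transfer cost lands in $flush.
printf("<!-- generate: %.4f s, flush: %.4f s -->\n", $generate, $flush);
```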


 11:33 am on Feb 15, 2010 (gmt 0)

If you use URL rewriting, there's a speedup to be gained by optimising the regex patterns in the rules, adjusting the order of the conditions within a rule, and maybe even adjusting the order of the rules. If the rule conditions use 'file exists' and/or 'directory exists' checks, there is a huge gain to be had by coding these correctly (Hint: the standard .htaccess rules supplied with Wordpress, Joomla, et al. are about as inefficient as you could ever get). See WebmasterWorld's Apache forum for more detailed discussions of these topics.
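As a rough illustration of the 'file exists' point (a common mod_rewrite pattern, not the exact rules discussed in that forum): doing the disk checks once, in a dedicated pass-through rule, avoids repeating them for every rule that follows.

```apache
RewriteEngine On

# Do the expensive disk checks once: real files and directories
# are passed through immediately and no later rules are evaluated.
RewriteCond %{REQUEST_FILENAME} -f [OR]
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^ - [L]

# Everything else is a "virtual" URL; send it to the front controller.
RewriteRule ^ /index.php [L]
```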


 11:54 am on Feb 15, 2010 (gmt 0)

Use Google Webmaster Tools Site Performance page

I looked it over a while back and found it funny that it suggested I remove Analytics, calling it a "3rd party" something or other, as well as AdSense, also because it was a "3rd party" something or other.


 12:14 pm on Feb 15, 2010 (gmt 0)

To those considering a static cache: I have had great results using a reverse proxy (specifically, Apache's mod_cache) to serve up the 'static dynamic' content. The savings in server resources and loading times are huge, at the cost of some minor freshness issues. Using mod_cache, I managed to slash two to three hundred milliseconds off my load time.

Of course, you wouldn't want to do this on frequently updated sites, but the benefits for more static, content-heavy sites can be enormous.


 1:16 pm on Feb 15, 2010 (gmt 0)

I aim for < 300ms processing time on the server. Hardware, efficient database design, caching and code optimisation are all important.

Client side optimisation can also make a big difference for users. I've just done a new website for a client that is larger in download size than their old one (due to image-heavy rebranding) but faster to download for most visitors (big cut in number of HTTP requests).

The YSlow and Page Speed Firefox extensions are essential. Steve Souders' books and his blog are also highly recommended.

Make it as fast as possible. Then make it faster some more :-)


 1:36 pm on Feb 15, 2010 (gmt 0)

The answer is: as fast as possible. You mention old posts, but the importance of page load time has not changed, ever. Fast page load is just as important today as it was in 1997. People still get dial-up speeds, even on broadband. Anybody who has ever worked at a big company knows there are times when the Internet is very slow. People at home might have family members who are downloading movies or music.


Watch the video of Marissa Mayer where she goes into detail about how much reducing load time improved the bounce rate. Web pages should be under 100k.


 1:48 pm on Feb 15, 2010 (gmt 0)

BTW: my major speed hogs are Google's products... I don't see why they can't selectively gzip Analytics and AdSense.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved