Speeding Up Your Site - best practices for the front end
a Google video from Yahoo's Steve Souders
tedster




msg:3547253
 6:07 pm on Jan 13, 2008 (gmt 0)

I highly recommend investing an hour in watching this video: High Performance Web Sites and YSlow [video.google.com]. It's part of the Google Tech Talks series, and this one features Yahoo's Steve Souders. Steve currently holds the job of "Chief Performance Yahoo!" (Note: The sound is uneven at the beginning of the video, but that gets corrected early on.)

In this research-based talk, Steve does not look at database efficiency or other back end improvements. He focuses instead on the front end, the user's experience within their browser. His research shows that, by far, the front end is the main area where significant website speed gains can be had.

His list gives us 14 best practices, culled from his research (a sample Apache configuration for two of them follows the list):

1. Make Fewer HTTP Requests
2. Use a Content Delivery Network
3. Add an Expires Header
4. Gzip Components
5. Put Stylesheets at the Top
6. Put Scripts at the Bottom
7. Avoid CSS Expressions
8. Make JavaScript and CSS External
9. Reduce DNS Lookups
10. Minify JavaScript
11. Avoid Redirects
12. Remove Duplicate Scripts
13. Configure ETags
14. Make Ajax Cacheable
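
To make rules 3 and 4 concrete: on an Apache server, both usually come down to a few lines of configuration. The directives below are a minimal sketch, assuming Apache 2.x with mod_expires and mod_deflate enabled; the one-year lifetime is an arbitrary example, not a value Souders prescribes, and it only makes sense if you rename files whenever you change them.

    # Rule 3: far-future Expires headers (requires mod_expires)
    ExpiresActive On
    ExpiresByType image/gif "access plus 1 year"
    ExpiresByType image/png "access plus 1 year"
    ExpiresByType text/css "access plus 1 year"
    ExpiresByType application/x-javascript "access plus 1 year"

    # Rule 4: gzip the text components (requires mod_deflate)
    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript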

There are some great tips here in the many details and side comments he makes, such as using multiple host names to allow browsers to do more parallel downloads. Another key point is that the IE and Firefox browsers will stall or block other downloads and executions whenever they are downloading any javascript file. Opera is a bit better, and will continue to download image files in parallel. But even Opera will not do a parallel download of any other script.
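
As a rough illustration of both points, here is an HTML sketch - the two hostnames (static1.example.com, static2.example.com) are hypothetical placeholders, not anything Souders specifies. Images are split across the two hosts so the browser can open more parallel connections, and the script is moved to the end of the body (rule 6) so it cannot block the downloads above it:

    <!-- stylesheet at the top (rule 5) -->
    <link rel="stylesheet" type="text/css" href="http://static1.example.com/css/site.css">
    ...
    <!-- images spread over two hostnames for more parallel downloads -->
    <img src="http://static1.example.com/img/logo.png" alt="logo">
    <img src="http://static2.example.com/img/photo.jpg" alt="photo">

    <!-- script last (rule 6), so it cannot stall anything above it -->
    <script type="text/javascript" src="http://static1.example.com/js/site.js"></script>
    </body>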

Souders has integrated this information into a Yahoo tool called YSlow [developer.yahoo.com], an extension of the Firebug add-on to Firefox. He has also published a book with O'Reilly about all these goodies, called "High Performance Web Sites".

Sometimes you hear someone talk and just know that they've "got the goods." Steve Souders has definitely got the goods.

[edited by: tedster at 10:16 pm (utc) on Jan. 13, 2008]

 

trinorthlighting




msg:3547301
 7:46 pm on Jan 13, 2008 (gmt 0)

This is great information, tedster - thank you for the post!

g1smd




msg:3547303
 7:53 pm on Jan 13, 2008 (gmt 0)

Interesting tool.

I'm not 100% sure that I agree with all of the findings for the site I just tested it against, but it does give a lot of food for thought in improving design methods. There is one change that I am making immediately, one that I had long forgotten.

.

I think I found a minor bug or two.

I have a main CSS file which further imports several other CSS files. They are all reported as being outside the document <head>, but they are not (the report seems to suggest they are in the <body> or some such).

On the page listing the objects that don't have an expires header, or one that isn't set far enough in the future, the date format is the default US m/dd/yyyy style and not the one that I have selected in the main Windows options.

.

I am not sure why this is listed as the second object in the list: http://www.domain.eu/robots.txt#resize_iframe%26remote_iframe_0%26102$.

I am going to have to look up what "minify" a JS file means.

Pfffft. I got an "F".

I must check why GZIP isn't on for this site, and set up the expires headers correctly.

madmatt69




msg:3547309
 8:17 pm on Jan 13, 2008 (gmt 0)

I made a post about this a month or so ago - after implementing all those changes, my crawl rate went much higher.

If you look in Google Webmaster Tools and check Tools > Set Crawl Rate, you'll see some interesting graphs, including time spent downloading a page.

Before the changes I had huge spikes and a very erratic graph - now the graph shows the average time cut in half, and almost totally steady.

AdSense and other revenue went up, and traffic as well.

Needless to say, implementing some of those changes was probably the best thing I did for my site all year. Take some time to read the post or watch the video and work on it. Some of the changes only take a few minutes to do.

encyclo




msg:3547333
 9:48 pm on Jan 13, 2008 (gmt 0)

If watching the whole video is too much (I would recommend it though), there is an excellent text summary of the contents of the video available on the Yahoo Developer Network site:

  • Best Practices for Speeding Up Your Web Site [developer.yahoo.com]

    Seeing as he mentioned it, I would recommend madmatt69's thread 20% gain in adsense income after speeding up site [webmasterworld.com] which has some good tips for increasing speed with a PHP-driven site.

    I am going to have to look up what "minify" a JS file means

In this context, the author is suggesting reducing the JavaScript file size to a minimum:

    Minification is the practice of removing unnecessary characters from code to reduce its size thereby improving load times. When code is minified all comments are removed, as well as unneeded white space characters (space, newline, and tab).
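
For instance, here is a hedged before/after sketch (a made-up function, not an example from the video) showing what a minifier leaves behind:

    // Before: comments and whitespace intact
    function addTax(price, rate) {
        // rate is a fraction, e.g. 0.08 for an 8% tax
        var total = price * (1 + rate);
        return total;
    }

    // After: the same code, minified - roughly half the bytes
    function addTax(price,rate){var total=price*(1+rate);return total}

Note that the variable names survive; renaming those as well is obfuscation, a more aggressive (and riskier) practice.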

g1smd




    msg:3547339
     10:02 pm on Jan 13, 2008 (gmt 0)

    Rule 14 needs to be used with care.

    The example uses an extra parameter on a URL to show the date the information was last changed. This could potentially lead to duplicate content issues if those URLs should ever be indexed.

    .

    Although eTags are set up for all files, I got an F for that too. Not sure why.

    tedster




    msg:3547344
     10:14 pm on Jan 13, 2008 (gmt 0)

Agreed, g1 - in fact, ALL of these points need to be treated as suggestions only. Implement them with intelligence, and only where they make real sense for a specific website. But I must confess that I never even think about many of these issues a lot of the time, so just raising our awareness is already an important step.

    I've seen cases, for instance, where using a second host for images actually hurt speed - for reasons that were difficult to address in that particular configuration. But just knowing about the issue and the options is a good thing. Getting a 40% improvement in website front end speed can be a major factor in online success, and such improvements are often well within reach.

    tedster




    msg:3547354
     10:26 pm on Jan 13, 2008 (gmt 0)

    Another point - in the video, Souders is not talking about changing the visual design at all but only working within how a given design is coded. But if you have the chance to work directly with designers, informing them of the constraints certain design elements force on a site, you can often get even bigger gains.

When a responsible designer understands that their design is more than commercial art, and can directly influence business success from a technical direction, they are often happy to contribute to the overall achievement. I had one such conversation with a designer, and the next version of the site abandoned such visual frills as rounded corners, gratuitous gradients and the like. This made a real difference in load times and site stats overall, and the site still looked really sharp.

    (Interesting side note - on January 7, 2008
    Steve Souders left Yahoo for Google [blog.wired.com].)

    g1smd




    msg:3547366
     10:48 pm on Jan 13, 2008 (gmt 0)

I need to go read more stuff. One site recommends turning off eTags, not adding them.

    I'm getting into some areas of Apache that I haven't meddled with before.

    [edited by: tedster at 3:53 pm (utc) on Jan. 14, 2008]

    Habtom




    msg:3547623
     11:06 am on Jan 14, 2008 (gmt 0)

I am not sure if this point is covered here, but removing all the line returns from the HTML source code reduces the size of the page by a significant amount. The HTML loses its coding structure and becomes barely readable, but it works surprisingly well for cutting the size down.

    lammert




    msg:3547630
     11:15 am on Jan 14, 2008 (gmt 0)

It's more or less covered by minifying JavaScript and gzipping all content. The nice thing about this video is that it covers points with a huge impact on client-side speed that are not normally thought of - things like IE and Firefox stopping image downloads while they download a JavaScript file (only Opera seems to handle this correctly), and the sequential loading and execution of JavaScript, which (in IE) even totally blocks further page rendering until it is finished.

    mifi601




    msg:3547772
     2:57 pm on Jan 14, 2008 (gmt 0)

A big issue on all my pages is 'get a CDN' - now at what level is that really feasible? Are any of you using a CDN - what are the costs?

    tedster




    msg:3547837
     4:07 pm on Jan 14, 2008 (gmt 0)

    g1smd wrote: "One site recommends turning off eTags not adding them."

    The Yahoo help page also recommends turning off eTags in some situations, but not in others. The wording in the rule - "configure eTags" - is a bit ambiguous and further reading illustrates why.

    If you host your web site on just one server, this isn't a problem. But if you have multiple servers hosting your web site, and you're using Apache or IIS with the default ETag configuration, your users are getting slower pages, your servers have a higher load, you're consuming greater bandwidth, and proxies aren't caching your content efficiently.

    If you're not taking advantage of the flexible validation model that ETags provide, it's better to just remove the ETag altogether.

    [developer.yahoo.com...]

    Also, using eTags does give you a lower "grade" with the YSlow tool.
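
For anyone who concludes that removal is the right choice, on Apache it usually comes down to two directives - a minimal sketch, assuming mod_headers is available:

    # Stop Apache generating ETags at all
    FileETag None

    # Belt and braces: strip any ETag header that still gets set (requires mod_headers)
    Header unset ETag

That is the "remove it altogether" route; if you do rely on ETag validation across a server cluster, the alternative is configuring FileETag to exclude the inode component, which is the part that differs between servers.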

    chewy




    msg:3547935
     5:51 pm on Jan 14, 2008 (gmt 0)

this is excellent - but, forgive me: since I'm focused on conversion (visitors / action), and since virtually everything I've learned about SEO (here and elsewhere) focuses on increasing the numerator -- where is the thread about best practices for the denominator?

    pageoneresults




    msg:3547940
     5:55 pm on Jan 14, 2008 (gmt 0)

1. Make Fewer HTTP Requests

Here are some impressive statistics...

Google - 2 HTTP requests at 14,438 bytes total.

WebmasterWorld - 4 HTTP requests at 59,139 bytes total.

Live - 5 HTTP requests at 22,640 bytes total.

And, not so impressive?

Yahoo! - 77 HTTP requests at 353,113 bytes total.

CNN - 262 HTTP requests at 712,377 bytes total.

For the CNN site, 175 of those requests are CSS background images.

    incrediBILL




    msg:3547972
     6:23 pm on Jan 14, 2008 (gmt 0)

    He focuses instead on the front end, the user's experience within their browser.

    Then Steve should be FIRED because he's failing miserably.

    They slowed down the Yahoo Movie site to the point my older laptop now kicks up an error asking if I want to terminate the javascript on the page due to it running too long.

    Likewise the new Yahoo Mail is slower than hell, reverted back to the original format, so on and so forth.

    Didn't say the changes were neat, but they quickly made a still useful old Thinkpad next to useless where Yahoo is concerned.

    Yup, I'll sit right down and waste an hour learning how I too can mess up my site the same way.

    Thanks for the tip Tedster! ;)

    ergophobe




    msg:3548201
     11:07 pm on Jan 14, 2008 (gmt 0)

Actually Bill, Souders says several times that he doesn't get involved in design, including decisions on functionality - his goal is to serve a given design as fast as possible.

Some designs may have incredibly bloated JavaScript that could be made more efficient, but that part of efficiency is NOT within his purview. His is more a question of "given that you have to serve up this bloated page that you have no control over, how do you do it fast?"

You shouldn't take Yahoo's abysmal speed as an indicator of Souders' expertise.

    8kobe




    msg:3548231
     11:59 pm on Jan 14, 2008 (gmt 0)

I have been using YSlow for a while now. I would say it is accurate about 99% of the time, though I have hit a couple of sites that it didn't do a good job on.

    g1smd




    msg:3548244
     12:23 am on Jan 15, 2008 (gmt 0)

    Yeah, I'm confused about eTags. The tool says:

    These URLs have an eTag:

    ...long list...

    Score: F

The only URLs that do not have the tag are those for the various parts of the Google Custom Search Engine code embedded on that page.

    incrediBILL




    msg:3548343
     4:28 am on Jan 15, 2008 (gmt 0)

    his goal is to serve a given design as fast possible.

    So he serves up something so fast that it can start slowing you down even faster.

    Nice.

    Josefu




    msg:3548470
     10:19 am on Jan 15, 2008 (gmt 0)

    Thanks for the link, interesting topic and an introduction to two new useful tools!

Of course not all of these suggestions can be implemented on all websites - but a few of them (the more obvious ones) can help in almost all cases: adding "expires" headers and cutting the comments out of JavaScript files... I wonder if doing the same to PHP files would help? Of course this means keeping a duplicate, commented "working file" locally so that I can keep track of what all my script does...

My biggest problem is with IE's caching of Ajax HTTP requests... for the time being I am obliged to add a "dummy" timestamp variable to catalog page requests to ensure that any updates get shown instead of the old version. No can-do for any improvement there.
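
For readers who haven't used that trick: appending the current time as a throwaway query parameter makes every request URL unique, so IE's cache never matches. A minimal sketch with a hypothetical catalog URL (older IE would need the ActiveXObject fallback, omitted here):

    // Cache-buster: the ts parameter changes on every call, defeating IE's Ajax cache
    var url = '/catalog/list.html?ts=' + new Date().getTime();
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            document.getElementById('catalog').innerHTML = xhr.responseText;
        }
    };
    xhr.open('GET', url, true);
    xhr.send(null);

Note that this deliberately trades away rule 14 (Make Ajax Cacheable) for correctness - the opposite tradeoff, a parameter that only changes when the data changes, is what Souders' rule suggests.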

    lammert




    msg:3548487
     11:00 am on Jan 15, 2008 (gmt 0)

Removing the comments from a PHP file will increase the parse speed somewhat, but only slightly. Skipping over a comment block between /* and */ takes much less time than parsing the same number of bytes containing variables and functions that have to be looked up, syntax-checked, etc.

According to the video, page generation (the PHP part) takes on average only five percent of the total time the visitor has to wait before the whole page is rendered on his screen. So removing the comments may reduce that 5% to maybe 4.9%.

Reducing JavaScript size has much more effect, because the size determines not only how much time the browser needs to parse it, but also how many bytes/packets have to be sent over the connection before parsing can even start.

    Chico_Loco




    msg:3548693
     3:02 pm on Jan 15, 2008 (gmt 0)

Respectfully... Yahoo are hardly the best at practicing what they preach. Neither are Google, mind you, but Google Senior Engineer Aaron Hopkins also has an interesting piece called Optimizing Page Load Time [die.net], which has been around for a long time. It mentions a lot of the above, along with graphs showing the performance gains from using multiple hostnames.

    AhmedF




    msg:3548732
     3:44 pm on Jan 15, 2008 (gmt 0)

    I am surprised no one mentioned CSS sprites.

    Definitely should check that out.

    ergophobe




    msg:3548824
     5:05 pm on Jan 15, 2008 (gmt 0)

    People are really getting diverted by how they feel about Yahoo! and discounting the video without even watching it because of that. There is some excellent research-based information in there. If you can't be bothered to watch the video, please don't be bothered to pass judgement on it.

    >>no one mentioned CSS sprites.

    #1. Reduce HTTP Requests. He specifically talks about sprites.
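
For anyone new to the technique: a sprite combines many small images into one file, so a page that would have made dozens of image requests makes just one. A minimal CSS sketch, with a hypothetical icons.png holding two 16x16 icons side by side:

    /* one combined image, shifted into view per icon */
    .icon { width: 16px; height: 16px; background: url(/img/icons.png) no-repeat; }
    .icon-home { background-position: 0 0; } /* first tile */
    .icon-search { background-position: -16px 0; } /* second tile, 16px across */

CNN's 175 CSS background images, mentioned above, are exactly the kind of list a sprite is meant to collapse.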

    AhmedF




    msg:3548842
     5:15 pm on Jan 15, 2008 (gmt 0)

    Sorry I meant in the discussion in the post :)

    Josefu




    msg:3549097
     9:36 pm on Jan 15, 2008 (gmt 0)

    About spreading content over multiple domains... a few questions.

    First off, what most concerned me was the "order of load" bit - this is not essential to many of my websites, but it would seem logical to keep all the (generated) html on the same domain to avoid conflict.

Based on the above question, would it be feasible to have the HTML on one server and, say, the images and swf content on another? Would putting JavaScript requests on another domain as well create possible "load order" conflicts? How about CSS? I could well imagine spreading things out like that if it sped things up.

    Lastly, what exactly does he mean by "domains"? Would sub-domains suffice as "alternate domains"?

    g1smd




    msg:3549109
     9:46 pm on Jan 15, 2008 (gmt 0)

Yeah, I'm wondering about moving the images to images.domain.com, what effect that has on things, and how much that depends on whether the subdomain is on the same server or on a different one.

    madmatt69




    msg:3549128
     10:20 pm on Jan 15, 2008 (gmt 0)

I've experimented with that a little and haven't really noticed any difference - sometimes it's faster, sometimes it's slower. But I probably don't have enough traffic volume to really gauge whether it's helping or not.

    Chico_Loco




    msg:3549145
     10:43 pm on Jan 15, 2008 (gmt 0)

    g1smd,

From what I understand, the optimal way to serve images is indeed to have them served by a different server, but with certain prerequisites. For the most part, the real advantage comes from being able to use a stripped-down, super-light HTTP server for the static content (your images, CSS, JS files, etc.). That way, the HTTP server can be compiled without support for PHP, MySQL, Perl, mod_rewrite, etc., which gives it a much smaller memory footprint and a quick load time. Naturally, your main web server retains such functionality. And this is all in addition to the browser's parallel-loading advantage!

There are a few issues with having an images.domain.tld static file server, though. If you have a single box running both your main HTTP server and your optimized static file server, you can't run both of them on port 80 on the same IP. One option is to move your static file server to another port, e.g. 81. This is obviously a pain, because your images will need to be referenced as images.domain.tld:81..., which can cause logistics problems down the road.

The better way to do it is to have a separate box (or the same box with two unique IP addresses). That way you can map your main domain to one IP and the static subdomain to the other, and both can run on port 80!

It's quite an interesting subject that I've spent a bit of time on... To be honest, though, I wouldn't recommend doing any of this unless you have servers under heavy load combined with an obvious problem in page-load times; the speed increase is very slight in the majority of cases. It really only matters with very large pages and very high traffic levels. Hope this helps.
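
As a hedged illustration of the two-IP approach: if both ends happened to be Apache (in practice the static side would more likely be a lighter server such as lighttpd or thttpd), the mapping might look roughly like this, with hypothetical IPs and hostnames:

    # main dynamic site on the first IP
    <VirtualHost 192.0.2.10:80>
        ServerName www.example.com
        DocumentRoot /var/www/main
        # PHP, mod_rewrite, etc. stay on this host
    </VirtualHost>

    # stripped-down static file host on the second IP
    <VirtualHost 192.0.2.11:80>
        ServerName images.example.com
        DocumentRoot /var/www/static
        # no scripting modules needed - just fast file serving
    </VirtualHost>

Both answer on port 80 because each ServerName resolves, in DNS, to its own address.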
