| 4:36 am on Jul 30, 2010 (gmt 0)|
As you can probably tell, I think graphic compression is also low-hanging fruit. It's a rare GIF file that needs 256 colors (on small images, 16 can look amazing); it's a rare online JPEG that needs a quality setting above 40%; and only PNG files that truly need an alpha channel should be saved at 24-bit.
I'm amazed at how graphics are abused online. Photoshop's option is called "Save for Web" - which, by the way, also strips some of the embedded XML metadata and trims the file even further - so remember the "web" part.
Has anyone got any other low-hanging fruit for improving site speed?
| 5:02 am on Jul 30, 2010 (gmt 0)|
Set Expires and Cache-Control headers for jpg/gif/js/css/ico/png to 1 year; if you set them shorter, say 1 week, Internet Explorer sends If-Modified-Since on every single request after that week. You won't notice the difference on a low-latency connection, but the whole page flickers on high-latency connections.
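For Apache, a one-year policy like this can be set with mod_expires - a sketch only, assuming the module is enabled; adjust the MIME types to match what your site actually serves:

```apache
# .htaccess sketch: far-future expiry for static assets (mod_expires assumed).
# mod_expires also emits a matching Cache-Control: max-age header.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/jpeg             "access plus 1 year"
    ExpiresByType image/gif              "access plus 1 year"
    ExpiresByType image/png              "access plus 1 year"
    ExpiresByType image/x-icon           "access plus 1 year"
    ExpiresByType text/css               "access plus 1 year"
    ExpiresByType application/javascript "access plus 1 year"
</IfModule>
```

The catch with a one-year expiry is that you must rename a file (or version its URL) whenever it changes, since returning visitors won't re-request it.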
| 6:36 am on Jul 30, 2010 (gmt 0)|
Decreasing the number of HTTP requests can make a difference in site performance.
Possible techniques for this include:
- using CSS sprites, where multiple images such as a set of icons or menu items are combined into a single image file and only the required part of the image is displayed where necessary.
This will be a tradeoff with software packaging issues, reusability, configuration management, version control, etc.
Another important factor for the user experience is to provide the correct cache control headers, so that subsequently rendered pages don't re-request resources that have already been downloaded - such as the images, favicons, style sheets or scripts that are common across multiple pages on a site, or even when the same page is requested again.
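A sprite is just one background image shared by several selectors; each element shows its own slice via background-position. A sketch with made-up file names and offsets:

```css
/* icons.png is assumed to be a single 16x48 strip holding three 16x16 icons,
   stacked vertically. One HTTP request serves all three. */
.icon      { width: 16px; height: 16px; background: url(/img/icons.png) no-repeat; }
.icon-home { background-position: 0 0; }
.icon-mail { background-position: 0 -16px; }
.icon-rss  { background-position: 0 -32px; }
```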
| 7:07 am on Jul 30, 2010 (gmt 0)|
Good one, phranque. And since Site Performance uses toolbar data, those cache-related actions will definitely make a difference. Some of the best site performance tests I've seen make a follow-up request for the same URL, just to check on caching issues.
| 7:07 am on Jul 30, 2010 (gmt 0)|
|If your site speed metric is so bad that you are losing ranking position because of it, then your visitors have already been hating your site anyway. |
That sums it up very well.
| 8:11 am on Jul 30, 2010 (gmt 0)|
Putting static files such as CSS, scripts, graphics and Flash on a cookieless subdomain (or several) increases the number of parallel downloads, making the page render faster.
| 9:26 am on Jul 30, 2010 (gmt 0)|
I submitted my post long after previewing it and just now noticed I had almost the same answer regarding caching as you.
I would like to add a reference to this "Enabling If-Modified-Since Headers" [webmasterworld.com] thread from the Apache forum.
As usual, pay special attention to what jdMorgan writes in that thread.
| 9:51 am on Jul 30, 2010 (gmt 0)|
I couldn't agree more with the above. If you're primarily motivated by anything other than giving your users the best experience possible, then I'd argue that "Site Performance" (as measured by Google) is the least of your problems. I'm completely obsessed with UI design and user experience. Not a single thing is more important than the experience a user has when they arrive on your site.
Something I've been experimenting with recently is preloading content based on the path a user is most likely to take through our site.
Now, this can be as simple or as complicated as you'd like to make it. If your site structure and content are well managed, a user's next click should be predictable with some degree of accuracy - and if you know where your user is going next, you're missing an opportunity to improve usability if you're not getting that content ready for them before they arrive.
OPTIMISED LANDING PAGES
During the roughly 30 seconds the user spends pondering the landing page, I start preloading the master CSS for the site, a small JS library, a CSS sprite and the main image for the significantly fatter linked page. The result is a home page that loads incredibly fast, and the second click from the user pretty much just snaps into place. It's particularly impressive because the image on the second page is usually 850 x 500 and it appears in under a second; the user has no idea they have been preloading our content in bite-size chunks and just assumes the site is quick.
YMMV, this practice works well for us because we use a lot of large glossy images of products that have an emphasis on design, and we spend a lot of time looking at user data so we're pretty clear on what to preload and what not to. Of course now and again you're going to load content the user doesn't need, but it's a small price to pay if you can afford it.
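The approach above can be sketched in plain JavaScript. The page-to-assets map and URLs here are made up for illustration; the actual downloads are triggered by the commented-out browser-only `new Image()` line at the end:

```javascript
// Hypothetical map of pages to the assets they need (illustration only).
const pageAssets = {
  "/products": ["/css/master.css", "/js/lib.js", "/img/sprite.png", "/img/hero.jpg"]
};

// Pick the assets worth preloading for the predicted next page,
// skipping anything the browser has already fetched.
function assetsToPreload(nextPage, alreadyLoaded) {
  return (pageAssets[nextPage] || []).filter(url => !alreadyLoaded.has(url));
}

// In the browser, kick off the fetches while the user ponders the landing page:
//   assetsToPreload("/products", loadedSet).forEach(url => { new Image().src = url; });
```

Requesting an asset through `new Image()` works even for CSS and JS files, because the point is simply to warm the browser cache before the real request arrives.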
| 2:17 pm on Jul 30, 2010 (gmt 0)|
OOOOOhhh this is good - tedster, where can I find specs on "correct cache control headers" that you would recommend reading? I have been working on load speeds from a user experience perspective and it has totally helped in terms of the big G.
| 2:36 pm on Jul 30, 2010 (gmt 0)|
Funny thing is, all of a sudden site owners begin worrying about their site speed, when for the last 10 years one of the most important parts of a site has been how fast you can get it to pop for a visitor.
If they had done their homework before launching the bloated sites we have now, all this would really be a moot subject.
It takes someone - in this case Google - to get the message across, and all of a sudden we have all these sites worrying about how fast they are. Just kind of ironic to me.
| 2:36 pm on Jul 30, 2010 (gmt 0)|
YSlow is still my favorite tool to monitor what I need to do to improve page speed. It goes over almost everything mentioned above and more.
| 2:48 pm on Jul 30, 2010 (gmt 0)|
I've spent quite some time on this. Here are some of the things I've learned the hard way:
Make sure the browser doesn't send any cookies to the CDN or subdomain you are using. It's a common mistake and a quick win.
Each file you download carries about 4KB of header data, so every request you can eliminate saves you an extra few KB.
Use YUI Compressor to minify your CSS and JS files; in my opinion it's the best tool right now.
If possible, send your output buffer to the browser as soon as possible, and keep flushing until your page is fully loaded.
If you really want to get the max out of it, buy the book "Building Faster Websites".
| 4:30 pm on Jul 30, 2010 (gmt 0)|
One very easy one, and often missed I think, is to use www. for "needs cookies" and put everything else on non-www without cookies. The browser sends any cookie(s) with every request for every file because it doesn't know any better, so if you set your cookies for www. and move everything else to non-www, the browser only sends the cookie(s) when they're actually needed to make the site operate. (Upstream is the slowest part of the connection: connections are optimized for downloading, and uploading is always a slower stream than down.)
NOTE: This DOES NOT WORK the other way around! You cannot set your cookie on non-www and then serve cookieless files from www., because a cookie for non-www is sent by default to any subdomain, even if you try to say otherwise by setting it for example.com (without the preceding .example.com).
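The asymmetry comes from how the cookie Domain attribute is scoped - illustrative Set-Cookie headers, with example.com standing in for your domain:

```
# A cookie with a Domain attribute is sent to that domain AND all of its
# subdomains (browsers treat Domain=example.com as .example.com):
Set-Cookie: session=abc123; Domain=example.com; Path=/
#   -> sent to example.com, www.example.com, static.example.com, ...

# Scoping the cookie to www.example.com keeps it off the bare domain:
Set-Cookie: session=abc123; Domain=www.example.com; Path=/
#   -> sent to www.example.com, but not to bare example.com
```

This is why static files on the bare domain (or a sibling subdomain) stay cookieless only when the session cookie is scoped to www.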
| 5:36 pm on Jul 30, 2010 (gmt 0)|
|where can I find specs on this that you would recommend reading. 'correct cache control headers ' |
A good starting place is [developer.yahoo.com...] - this is the documentation for Yahoo's YSlow tool. It's an interesting note that Steve Souders [stevesouders.com] did most of the speed research while he was at Yahoo and then was hired at Google.
| 5:57 pm on Jul 30, 2010 (gmt 0)|
Here's one that's often ignored:
favicon.ico - Do you have one? How big is the file?
I have a favicon.ico in my browser cache that is 290KB (<-- Not a typo). I have many that are over 30KB in size. What a waste of bandwidth...
Mine is less than 1KB (and so is WebmasterWorld's). The original size of mine was 5KB; it took me 20 minutes to find a solution and fix the file size.
| 8:37 pm on Jul 30, 2010 (gmt 0)|
What are some of you seeing when selecting "Set custom crawl rate" under WMT - Site Configuration - Settings? Do high numbers here indicate a site has a problem with speed?
|The crawl rate affects the speed of Googlebot's requests during the crawl process. It has no effect on how often Googlebot crawls your site. Google determines the recommended rate based on the number of pages in your site. |
We allow G to select the crawl rate. Based on the number of pages on our site, it would currently take about 2 months for them to crawl all the pages at the G crawl rate setting.
As previously discussed here:
URL 09 thread on WMT speed stats [webmasterworld.com]
| 2:25 am on Aug 2, 2010 (gmt 0)|
Very good post. I myself see website performance as a metric for user experience much more than for Google ranking.
I mainly use CSS sprites to improve performance, because I am not so sure gzip is safe enough for most users, or most browsers.
BTW, iambic9 provided a very good idea, but I don't know how to set up preloading.
| 3:15 am on Aug 2, 2010 (gmt 0)|
I've gone through and made changes to my site to improve speed. According to the Firefox speed test my site scores an 84, and Google gives it a 90. Webmaster Tools, though, shows that my site is slower than 54% of the sites on the 'net.
How much of the page load time, if any, is affected by the user's connection speed?
| 3:19 am on Aug 2, 2010 (gmt 0)|
The Webmaster Tools report rests heavily on Toolbar data, so if your site has lots of users on slow connections, that might be an influence. Still, I would "hope" that Google sees which toolbars are on slow connections and normalizes that data.
Lots of people have mentioned that the WMT data seems funky to them. It's a good thing that it's not a big part of ranking right now!
| 10:57 pm on Aug 3, 2010 (gmt 0)|
I want to be a web-MASTER, and not a web-lemming.
This is a classic!
| 6:57 pm on Oct 3, 2010 (gmt 0)|
Of course this actually bloats the individual page downloads: all page loads appear larger and slower, and it loads junk into the page that the SEs typically don't need.
Combining into the minimum number of files is the best idea for overall performance.
|PRELOADING - Something I've been experimenting with recently is preloading content based on the path a user is most likely to take through our site. |
When you get too much preloading going on, it can impact overall server performance - not to mention you're just wasting bandwidth based on an assumption.