tedster - 11:42 pm on Jul 29, 2010 (gmt 0) [edited by: tedster at 9:51 pm (utc) on Jul 30, 2010]
There seems to be growing confusion and even mythology around Google and site speed. It's becoming painful to read some of the junk around the web on this topic - and also the stories from people who tried to improve their site speed and hurt their rankings instead.
Here's my take on it all. First, notice that there are two very different reports in Webmaster Tools: Crawl Speed and Site Performance.
1. CRAWL SPEED - under "Diagnostics > Crawl Stats"
This is a report on googlebot's experience requesting URLs from your server. The graph shows "time spent downloading a page" - in milliseconds.
2. SITE PERFORMANCE - under "Labs > Site Performance"
This is a report on the average user's experience rendering pages on your site. The graph is shown with a scale of seconds.
#1 is all about your server - its efficiency, how fast database calls are returned, things like that. #2 is about a whole bunch of other things - everything that affects how long it takes to put the finalized page on a user's monitor.
#1 - your own server
#2 - the visitor's toolbar
Want to improve #1? You may need to move to a better server or hosting service. You may need to optimize your database calls. But there's not a lot more you can do here.
Want to improve #2? There are a whole lot of things you can do, and they were spelled out over a year ago when Google started talking about The Need for Speed [webmasterworld.com].
But looking further, why would we want to do any of this? Is it because Google said speed might be used as a ranking factor? FAIL! I say don't be Google's puppet. I want to be a web-MASTER, not a web-lemming. If Google is using speed at all, it's still a very minor part of the algorithm. In fact, this is something Matt Cutts has reinforced several times. As a straight-up ranking factor, site speed is extremely minor.
Google started all this fuss mostly to put the rendering speed issue back on the table, to begin raising awareness of the issue - and it is an important issue. As broadband spread around the globe, some developers forgot that speed still does matter. Heck, I've seen graphic designers who save every jpg at 100% and every png as 24-bit!
If your site speed metric is so bad that you're losing ranking position because of it, then your visitors have been hating your site all along anyway. It's not just Google's algo that's punishing you - the visitors who do manage to come in are having a sub-par experience.
So I say improve your site speed because you don't want to crap all over your visitors - the same way you'd fix a leaky pipe over the main door of a street store. But don't run willy-nilly making big changes when you don't really understand what you're affecting or why.
The simplest speed improvement many sites can make is to activate gzip. That one change is simple and can make a huge difference with little risk. I'm always looking for the best returns for the least resources spent, and gzip is low-hanging fruit. But you're not likely to find me playing around with ETags in the near future.
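For a sense of why gzip is such low-hanging fruit, here's a minimal Python sketch. The repeated markup string is just a hypothetical stand-in for a real page - the point is that typical HTML (tags, class names, whitespace) is highly repetitive, which is exactly what gzip eats for breakfast:

```python
import gzip

# Stand-in for a typical HTML page: lots of repeated tags and attributes.
html = ("<li class='nav-item'><a href='/page'>Example link</a></li>\n" * 300).encode("utf-8")

compressed = gzip.compress(html)

print(f"original:   {len(html)} bytes")
print(f"compressed: {len(compressed)} bytes")
print(f"ratio:      {len(compressed) / len(html):.1%}")
```

On real pages the savings vary, but text-heavy HTML, CSS, and JavaScript routinely shrink by well over half - which is bandwidth your visitors never have to wait for.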