On other threads various people have voiced the opinion that Google's method of measuring loading speeds is seriously flawed and gives inaccurate results.
It does seem so. Google's measurements depend strongly on their toolbar users "calling home" with the information about how fast the page renders.
At the same time, I have checked quite a few sites at this point and as a rule, if the pages render slowly on my screen they also score poorly in WMT.
I think that it is a crock - the kind of thing that newly minted Computer Science grads come up with when they don't know any better. There are so many variables in content delivery over a network that it is nearly impossible to get a reliable site performance figure. A lot of the delays in site loading seem to be attributable to Google Adsense and Google Analytics.
WMT reports 3.2 seconds for my site. But when I measure over my own connection, it loads in around 1.7 seconds. Adsense and Analytics slow the site down.
I wonder if they've even bothered to exclude Google Adsense and Google Analytics from the calculations? They really should hire this guy:
Apparently he works for Yahoo, but his High Performance Web Sites book is excellent.
What I find quite funny about all this is the number of websites that have shrunk the images on their category pages so much that they might as well not be there. If what Tedster says is correct, then this is a slap in the face for anyone on shared hosting, for small sites, for mom-and-pop operations.
LOL! They did - soon after Yahoo released the YSlow tool, and Steve gave a tech talk at Google. In fact, here's a Google Tech Talk video on browser speed issues [youtube.com] with Steve Souders. It's one of several on YouTube, and they're all worth watching.
Mine dropped from ~3.1 to ~2.4 after removing Analytics.
I blocked analytics in all my browsers years ago. Whenever a site was slow to load - and I mean SLOW - the status bar always said "waiting for google analytics".
It would be interesting to know how many other "real life" site visitors block it, although I suppose most don't know about it.
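For reference, one common way to block Analytics browser-wide is a hosts-file entry that points the tracking host at localhost so the script never loads (a sketch; the file path varies by OS, and hostnames may change over time):

```
# /etc/hosts (Windows: C:\Windows\System32\drivers\etc\hosts)
# Resolve the Analytics hosts to localhost so the tracking script never loads
127.0.0.1  www.google-analytics.com
127.0.0.1  ssl.google-analytics.com
```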
I have an image site. About 2 weeks ago I installed the browser speedtool add-on recommended by Google, and I took the advice to "leverage browser caching". I also scaled some images on the homepage as suggested. I'm starting to see the graph in WMT level out at 2.6 sec, down from about 3.9 seconds 2 weeks ago. With browser caching enabled, my site's homepage loads in under 1 sec for me. Is this type of data included in Google's calculation?
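For anyone else wanting to try the "leverage browser caching" advice on Apache, a minimal mod_expires sketch looks like this (assuming mod_expires is available on your host; the lifetimes here are just illustrative and should be tuned to how often your assets change):

```apache
# .htaccess - send far-future Expires headers so browsers cache static assets
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  ExpiresByType text/css   "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```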
2.0 seconds with Analytics (AJAX version) and no gzip.
I just enabled mod_deflate to start gzipping text files (HTML, CSS, JS), so I'll try to remember to post again if it makes any difference. I should note that the site I'm talking about is actually faster than G says, too - you can miss a refresh if you don't watch the status bar - which is why I haven't worried much about gzip before. It'll be interesting to see what effect it has on the number they report.
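For reference, a minimal mod_deflate setup for compressing text responses can be as short as this (a sketch, assuming Apache 2.x with mod_deflate compiled in or loaded; binary types like images and MP3s are already compressed and shouldn't be deflated):

```apache
# httpd.conf or .htaccess - gzip text responses on the fly
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
</IfModule>
```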
"On average, pages in your site take 7.9 seconds to load (updated on Apr 28, 2010). This is slower than 87% of sites. These estimates are of high accuracy (more than 1000 data points). The chart below shows how your site's average page load time has changed over the last few months. For your reference, it also shows the 20th percentile value across all sites, separating slow and fast load times."
All the "Example pages" are faster than the 7.9 average; they range from 5.8 to 7.7 seconds and are the pages with the most recent traffic.
My traffic structure is such that most of the traffic is from US to very few pages, whereas a lot of the global traffic is scattered across many pages.
I suspect the Google average is computed from the average page load of each individual URL, irrespective of how many visits each URL gets...
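A toy illustration of the difference (the numbers and URLs here are hypothetical, purely to show why a per-URL average can be much worse than what most visitors actually experience):

```python
# Hypothetical per-URL load times (seconds) and visit counts
pages = {
    "/popular-us-page": (5.8, 9000),  # fast, heavily visited
    "/long-tail-1":     (7.7, 50),    # slow, rarely visited
    "/long-tail-2":     (9.5, 50),
}

# Unweighted: every URL counts once, regardless of traffic
unweighted = sum(t for t, _ in pages.values()) / len(pages)

# Weighted: each page view counts once
total_visits = sum(v for _, v in pages.values())
weighted = sum(t * v for t, v in pages.values()) / total_visits

print(round(unweighted, 2))  # 7.67 - dominated by the long tail
print(round(weighted, 2))    # 5.83 - closer to what most visitors see
```

If WMT averages per URL rather than per visit, a big long tail of slow, rarely visited pages drags the reported number well above the typical visit.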
MadScientist: Implementing gzip cut my reported WMT page loading time in half, from around 7-8 seconds to around 3-4 seconds.
Cool helpnow, glad it did you some good!
I'll try to remember to keep an eye on it and see what numbers I get, but it's not one of those links I click very often. I usually optimize for speed, and when I can't find any more gains without tearing the whole site apart and starting over, I figure it's as fast as I'm going to make it - G's WMT number is what it is. I don't concern myself with their number nearly as much as with what I see happening in my browser. I was actually blown away to see 2.0 seconds, given some of the numbers people here are reporting, but I'm not complaining... :)
In my experience, shared hosting isn't in itself a problem for speed. My ecommerce site is on shared hosting and it includes images, but my average download time on that site is 1.2 seconds according to Google. I think that's because it's static HTML. In contrast, my blog is now twice as fast since moving to a new server, and that matches my own experience with it. Google still says the blog is slow in general, but from what I can see, that's because they are including data from the old server; there were times it was taking 35 seconds to download (you get what you pay for, I guess).
Other than motivating some webmasters to optimise and being nice PR, how useful is this metric, beyond showing how bad most people's connections are and how much junk they have installed?
The problem here is: all sites are not equal!
We host a bazillion MP3s for previewing music. These previews are 300k each!
Now WMT says:
On average, pages in your site take 2.2 seconds to load (updated on Apr 30, 2010).
Well - direct delivery of 5 million 300k MP3 snippets at a 2.2-second average is pretty good in my book. But if G treats that 2.2 as "slow", they sure have an algo problem.
We average 120 Mbit/s over the day, and pages without the big previews load within 1.2 seconds from Germany - and our servers are US-based!
WMT should split pages in categories. They have the knowledge!
|Other than motivating some webmasters to optimise and being nice PR, how useful is this metric other than to show how bad most people's connections are and how much junk they have installed? |
Now if they were to reference this against Googlebot's download and crawl rates, they might have something useful...
This tracking code stuff has gotten out of control. Look at Huffington Post. They may have ten or more different tracking scripts! The pages take forever to open. But Google continues to give them high position.
I turned off the Quantcast tracking code a few days ago. I'll give it a week and then see if there are any effects.