This 40-message thread spans 2 pages.

Acceptable Page Load Time in Google's Eyes
I installed page load measurement for GA a few days ago. The average loading time of my pages is around 4s. Do you think that's acceptable in Google's eyes, or rather on the slow side?
What are your page load times?
If you have a considerable number of return visitors, and you're able to serve compressed content and set up caching headers, consider hosting ga.js locally. Set up an alert to notify you of updates to the file on Google's domain, and on your domain give it a unique name each time you update it (e.g. with the day's date, ga20110522.js). Then set up a far-future expiry date (e.g. a year), and make sure the file's served compressed.
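The update-check described above could be automated along these lines. This is a hypothetical sketch, not anything GA provides: the function names are my own, and in practice you'd run it from a daily cron job that fetches ga.js from Google's servers.

```python
# Hypothetical sketch of the update check and dated naming described above.
import hashlib
from datetime import date

def dated_name(d):
    """Build a unique, date-stamped filename, e.g. ga20110522.js."""
    return "ga%s.js" % d.strftime("%Y%m%d")

def has_changed(fetched_bytes, known_sha256):
    """Compare a freshly fetched ga.js against the hash of the copy we host."""
    return hashlib.sha256(fetched_bytes).hexdigest() != known_sha256

# In a daily job you would fetch https://www.google-analytics.com/ga.js;
# if has_changed(...) is true, save the new copy under dated_name(date.today())
# and update your pages' <script> tag to point at the new filename.
```

Because every update gets a fresh filename, the far-future expiry header never serves a stale script.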
Unfortunately, you cannot host __utm.gif locally, so a call to Google's servers is still necessary. Fortunately, it's only 35B in size (303B transfer size), whereas ga.js is 11700B (11.7KB).
I just tested one page from one of my sites, and most of the suggestions about optimising page load speed on that Google Labs link seem to apply to AdSense. I guess they've changed the settings for 'irony' along with everything else in Panda.
Similar situation: widgets, ajax loads, masses of images. Analytics reports the home page as 9.7s, and GWT as around 7s.
We're doing everything obvious to reduce load times - zipping, minifying, merging js files, etc. - and some smarter techniques: ajax-loading certain content, lazy-loading images below the fold (a speed-up and a bandwidth decrease).
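The merge-and-compress build step mentioned above can be sketched roughly like this (file contents and names are made up for illustration; a real build would also run a minifier over the merged output):

```python
# Minimal sketch of a merge-and-gzip build step; names are illustrative.
import gzip

def merge_js(sources):
    """Concatenate several JS files into one to cut HTTP requests."""
    # Defensive semicolons guard against files that omit a trailing one.
    return "\n;\n".join(sources)

def gzip_size(text):
    """Bytes on the wire if the server sends the file gzip-compressed."""
    return len(gzip.compress(text.encode("utf-8")))

merged = merge_js(["function a(){}", "function b(){}"])
# Serving one merged, gzipped file means one request instead of many,
# and gzip_size(merged) rather than len(merged) travels over the network.
```

Merging trades cache granularity for fewer round trips, which usually wins for small script files.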
The big question I have is: when Google uses load times as a factor in search quality (and therefore ranking), do they do it in a dumb way (e.g. adding fb widgets and adsense ads into the load time) or a smart way (e.g. the time until the page loads above the fold)?
If it's the former, it would certainly penalize content-rich sites. (With the rise in above-the-fold ads and hotlinked image previews on the image search pages, I'm an unhappy webmaster with regards to Google at the moment.)
gethan, Google uses page speed as a DIRECT factor only in a very minor way (Matt Cutts once said it was a kind of tie-breaker).
Indirectly, slow page speed can affect user experience and engagement and Google apparently does use those metrics. If slow pages are lowering a site's user engagement and that is noticeable in the metrics Google can see [webmasterworld.com], then Google might choose to send less traffic by lowering rankings - because Google wants to give their own users a quality experience.
Overall, wouldn't hosting ga.js locally result in longer page load times? It's the one file that's almost certainly already in the user's cache.
According to its response headers, ga.js is cached for a mere 24 hours, so it depends on the other sites your users have visited in the last 24 hours. If you have a lot of return visitors, you can speed up their visits by hosting ga.js locally and having it expire in, say, a year's time. This would be something you could test with the Site Speed feature in GA.
As a result, your users will still have to redownload ga.js once a day when they visit other sites using GA, but on your site they'll only redownload it whenever you've updated it (assuming you've set it up as I described in my previous post).
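A back-of-the-envelope model of the caching argument above might look like this. The numbers and the simplifying assumptions (one visit per day, re-fetch exactly when the cache lapses) are my own, not from GA's documentation:

```python
# Rough model: how often does a returning visitor re-fetch the script?
def downloads_per_year(visits_per_year, cache_lifetime_days, updates_per_year=0):
    """Approximate yearly fetch count for one returning visitor."""
    if cache_lifetime_days >= 365:
        # Far-future expiry: re-fetch only when the filename changes.
        return 1 + updates_per_year
    # Short expiry: at most one re-fetch per visit once the cache lapses.
    return min(visits_per_year, 365 // cache_lifetime_days)

google_hosted = downloads_per_year(365, 1)       # 24h cache: 365 fetches/year
self_hosted = downloads_per_year(365, 365, 4)    # yearly expiry, ~4 updates: 5 fetches
```

Under these assumptions, a daily visitor goes from a fetch per day to a handful per year, which is the effect the Site Speed report should pick up.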
Good point robzilla, and maybe worth testing once I've knocked down the lower-hanging fruit. Such a strategy would also give you the option of combining it with your other js files.
Still buggy data, IMO. An average day is 2-3 seconds, and then there are entries like this: 968.32s with a sample size of 1. That brought the day up to 10.69s.
I really doubt someone sat there waiting 16 minutes for that page to load or that the page took 16 minutes to load because there are no widgets on that page.
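The skew described above is easy to reconstruct. The exact sample count isn't given in the post, so the 117 below is a hypothetical figure chosen so the arithmetic lands near the reported 10.69s daily average:

```python
# Hypothetical reconstruction: ~117 normal page loads around 2.5s plus a
# single 968.32s entry drag the daily mean toward ~10.7s, while the
# median barely moves.
from statistics import mean, median

samples = [2.5] * 117 + [968.32]
day_mean = mean(samples)      # roughly 10.7s, dominated by one bad sample
day_median = median(samples)  # still 2.5s, robust to the outlier
```

This is why a single mistimed sample (a tab left open, a broken timer) can wreck a day's average in the report even when real load times never changed.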
|If you have a considerable number of return visitors, and you're able to serve compressed content and set up caching headers, consider hosting ga.js locally. Set up an alert to notify you of updates to the file on Google's domain, and on your domain give it a unique name each time you update it (e.g. with the day's date, ga20110522.js). Then set up a far-future expiry date (e.g. a year), and make sure the file's served compressed. |
I don't know if this would necessarily be faster. Given that many of the top 100 sites use GA, you lose the benefit of having the file already cached for first-time visits by switching the URL. Even for return visits, it's likely they already have the file cached from another site unless your site is a home page for the user.
But it would be something to try, of course.
My page load time is something like:
1) A fraction of a second to load the content.
2) Several seconds to load the ads.
I HOPE Google's algo knows the difference.