Forum Moderators: Robert Charlton & goodroi
My aim at the moment is to get the average for all sites under 500 ms but this would appear to be just for downloading the text, not the images. Do you have a figure you aim at or do you have a better method of measuring it?
The methods I am using to speed up my sites are:
1. Changing host (moving from one well known uk host to another at a similar price resulted in WMT showing access times dropping to a sixth of previous values)
2. Making pages validate as far as possible (can't get any pages containing affiliate links to validate fully)
3. Removing practically all white space from html
4. Removing practically all white space from scripting
5. Reducing database accesses, using my own form of caching in places.
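On point 5, "my own form of caching" could take many shapes; here is a minimal sketch of the general idea in Python (the function and variable names are my own illustration, not the poster's actual code): keep a query result in memory for a short time so repeat page views don't hit the database again.

```python
import time

# Minimal time-based cache sketch (names here are assumptions, not from
# the thread). The idea: wrap an expensive database lookup so repeat
# requests within `ttl` seconds are served from memory instead of
# hitting the database again.
_cache = {}
DEFAULT_TTL = 60  # seconds to keep a result before refetching

def cached_query(key, fetch, ttl=DEFAULT_TTL):
    """Return fetch(key)'s result, reusing a stored copy for ttl seconds."""
    now = time.time()
    hit = _cache.get(key)
    if hit is not None and now - hit[0] < ttl:
        return hit[1]                 # still fresh: skip the database
    value = fetch(key)                # the expensive call, e.g. a SQL query
    _cache[key] = (now, value)
    return value
```

Even a cache this crude can cut database round-trips dramatically on pages that are requested often but change rarely.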
I am assuming that Google is likely to take access times into consideration in the algo, if it doesn't already. I also read a report from a recent survey (in the travel sector) suggesting that website visitors are unlikely to wait more than 2-3 seconds for a page - a couple of years ago it was 4 seconds.
My bounce rates in G Analytics tend to be around 60% and pages per visit only just over 2 - two metrics I would like to improve on.
There are two tools that I use a lot for this job - YSlow from Yahoo! and Page Speed from Google. Both are Firefox plug-ins. I've learned a lot from both tools - things I never thought about. Here's a thread that gives access to both tools.
See The Need for Speed [webmasterworld.com]
1. Serving images from a separate server
If you rely on Google images for search then don't do that!
Whilst the images may be listed, it is very rare that they will rank well. Just believe me on that - a massive amount of trial and error over the years on my part.
IOW:
http://css.example.com/stylesheet.css is the URL you use on your page; then with Mod_Rewrite (if necessary - sometimes it's not) serve the file from its actual location on your server...
http://images.example.com/images/imagename.jpg would be the image version.
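As a rough sketch of how that Mod_Rewrite step might look in an Apache config or .htaccess file (the directory paths here are placeholders I've assumed, not anything from this thread):

```apache
# Hypothetical sketch: map sub-domain URLs onto the files' real
# locations under the document root. Paths are assumptions.
RewriteEngine On

# css.example.com/stylesheet.css -> /real/css/stylesheet.css
RewriteCond %{HTTP_HOST} ^css\.example\.com$ [NC]
RewriteRule ^(.*)$ /real/css/$1 [L]

# images.example.com/images/imagename.jpg -> /real/images/imagename.jpg
RewriteCond %{HTTP_HOST} ^images\.example\.com$ [NC]
RewriteRule ^(.*)$ /real/$1 [L]
```

Both sub-domains would also need DNS entries and ServerAlias coverage pointing at the same server for this to work.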
One of the biggest slowdowns is the limit on how many connections a browser will open to a single domain at one time, and by 'using' sub-domains you can force extra connections. (Don't go overboard with it, though... there's a balance, and if you try to force too many connections you can actually clog things up a bit. Or I'm fairly sure I did once... ;))
I assume duplicate images don't cause problems - or do they?
Good question!
I have example1.com and example2.com, I used to serve all images for both sites from a repository at example3.com.
Whilst both examples.com/widget.html pages ranked #1 for their respective searches, they mostly did not rank anywhere for images. However, once the images were served from their respective domains, they shot to the top.
I have actually left all those images on example3.com. Some rank a little, but they certainly have not affected the ones I host on their own domains, so I assume there is no significant duplicate-image problem.
But I believe you can serve different directories of a single domain name from different servers
However, my trials have shown that when doing this the IPs do not match up for G, resulting in lower image search rankings; once my IPs matched up, the images went straight to #1.
Of course there is a HUGE YMMV here and it's not just the image, all the other "attributes" have to be right as well :-)
6. Edit all text for conciseness.
7. Optimize all images for the leanest feasible file size.
8. Use CSS sprites to reduce the number of separate images that need to be requested.
9. Make more use of contextual selectors in your external CSS and use fewer divs, spans, classes, etc, in your source code.
10. (For a new site or section) Plan folders / file paths so you end up with meaningful, concise URLs. Shorter URLs in your navigation menus could save hundreds of characters on every page.
11. Create your own redirects to replace longwinded affiliate links that don't validate.
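To illustrate point 8, a CSS sprite combines several small images into one file, so the browser makes a single request and each element displays a different slice of it. A minimal sketch (the file name and offsets are assumptions for illustration):

```css
/* Sprite sketch: one combined image replaces several separate icons.
   icons.png and the 16px offsets are assumed, not from this thread. */
.icon {
  background-image: url(/images/icons.png); /* one HTTP request for all icons */
  background-repeat: no-repeat;
  width: 16px;
  height: 16px;
}
.icon-home  { background-position: 0 0; }      /* first 16px slice */
.icon-email { background-position: -16px 0; }  /* next slice along */
```

Each icon you fold into the sprite removes one HTTP request per page view.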
[edited by: buckworks at 3:32 pm (utc) on Sep. 24, 2009]
If the latter, different browsers take different times to render. This can be worsened by some plug-ins (e.g. Firefox's page validator, if it's turned on).
Google Analytics can slow down final display considerably (I have it turned off on my browsers - it sometimes fails to complete in any reasonable time). Some hit counters likewise.
Ditto any ads from other sites (if any).
I assume your browser requests compressed data and your server returns it compressed.
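For anyone running Apache who isn't sure whether compression is on, a typical mod_deflate snippet looks something like this (a sketch assuming the module is loaded; the MIME-type list is a common choice, not taken from this thread):

```apache
# Hedged sketch: gzip-compress text responses with mod_deflate.
# Requires the deflate module to be enabled on the server.
AddOutputFilterByType DEFLATE text/html text/css text/plain application/javascript
```

Compressing HTML, CSS and scripts often shrinks the text portion of a page by well over half.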
If you are using typical virtual servers then you have no control over download time. If one site on the server takes a hammering everything else suffers.
... So, I now manage my own server in Singapore. I've got it so tuned up, it's magical. I also redesigned my pages with a minimalist approach, and now they look like white A4 paper documents with plain text information. There is only a 1 pixel background image in the css and the site logo. All white space is cut out of the source... as are a few ads. Any javascript ads that I can't get to validate are thrown out.
I'm sitting here in Thailand (where ISPs are notoriously bad) and my site's pages are loading almost as fast as you can blink, even the forum... and my regulars in Asia sure appreciate the high speed site.
The only things that hold it up are the affiliate ads I sometimes run... but it's a commercial site and I don't know a way around that problem, other than that I have ordered the source so that the affiliate javascript code is loaded as late in the code as possible.
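Structurally, loading the affiliate javascript "as late as possible" usually means moving the script tag to just before the closing body tag, so the content above it renders first. A bare sketch (the URL and file name are placeholders, not the poster's actual ad network):

```html
<!-- Sketch: ad/affiliate script placed last so page content renders
     before the third-party request blocks anything. URL is a placeholder. -->
<html>
  <body>
    <!-- ... all page content and visible markup here ... -->
    <script type="text/javascript"
            src="http://affiliate.example.com/ads.js"></script>
  </body>
</html>
```

The third-party server can still be slow, but at least the visitor sees the page while waiting for the ad.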
[edited by: tedster at 11:25 pm (utc) on Nov. 19, 2009]
[edit reason] remove the specific web host's name [/edit]
I am still wondering about my original question on how people judge the speed of a site and what we should be aiming for. I am on a relatively slow connection of 256K, so I am not necessarily seeing the same load times as everyone else (living in the hills of southern Spain with no mains electricity, water or landline phone, I am pleased to have 256K via microwave link).