Forum Moderators: Robert Charlton & goodroi
However, after looking at the GWMT Labs page speed feature, I have determined that this is probably not the case.
It would appear that page download times are tracked by the toolbar for each individual real-time user.
I noticed that some pages are only ever used by logged-in visitors - for example, the postnewtopic.php page in a forum, and the search results page, which you need to be logged in to view. These are not pages typically reached by Googlebot.
I think this move towards speed will actually reinforce the geographical targeting of search results.
A content delivery network appears to be the way forward.
I'm a little depressed. I reduced the page size from 100k to 30k by compressing images, and enabled gzip compression on the server.
Page download times in GWMT have dropped by 1.5 seconds, from 5.5 to 4. (The server actually takes only 0.03 seconds to generate the page; the rest of this time must be down to network latency.)
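For anyone else wanting to try this, how you enable gzip depends entirely on your server. As a hedged sketch only, on Apache 2.x with mod_deflate loaded it can be as little as:

```apache
# Sketch, assuming Apache 2.x with mod_deflate enabled - adjust MIME types to taste
AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
# Well-known exclusion for old browsers with broken gzip handling
BrowserMatch ^Mozilla/4\.0[678] no-gzip
```

Other servers (IIS, nginx, lighttpd) have their own equivalents, so check your own docs before copying this.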
What are your thoughts on this theory? Can you add anything to back this up or disprove it?
If your site's global reach and traffic warrant it, then a CDN may be a good idea. There are some peer-to-peer offerings that are not as expensive as the top-shelf options and can be an incremental way to get involved.
However, if your hosting service has a good data pipe but you're not in a position to fork out for a CDN, you're probably still fine. Do what you can - optimize those pages for speed (a lot of people got lazy in that area as broadband spread) and you will still be competitive.
And for your closing question, yes, network latency is the piece that adds those extra seconds. And so it does for all your competitors, too.
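To see how latency, not server time, can turn a 0.03-second page into a 4-second load, here's a rough back-of-envelope calculation. All the numbers below are assumptions for illustration (round-trip time, resource count, and parallel connections), not measurements from anyone's site:

```python
# Back-of-envelope: sequential round trips dominate total load time.
rtt = 0.3          # assumed round-trip time in seconds (distant visitor)
resources = 25     # assumed number of uncached CSS/JS/image requests
connections = 2    # typical parallel connections per host for HTTP/1.1 browsers
server_time = 0.03 # actual page-generation time quoted above

# One round trip for the HTML itself, then the resources fetched
# over a limited number of parallel connections.
total = server_time + rtt * (resources / connections + 1)
print(round(total, 2))  # → 4.08
```

With those assumptions you land right around the 4 seconds GWMT reports, even though the server itself is nearly instant - which is exactly why fewer/smaller resources and a nearby CDN edge help more than faster page generation.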
Page loading seems to be twice as fast as Google says it is, yet Google claims the site is slower than the average web page. That is not what I experience at all: most pages load within 2 seconds on my screen, from a server 4,000 miles away from me.
I also see wild fluctuations; the site performance graph seems to be broken: 8 seconds, then 1 second, then 4 seconds.
This is pure crap.
Googlebot needs 80 ms on average to fetch a URL, and this has been constant the whole time, so I have no idea how they come up with the fluctuations in their performance graph.
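If anyone wants to compare their own server's bare fetch time against the graph, a quick hedged sketch with curl (the URL is a placeholder - substitute your own page) times a single request the way a crawler sees it, with no images, CSS, or rendering:

```shell
# Time one bare HTTP fetch: DNS lookup, TCP connect, and total transfer.
URL="http://www.example.com/"  # placeholder - substitute your own page
curl -s -o /dev/null \
  -w "dns: %{time_namelookup}s  connect: %{time_connect}s  total: %{time_total}s\n" \
  "$URL"
```

If that "total" figure stays flat while the GWMT graph swings between 1 and 8 seconds, the fluctuation is coming from somewhere other than your server's response time.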
The speed tool shows an "excellent" rating for a Flash landing page that is actually only 1-2k: basically just the SEO markup, the CSS, and the flashLoader script.
The actual time ("please wait - loading") before anything is visible or usable is a whole lot longer.