| 10:00 pm on Jan 7, 2010 (gmt 0)|
Sure, toolbar data is a big chunk of it. But I don't think you have anything to fear, unless you are serving your site from your living room on a regular ISP account.
If your site's global reach and traffic warrant it, then a CDN may be a good idea. There are some peer-to-peer offerings that are less expensive than the top-shelf services and can be an incremental way to get involved.
However, if your hosting service has a good data pipe but you're not in a position to fork out for a CDN, you're still probably fine. Do what you can - optimize those pages for speed (a lot of people got lazy in that area as broadband spread) and you will still be competitive.
And for your closing question, yes, network latency is the piece that adds those extra seconds. And so it does for all your competitors, too.
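That latency point can be put into rough numbers. Here is a back-of-envelope sketch; the `estimated_load_time` helper and its model are my own simplification (each request charged one round trip, no connection reuse or parallel downloads), not anything Google publishes:

```python
def estimated_load_time(rtt_s, num_requests, total_bytes, bandwidth_bps):
    """Rough page load estimate: each HTTP request costs about one
    network round trip (ignoring connection reuse, pipelining, and
    parallel downloads), plus the time to push the bytes through.
    """
    round_trip_cost = num_requests * rtt_s
    transfer_cost = total_bytes * 8 / bandwidth_bps  # bytes -> bits
    return round_trip_cost + transfer_cost

# e.g. 100 ms RTT, 20 requests, 500 KB of assets, 2 Mbit/s downlink
print(estimated_load_time(0.1, 20, 500_000, 2_000_000))  # -> 4.0 seconds
```

With those example numbers, half the four seconds is pure round trips - which is why a distant visitor sees extra seconds even when the server itself is fast, and why trimming request counts helps everyone.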
| 11:58 pm on Jan 7, 2010 (gmt 0)|
It's very unlikely that Google would combine an algo change with the Caffeine rollout. Caffeine is a major transition, and they were quite cautious about their approach to it, even announcing it beforehand and asking for feedback, something they have rarely done in the past. During the rollout they would want to watch events very closely. So I strongly doubt that they would add the additional complication of a simultaneous algo change.
| 11:13 am on Jan 8, 2010 (gmt 0)|
Sorry, I accidentally posted the above message on the wrong thread. I meant to put it on the Google Updates and SERP changes thread.
| 2:04 pm on Jan 8, 2010 (gmt 0)|
AdSense, Maps and Analytics all deploy an image; the load time for that image gives Google a benchmark of your page load and transmission speed. I think it is being actively used in their ranking algo.
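A minimal sketch of how that kind of beacon timing works, assuming only that the clock starts at the request and stops when the body has been read (the `time_fetch` name is illustrative, not Google's code):

```python
import time
import urllib.request

def time_fetch(url, timeout=10):
    """Time one HTTP GET end to end, the way a beacon image
    download can be timed: the clock runs from the request until
    the whole body has been read, so both network latency and
    transfer speed are folded into a single number."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()  # include transfer time, not just time-to-first-byte
    return time.perf_counter() - start
```

One such number per page view, aggregated across visitors, would be enough to build the kind of performance graph Webmaster Tools shows.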
| 6:29 pm on Jan 8, 2010 (gmt 0)|
The page load information Google is reporting seems like crap to me.
Pages seem to load twice as fast as Google says they do, yet it claims they're slower than the average web page? That's not what I experience at all; most pages load within 2 seconds on my screen, from a server 4,000 miles away from me.
| 7:28 pm on Jan 8, 2010 (gmt 0)|
They suggest serving the Google Maps, AdSense and Analytics code from your existing domains in order to minimize DNS lookups. They really need to think twice about what they suggest there.
I also see wild fluctuations; the site performance graph seems to be broken: 8 seconds - 1 second - 4 seconds.
This is pure crap.
Googlebot needs 80ms on average to fetch a URL, and that has been constant the whole time, so I have no idea how they come up with the fluctuations in their performance graph.
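The DNS cost behind that lookup advice can be measured directly. A small sketch, assuming a plain blocking resolution and with the helper name being mine:

```python
import socket
import time

def dns_lookup_time(host):
    """Time a single name resolution - the cost that each extra
    external domain (e.g. one serving maps or analytics scripts)
    adds before the browser can even open a connection to it."""
    start = time.perf_counter()
    socket.getaddrinfo(host, 80)
    return time.perf_counter() - start
```

Note that a second call for the same host is usually far cheaper because the result is cached by the resolver, which is why the first page view bears most of the DNS penalty.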
| 7:51 pm on Jan 8, 2010 (gmt 0)|
And just to throw a monkey wrench into it...
The speed tool shows an "excellent" rating for a Flash landing page whose HTML is actually only 1-2k: basically just the SEO markup, CSS, and the Flash loader script.
The actual time (the "please wait - loading" phase) before anything is seen or can be done is a whole lot longer.
| 1:07 pm on Jan 9, 2010 (gmt 0)|
I have one site where the graph of site load time has slid down into the desirable 10% range in a smooth curve.
I haven't made any changes to the site in that time (it's been Christmas, for heaven's sake!) and other sites on the same server aren't showing a similar pattern.
I really think this still needs some tweaking...
| 2:36 pm on Jan 9, 2010 (gmt 0)|
My loading time apparently has gone down too, but I didn't change anything. "Better than 95% of sites."
| 3:29 pm on Jan 9, 2010 (gmt 0)|
Out of interest what are your page sizes including images and your Google measured download speed for that page?
| 10:49 pm on Jan 9, 2010 (gmt 0)|
It has to be the toolbar. My password-protected admin pages show up in the page speed data in Webmaster Central.