Forum Library, Charter, Moderators: Robert Charlton & aakk9999 & brotherhood of lan & goodroi

Google SEO News and Discussion Forum

    
How Google measures page speed
waynne




msg:4055866
 12:54 pm on Jan 7, 2010 (gmt 0)

Initially I assumed that the Googlebot page download time would be the factor used to determine page download speed.

However, after looking at the Page Speed feature in GWMT Labs, I've determined that this is probably not the case.

It would appear that page download times are tracked by the toolbar for each individual real user.

I noticed that some pages are only ever used by logged-in visitors, for example the postnewtopic.php page in a forum, and the search results page, which you need to be logged in to view. These are not pages typically reached by Googlebot.

I think this move towards speed will actually reinforce the geographical targeting of search results.

A content delivery network appears to be the way forward.

I'm a little depressed. I reduced the page size from 100k to 30k by compressing images, and enabled gzipping on the server.

Page download times in GWMT have dropped by 1.5 seconds, from 5.5 to 4. (The server actually takes 0.03 seconds to generate the page, so the rest of this time must be down to network latency.)
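For what it's worth, the gzip side of that saving is easy to sanity-check offline. A minimal Python sketch (the payload here is a made-up repetitive HTML string, not my actual page):

```python
import gzip

# Hypothetical ~100 KB HTML payload; repetitive markup compresses well,
# which is why gzipping a forum page cuts transfer size so sharply.
html = ("<div class='post'><p>" + "sample forum text " * 40 + "</p></div>\n") * 130
original_kb = len(html.encode()) / 1024
gzipped_kb = len(gzip.compress(html.encode())) / 1024
print(f"original: {original_kb:.0f} KB, gzipped: {gzipped_kb:.1f} KB")
```

Of course, gzip only shrinks transfer size; it does nothing for the round-trip latency that seems to account for the remaining seconds.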

What are your thoughts on this theory? Can you add anything to back this up or disprove it?

 

tedster




msg:4056254
 10:00 pm on Jan 7, 2010 (gmt 0)

Sure, toolbar data is a big chunk of it. But I don't think you have anything to fear, unless you are serving your site from your living room on a regular ISP account.

If your site's global reach and traffic warrant it, then a CDN may be a good idea. There are some peer-to-peer offerings that are not as expensive as the top-shelf options and can be an incremental way to get involved.

However, if your hosting service has a good data pipe but you're not in a position to fork out for a CDN, you're still probably fine. Do what you can - optimize those pages for speed (a lot of people got lazy in that area as broadband spread) and you will still be competitive.

And for your closing question, yes, network latency is the piece that adds those extra seconds. And so it does for all your competitors, too.

aristotle




msg:4056346
 11:58 pm on Jan 7, 2010 (gmt 0)

It's very unlikely that Google would combine an algo change with the Caffeine rollout. Caffeine is a major transition, and they were quite cautious about their approach to it, even announcing it beforehand and asking for feedback, something they have rarely done in the past. During the rollout they would want to watch events very closely. So I strongly doubt that they would add the additional complication of a simultaneous algo change.

aristotle




msg:4056586
 11:13 am on Jan 8, 2010 (gmt 0)

Sorry, I accidentally posted the above message on the wrong thread. I meant to put it in the Google Updates and SERP Changes thread.

kjennings2




msg:4056659
 2:04 pm on Jan 8, 2010 (gmt 0)

AdSense, Maps and Analytics all deploy an image; the load time for that image is your page load and transmission speed benchmark. I think it is being actively used in their ranking algo.
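Google hasn't published exactly how the toolbar or these scripts report timings, but the beacon idea itself is simple: time how long a tiny resource takes to load, then report that number. A rough Python sketch of just the measurement step (the fetch is simulated with a sleep; this is an illustration, not Google's actual mechanism):

```python
import time

def load_tracking_pixel(simulated_rtt=0.05):
    # Stand-in for fetching a 1x1 beacon image; a real client would
    # incur genuine DNS + connect + transfer time here.
    time.sleep(simulated_rtt)

start = time.perf_counter()
load_tracking_pixel()
elapsed_ms = (time.perf_counter() - start) * 1000
print(f"beacon load time: {elapsed_ms:.0f} ms")  # the figure a beacon would report
```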

Seb7




msg:4056842
 6:29 pm on Jan 8, 2010 (gmt 0)

The page load information Google is giving me seems like crap to me.

Page loading seems to be twice as fast as Google says it is, and it claims my site is slower than the average web page, which is not what I experience at all: most of my pages load within 2 seconds on my screen, from a server 4,000 miles away.

SEOPTI




msg:4056880
 7:28 pm on Jan 8, 2010 (gmt 0)

They suggest serving the Google Maps, AdSense and Analytics code from your existing domains in order to minimize DNS lookups. They really need to think twice about what they suggest there.

I also see wild fluctuations; the site performance graph seems to be broken: 8 seconds - 1 second - 4 seconds.

This is pure crap.

Googlebot needs on average 80ms to fetch a URL, and this has been constant the whole time, so I have no idea how they come up with the fluctuations in their performance graph.

ppc_newbie




msg:4056900
 7:51 pm on Jan 8, 2010 (gmt 0)

And just to throw a monkey wrench into it...

The speed tool shows an "excellent" rating for a Flash landing page which is actually only 1-2k: basically just the SEO markup, CSS, and the Flash loader script.

The actual time (please wait - loading) before anything can be seen or done is a whole lot longer.

leadegroot




msg:4057305
 1:07 pm on Jan 9, 2010 (gmt 0)

I have one site where the graph of site load time has slid down into the desirable 10% range in a smooth curve.
I haven't made any changes to the site in that time (it's been Christmas, for heaven's sake!) and other sites on the same server aren't showing a similar pattern.
I really think this still needs some tweaking...

walkman




msg:4057331
 2:36 pm on Jan 9, 2010 (gmt 0)

My loading time has apparently gone down too, but I didn't change anything. "Better than 95% of sites."

waynne




msg:4057355
 3:29 pm on Jan 9, 2010 (gmt 0)

Out of interest, what are your page sizes including images, and your Google-measured download speed for that page?

walkman




msg:4057516
 10:49 pm on Jan 9, 2010 (gmt 0)

It has to be the toolbar. My password-protected admin pages show up under Page Speed in Webmaster Central.

All trademarks and copyrights held by respective owners. Member comments are owned by the poster.
WebmasterWorld is a Developer Shed Community owned by Jim Boykin.
© Webmaster World 1996-2014 all rights reserved