Forum Moderators: Robert Charlton & goodroi


Page load speed? What counts is time to make content visible

         

jetteroheller

9:24 am on Jan 4, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Since there are rumors that load speed has an influence on search results, I'm posting this topic here.

In Google Webmaster Tools, my sites show between "faster than 50%" and "faster than 65%" of sites.

But my site is highly optimized to show content fast.

The headline and description of the page show up
as soon as the first TCP/IP packet is delivered.

This is because of:

* gzip-compressed delivery
* short CSS in the head

So the first TCP/IP packet contains enough information
to render the headline and description.

Most of my HTML pages are less than 4 KB gzip-compressed,
so all the text on the page is shown after loading just 4 KB.

All navigation graphics are combined in one PNG file:
18 KB to load.

The first part of my own JavaScript is
13 KB, because it is also gzip-compressed.

After loading
* HTML: 4 KB
* PNG: 18 KB
* the other pictures on the page
* JS: 13 KB
the page is completely visible in all details.

At this point a timestamp should be taken:
"Page is visible for the user"

Other things are still loading:
* the second part of the JavaScript (the first part starts loading it)
* 4 AdSense ads
* Google Analytics

But these loads happen while the visitor
can already view the page in all important details.

So I think there should be at least 2 page speed times:

* page visible for the user
* all is loaded

I think by the "page visible for the user" metric, my site would rank very high in the load time statistics.
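For what it's worth, the two proposed timestamps roughly map onto the Navigation Timing API that newer browsers expose; a sketch, with the caveat that treating DOMContentLoaded as "page visible for the user" is my approximation:

```javascript
// Pure helper so the arithmetic is testable: takes a Navigation
// Timing record and returns both proposed metrics in milliseconds.
function pageSpeedMetrics(t) {
  return {
    // "page visible for the user" (approximated by DOMContentLoaded)
    visible: t.domContentLoadedEventStart - t.fetchStart,
    // "all is loaded" (ads, analytics, images included)
    complete: t.loadEventStart - t.fetchStart
  };
}

// Browser usage: read it shortly after the load event has fired.
// window.addEventListener('load', function () {
//   setTimeout(function () {
//     var m = pageSpeedMetrics(performance.timing);
//     console.log('visible: ' + m.visible + ' ms, complete: ' +
//                 m.complete + ' ms');
//   }, 0);
// });
```
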

tedster

5:06 pm on Jan 4, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I agree with your observations - and I'm pretty sure Google does too. Load speed is still a very minor factor and this might well be one of the reasons.

You also got me thinking about Matt Cutts' rather enigmatic comment at Pubcon, saying that they were working on an algorithm to measure what content is actually above the fold. First, this is clearly going to be device dependent. So you've got me thinking that they might be trying to measure when the first screen is fully populated.

Also note, the page speed data they show is of two kinds - it does show a single number, but it also shows how that compares to the rest of the sites on the web.

enigma1

11:35 am on Jan 5, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



But my site is highly optimized to show content fast.

It's not only the HTML, CSS and images. There are other factors, like the server environment, port throughput, and server and application latencies. The steps you've taken and mentioned in the original post are more about bandwidth than speed.

In other words, even an empty document can load more slowly than a full-blown ecommerce page listing 100 products with images. Sending 1 KB of data from one server environment may take longer than sending 100 KB from another; bandwidth is not the only factor.

To get a more precise page load measurement, you would have to start a timer on the server end right when the request comes in, stop it from the client end once everything is loaded, and log the result somewhere so you can compare. You will also need a number of samples, probed from different locations.
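A minimal sketch of that server-start/client-stop idea (the logging endpoint is hypothetical, and note that comparing a server clock with a client clock assumes the two are in sync):

```javascript
// The server injects its request timestamp into the page, e.g.
//   <script>var serverStart = 1325764000123;</script>

// Pure helper: elapsed milliseconds from server receipt to "now".
function elapsedSince(serverStart, now) {
  return now - serverStart;
}

// Client side: once everything is loaded, report the total back
// (the /log-timing endpoint is made up for this sketch).
// window.addEventListener('load', function () {
//   var total = elapsedSince(serverStart, Date.now());
//   new Image().src = '/log-timing?ms=' + total;
// });
```
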

Now, the GWT page speed info talks about the times googlebot sees. One thing you could do, although limited, is to take the same site and deploy it on 2 different servers, say one shared and the other dedicated. The odds are the dedicated environment will show faster page loads.

And apart from bandwidth and server setup, other factors driven by the site's code structure (HTML caching, database optimization, query caching, etc.) can dramatically improve page load times.
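The HTML-cache idea can be sketched as a memoized renderer; the names here are illustrative:

```javascript
// Wraps an expensive render function (database queries, template
// work) so each URL is rendered once and then served from memory.
function makePageCache(render) {
  var cache = {};                 // url -> rendered html
  return function (url) {
    if (!(url in cache)) {
      cache[url] = render(url);   // slow path, first hit only
    }
    return cache[url];            // fast path afterwards
  };
}

// var getPage = makePageCache(renderFromDatabase);
// getPage('/products');  // slow the first time, instant afterwards
```
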

jetteroheller

12:46 pm on Jan 5, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



It's not only the HTML, CSS and images. There are other factors, like the server environment, port throughput, and server and application latencies. The steps you've taken and mentioned in the original post are more about bandwidth than speed.


The server only has to send the file.
All files are sent as-is.

No database access or other time-consuming tasks.
It simply sends the file.

The number of transfers is extremely reduced:

html file -> all content is shown
png file -> the first part of the navigation can be shown
js file -> all navigation can be shown

onebuyone

1:09 pm on Jan 5, 2012 (gmt 0)

10+ Year Member



What about resource (CPU/memory) consumption? Maybe using such a metric is Google's little secret?

rlange

1:55 pm on Jan 5, 2012 (gmt 0)

10+ Year Member



onebuyone wrote:
What about resource (CPU/memory) consumption? Maybe using such a metric is Google's little secret?

That's not something Google can determine; it's not even something they can guess at. They'd either need enough access to your server to run commands on it, or you'd have to publish that information yourself. Neither is a good idea.

--
Ryan

enigma1

2:15 pm on Jan 5, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Even if you have pre-generated physical static pages, they can still be slower than a dynamic web system.

Say a page's content is several KB compressed: the client may still have to download those KBs on every access, even for the same page. A site with dynamic pages can use the 304 header and send just a few bytes on repeated requests for the same page/client. AFAIK that is a factor in Google's speed metrics. Plus the server environment I mentioned before.

IMO you need to put a logging mechanism in place to check timings, and then compare the results with the same page on two or more different servers. I have seen differences between dedicated boxes just because one host has a 10 Mbit port "on special offer" vs. a standard 100 Mbit port.

In your GWT, what is the worst spike in timing googlebot encountered? Also, if you're concerned about external resources, set up a site without them and check the timings. On the client end, depending on how your JavaScript is set up, it may not run right away but only after everything is loaded, so you could set up an ajax call to your server once everything is done to check the time taken for the whole request/response cycle.

onebuyone

2:16 pm on Jan 5, 2012 (gmt 0)

10+ Year Member



Even after a page is loaded by the browser, it does not stop using CPU on the client computer (I didn't mean the server side; sorry for not being clear on that). There are many ways to write CSS/JS/HTML that will make a system use 100% CPU and consume a lot of memory just to render the page. Think of using JS inside CSS, or attaching to window events like "onscroll" without any limits.
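An unthrottled onscroll handler can fire dozens of times per second; a simple rate limiter like this sketch keeps it from pinning the CPU (the helper takes an injectable clock purely so it can be tested):

```javascript
// Returns a wrapper that invokes fn at most once per `wait` ms.
function throttle(fn, wait, nowFn) {
  var last = 0;
  var now = nowFn || Date.now;
  return function () {
    var t = now();
    if (t - last >= wait) {
      last = t;
      fn.apply(this, arguments);
    }
  };
}

// Browser usage: run the expensive work at most 10 times a second,
// no matter how fast scroll events arrive.
// window.addEventListener('scroll', throttle(updateWidgets, 100));
```
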

Page Speed scores are easy to abuse (you can get a high score for a poorly performing site), so they will never be used in the algo. It's easier to check how much time it took to render a page, how much memory was allocated in the process, and how it performs during usage (excessive JS can be a killer).

Poorly coded front ends are a plague on many websites today. Using them on a modern PC or laptop is not a problem, but on older machines it is an issue.

For Google it's easy to gather such data, so PageSpeed may be just a bluff for webmasters, while the real algo may use pure CPU/memory usage numbers, because in the end that's the only thing that matters.

So maybe you should open the resource monitor on your system, refresh your page, scroll up and down, move the mouse around, and then check the CPU/memory usage graphs. CPU usage close to 100% for a long time, and steadily increasing memory usage, are not good signals.

Hoople

3:29 pm on Jan 5, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I recently turned away a customer who was dead set on moving to VPS hosting (which I don't offer). His old host was shared hosting that was struggling to get uptime above the 95% mark.

I just checked his site's performance with the GTmetrix online page speed test, and there was increasing DNS overhead. At first the DNS lookups took a reasonable time (<30 ms for index.html + styles.css), but it slowly increased over the course of his 5.74-second page load; towards the end, DNS overhead was 1.88 SECONDS. Total page size: 1.55 MB. Total number of requests: 85. Not a typical site built by readers here, but there are plenty just as bad or worse.

So, he got what he paid for... NOT! Looks like a new form of overselling: VPSes tied to resources (DNS) that can't keep up.

deadsea

6:07 pm on Jan 5, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



I just wish there were a way to speed up the loading of AdSense ads. With AdBlock, my pages render from across the country in 700 ms. Without AdBlock, the pages take 2.5 seconds.

tedster

6:52 pm on Jan 5, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



the gwt page speed info talks about times the googlebot sees

Google specifically said this is not the case. Page Speed - as reported in "Labs/Site Performance" is using data from toolbar installations and other browser records from users (Chrome, I assume).

There is another metric in WMT's "Diagnostics/Crawl Stats" area that does report on googlebot's time spent downloading from the server. But that's not "Page Speed" or "Site Performance".

enigma1

8:07 pm on Jan 5, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



There is another metric in WMT's "Diagnostics/Crawl Stats" area

Yes this is the one I meant.

jetteroheller

8:21 pm on Jan 5, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I just wish there were a way to speed up the loading of AdSense ads. With AdBlock, my pages render from across the country in 700 ms. Without AdBlock, the pages take 2.5 seconds.


When I look at page speed in Firefox,
1/3 is my own page and
2/3 is Google AdSense, Google Analytics and the Google +1 button.

But even with all that, my pages are typically
0.3 to 0.4 MB, while most mass-media pages are beyond 1.5 MB.

levo

9:42 pm on Jan 5, 2012 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



On supported browsers (Chrome, Firefox and IE9), you can use the following method to see what Analytics is sampling for page speed. (Analytics recently changed its method to include the redirect time too, but this code doesn't include it.)


if (window.performance && performance.timing) {
  setTimeout(function () {
    // Seconds from the start of the fetch until the load event.
    var htmlfivespeed =
      (performance.timing.loadEventStart - performance.timing.fetchStart) / 1000;
    // Requires jQuery; shows the figure in a small fixed overlay.
    $("body").append('<div style="position:fixed;bottom:1px;right:1px;' +
      'padding:4px;background-color:#000;color:#FFF;font-size:12px;">' +
      htmlfivespeed + '</div>');
  }, 3000);
}


- If you're using gzip/deflate, all packets have to be delivered before any content is shown. Still, due to TCP slow start, it is best practice to serve pages compressed and under 4 KB.

- BTW, Google is using an increased initial congestion window on its servers (Microsoft too). [code.google.com...] If you have root access to your server, you can increase it with the "ip route ... initcwnd" command (look it up).

- Image loading can be tricky; make sure you have keep-alive enabled, with a low timeout.

- onebuyone is absolutely right about CPU usage. Check your website on a slow device. You can use an iPad; its DOM is rather painfully slow, and it exposes every mistake you make with JavaScript/dynamic content. If your DOM modifications are slow, dynamically loaded content will start loading late. I recently checked and optimized every line of my .js code, and it gave up to a 5-10X boost on rendering and 2X on page loading.

- For AdSense, you can switch to the async DFP code.

- Don't be fooled by asynchronous 3rd-party code: it still delays your window.load event and increases the measured page load time. Put all non-essential 3rd-party code (social buttons, stats, etc.) after window.onload (or $(window).load if you're using jQuery).
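A sketch of that last point: injecting a non-essential third-party script only after window.onload (the URL is a placeholder; the extra document/window parameters exist only to make the logic testable outside a browser):

```javascript
// Defers loading of a script until after the load event, so it
// cannot delay the very event it waits for.
function loadAfterOnload(src, doc, win) {
  doc = doc || document;
  win = win || window;
  function inject() {
    var s = doc.createElement('script');
    s.async = true;
    s.src = src;
    doc.body.appendChild(s);
  }
  if (doc.readyState === 'complete') {
    inject();                             // load already fired
  } else {
    win.addEventListener('load', inject); // wait for it
  }
}

// loadAfterOnload('//example.com/social-buttons.js');
```
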