


Google to use Page Speed as Metric

     
7:28 pm on Jan 17, 2018 (gmt 0)

Administrator from US 

brett_tabke

joined:Sept 21, 1999
posts:38150
votes: 61


Google will make page speed a factor in mobile search ranking starting in July

Google today announced a significant change in how it ranks websites for mobile searches: it will now take page speed into consideration as one of its signals, the company says. The change, which Google is referring to as the “Speed Update,” will go into effect in July 2018, and will downrank very slow websites under certain conditions.

Though speed will become more of a factor in determining the order of search results, the change is not so drastic as to make speed the only factor. There will be times when slow pages still rank highly – when they have the most relevant content for the search query at hand, for example.
[techcrunch.com...]
[webmasters.googleblog.com...]
2:43 pm on Jan 21, 2018 (gmt 0)

Administrator from US 

goodroi

joined:June 21, 2004
posts:3338
votes: 249


This isn't new. Google has been talking about using speed to influence rankings since at least 2010 [webmasters.googleblog.com].

This newest version of a "speed update" doesn't even take effect for six months, and that assumes Google is on time. Wasn't the Google mobile-first index change almost a year behind schedule? We have plenty of time, and honestly, 99% of you reading this don't need any time because you are already safe.

When it does roll out, it will ONLY IMPACT EXTREMELY SLOW pages or, as Google says, it will "only affect pages that deliver the slowest experience to users and will only affect a small percentage of queries".

Speed is just one of 100+ ranking factors, so you probably want to keep things in perspective. Once your speed is good enough, your SEO efforts are better spent elsewhere instead of worrying about a millisecond. It is in Google's best interest for webmasters to focus too much on speed and be distracted from exploiting the many weaknesses in its algo.

If you really are worried and want to over-analyze this, just compare the speed results of your site with those of the top ten ranking pages (make sure to look at both mobile and desktop pages). You will most likely find your site speed is about the same as your competition's, because you all know that users like fast sites and that profit goes up when you are fast enough.

IMHO this is a big non-issue.

If someone wasn't smart enough to realize that slow websites suck and convert horribly, then they were unlikely to rank well anyway. I can think of a few specific situations this will impact, like a site that loads its ads quickly but is intentionally super slow to load the content, to artificially boost its ad CTR. Most webmasters already have good enough speed.
9:56 am on Jan 22, 2018 (gmt 0)

Senior Member

robzilla

joined:Sept 25, 2005
posts:1666
votes: 239


Although some progress has been made, I think we're still quite far away from "good enough speed", and what we might now classify as "good enough" (even though we shouldn't) is likely to change over time. Similarly, what we measure to be "good enough speed" on our development hardware and low-latency broadband connections is often not representative of the average user experience. If it were a non-issue, I don't think Google would be pushing it as much as they are, and it is also my experience that much of the Web is still unnecessarily slow (perhaps increasingly so, despite all efforts).

It may not make much sense to optimize speed with only search rankings in mind, and good speed is not going to make up for poor content or usability, but the impact that snappy pages can have on signals that are likely to feed back into rankings is, in my opinion, often understated. You don't get to the top (or stay there long) by being "good enough"; you'd leave too much room for your competitors to catch up.
12:35 pm on Jan 22, 2018 (gmt 0)

Administrator from US 

goodroi

joined:June 21, 2004
posts:3338
votes: 249


@robzilla My point is that there are only so many hours in a day, and we all need to prioritize our time and resources. Focusing too much on speed can hurt a website because, as you correctly mention, a good website also needs content and usability. It is important to properly triage our work to optimize our results.
1:05 pm on Jan 22, 2018 (gmt 0)

Senior Member

robzilla

joined:Sept 25, 2005
posts:1666
votes: 239


It is important to properly triage our work to optimize our results.

Absolutely! Thankfully we're constantly getting new tools and information to help us save time optimizing for speed.
3:14 pm on Jan 22, 2018 (gmt 0)

Senior Member

jetteroheller

joined:Jan 22, 2005
posts: 3051
votes: 6


In 1997 I wrote my own CMS, which I have used ever since, with some improvements every year.
For many years it has delivered the JS and HTML compressed, with the CSS in the HTML head section.
Only one JS file is loaded.
Each site has a different JS file for English and German.
That one and only JS file is conditionally compiled.
This means there is a control string, for example

javascript=AS,CI,GS,L6,LB,LC,LI,LM,LS,PP,Pl,rt

In the javascript:

GS:function html_search_box(w,c,n,b)
GS:{
GS:e: var d='<form action="https://www.google.com" id="cse-search-box"'+n+' target="_blank">';
GS:g: var d='<form action="https://www.google.de" id="cse-search-box"'+n+' target="_blank">';
GS: d+='<div style=width:'+w+'>';

GS: means the line is included only when GS (Google Search) is required.
GS:g: means the line is included only in the German version, and only when GS is required (GS:e: likewise for English).
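
For anyone curious, here is a minimal Node.js sketch of how such a prefix-based conditional compile step might work. This is an illustration only, not the actual CMS code; the file names, the regex, and the feature-list handling are my assumptions.

// build.js - minimal sketch of a prefix-based conditional compiler.
// Assumed convention: an optional two-character feature code starts
// the line, optionally followed by 'e:' (English) or 'g:' (German).
const fs = require('fs');

function compile(source, features, lang) {
  return source.split('\n').map(function (line) {
    const m = line.match(/^([A-Za-z0-9]{2}):(?:([eg]):)?(.*)$/);
    if (!m) return line;                            // unprefixed: always keep
    if (features.indexOf(m[1]) === -1) return null; // feature not required
    if (m[2] && m[2] !== lang) return null;         // other language's version
    return m[3];                                    // keep the code itself
  }).filter(function (line) { return line !== null; }).join('\n');
}

// The control string from the post, building the German version:
var features = 'AS,CI,GS,L6,LB,LC,LI,LM,LS,PP,Pl,rt'.split(',');
var js = compile(fs.readFileSync('master.js', 'utf8'), features, 'g');
fs.writeFileSync('site-de.js', js);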

Just loading the HTML and the sprite.png is enough to render large parts of a page:
the logo, the icons for navigation, the icons for the different languages.
3:09 am on Jan 23, 2018 (gmt 0)

New User from NG 

joined:Jan 23, 2018
posts:3
votes: 0


What would happen to those sites with a 50% speed score?
9:51 am on Jan 23, 2018 (gmt 0)

Senior Member

robzilla

joined:Sept 25, 2005
posts:1666
votes: 239


Nothing. Again, it's not about the PageSpeed score. It's about actual load times of your pages, and we don't know what the cutoff might be.
10:04 am on Jan 23, 2018 (gmt 0)

Moderator from US 

keyplyr

joined:Sept 26, 2001
posts:10638
votes: 630


Remembering back to the dial-up modem days when I'd click a link, then go get a cup of coffee, and return just in time for the page to load.
10:25 am on Jan 23, 2018 (gmt 0)

Senior Member

jetteroheller

joined:Jan 22, 2005
posts: 3051
votes: 6


In the dial-up modem days, web pages were much smaller.

When I tested my page: 0.1 MB and no additional round trips needed to render. Google states that the average page needs 4 additional round trips and 2.7 MB to render. That is insane!

My pages from the dial-up modem days in 1998 were slower:
* no compressed delivery
* no async JavaScript
* many graphics instead of one combined graphic
* no cache directives

When I look at some pages' source text: 3 different CSS files and 5 different JavaScript files in the head section, all of which have to be loaded before rendering can start. A simple mitigation is sketched below.

Lazy programmers with no interest in optimizing.
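
One common way to avoid that render blocking, as a rough sketch (not from this post; the script path is hypothetical): inject non-critical scripts only after the page has rendered, so they cannot delay the first paint.

window.addEventListener('load', function () {
  var s = document.createElement('script');
  s.src = '/js/non-critical.js'; // hypothetical path
  s.async = true;
  document.head.appendChild(s);
});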

My first job as a programmer, in 1982, was to create data structures to fit 1,000 different tariffs and ticket designs for skiing areas into an 8 kB EPROM. Back then we asked ourselves: should this field be 2 or 3 bits wide?
10:26 am on Jan 23, 2018 (gmt 0)

Senior Member from IN 


joined:Apr 30, 2017
posts:707
votes: 86


Honestly, it is impossible to achieve a good score with AdSense ads.
11:02 am on Jan 23, 2018 (gmt 0)

Senior Member

robzilla

joined:Sept 25, 2005
posts:1666
votes: 239


Honestly, it is impossible to achieve a good score with AdSense ads.

They slow pages down considerably, so that seems only fair :-)
1:07 pm on Jan 23, 2018 (gmt 0)

Senior Member

jetteroheller

joined:Jan 22, 2005
posts: 3051
votes: 6


My sites without AdSense: 99 out of 100, 0.1 MB to render.
My sites with AdSense: 97 out of 100, 0.3 MB to render.
11:19 pm on Jan 23, 2018 (gmt 0)

Preferred Member


joined:Apr 15, 2004
posts:353
votes: 47


One of the biggest problems here is that Google does not understand speed!

You have IPs making slow requests on your site, and Google Analytics thinks your site is slow.

Yes, it is that bad: if I request one picture from a webpage today and another one tomorrow, Google Analytics says the webpage took a day to load.
8:02 pm on Jan 28, 2018 (gmt 0)

Senior Member from IN 


joined:Apr 30, 2017
posts:707
votes: 86


I'm now confused.

After running tests on Think With Google (4 s), Pingdom (1.29 s), and GTmetrix (1.79 s), I see the site is fast where the server is (NYC), and countries close to the US (Canada) get almost the same score. But the speed is not really good from the UK and India :/

I thought of trying a CDN, but when I tested both StackPath and KeyCDN on my other sites, the response time was better from almost all locations but the wait time for files hosted on the CDN was extremely high :-/
10:44 pm on Jan 28, 2018 (gmt 0)

Senior Member

robzilla

joined:Sept 25, 2005
posts:1666
votes: 239


Did you test multiple times? The edge nodes of the CDNs need to prime their caches. The first request for a file on each node will always be slow because the node has to forward the request to your server, fetch the file, and then serve it back to the client. Make sure your caching headers are set properly to allow responses to be cached (preferably for as long as possible, then purge when necessary).
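
As a rough illustration of that last point, assuming a plain Node.js origin server (not your actual setup): a long-lived Cache-Control header tells each CDN edge node it may keep serving its cached copy instead of coming back to the origin.

const http = require('http');
const fs = require('fs');

http.createServer(function (req, res) {
  if (req.url === '/sprite.png') { // example static asset
    res.writeHead(200, {
      'Content-Type': 'image/png',
      // cache for one year; purge on the CDN (or rename the file)
      // whenever the asset actually changes
      'Cache-Control': 'public, max-age=31536000'
    });
    fs.createReadStream('sprite.png').pipe(res);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);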

Webpagetest.org gives more accurate test results than the others you've mentioned, at least in my experience, although the quality of the test machines differs quite a bit, so you may have to try a few in your preferred region. But Real User Monitoring (RUM), like the speed reports in Google Analytics, should give you the most accurate data, and you can filter that by country.

One of the biggest problems here is that google does not understand speed!
[...]
You have IPs making slow requests on your site and Google Analytics thinks your site is slow.

Google is one of the front runners in the web performance field, so that's a silly blanket statement. Perhaps you don't understand Google Analytics. For one, Analytics does not "think" anything; it collects data and makes it presentable. It's up to you to decide, based on that data, whether or not your site is slow. You often get outliers in data sets like these. If you have a relatively large number of users with long load times, or a few with extremely long ones, that will obviously affect your average. So either you have too many slow requests (and your site is indeed slow) or the number of speed samples from your website is too low (not enough traffic) to reach any kind of statistical significance.
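
To illustrate the outlier point with made-up numbers (hypothetical load-time samples, in seconds):

var samples = [1.2, 1.4, 1.1, 1.3, 1.5, 1.2, 1.3, 58.0, 86.0];

var mean = samples.reduce(function (a, b) { return a + b; }, 0) / samples.length;
var sorted = samples.slice().sort(function (a, b) { return a - b; });
var median = sorted[Math.floor(sorted.length / 2)];

console.log(mean.toFixed(1));   // "17.0" - two stragglers wreck the average
console.log(median.toFixed(1)); // "1.3"  - the typical user is fine

Two abandoned or timed-out loads are enough to make an otherwise fast site look slow on average, which is why you need enough samples before drawing conclusions.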

10:52 pm on Jan 28, 2018 (gmt 0)

Senior Member from CA 


joined:Nov 25, 2003
posts:1107
votes: 264


A Google 'speed' timeline for those interested:
* 2009: Google releases 'PageSpeed' to compete with, and complement, Yahoo's YSlow.

* 2010: Google announces speed as a new ranking signal, albeit of lesser weight than relevance and initially affecting less than 1% of queries.
Note: desktop-focussed.

* 2014: Google announces that it now actually renders pages to replicate the real user experience, and recommends that CSS and JS be unblocked to aid ranking.
Note: for many sites this adds considerable page 'weight' to what Google receives.

* 2015: Google affixes a 'mobile friendly' label to pages (not sites) meeting specified criteria in mobile query returns. Intent/relevance are still stated to be stronger signals than being 'mobile friendly'.
Note: 'mobile friendly' was (initially at least) a very low bar.

* 2015: Google tests a white-on-red 'Slow' label, and a black-on-yellow triangle '!' followed by light grey 'Slow to load' text. No information was provided as to the threshold(s) involved.

* 2015: Google launches AMP (Accelerated Mobile Pages).

* 2016: Google announces 'Mobile-first Indexing' as an ongoing experiment.

* 2017: Google confirms 'Mobile-first Indexing' is being rolled out.

* 2018: Google announces PageSpeed Insights now includes data from the Chrome browser User Experience Report.

* 2018: Google announces forthcoming 'Speed Update' to start July 2018.

In the penultimate announcement/point above, Real-world data in PageSpeed Insights [developers.googleblog.com] by Mushan Yang and Xiangyu Luo, Google Developers, 09-January-2018, there are a few interesting tidbits to savour:


The PSI report now has several different elements:
* The Speed score categorizes a page as being Fast, Average, or Slow. This is determined by looking at the median value of two metrics: First Contentful Paint (FCP) and DOM Content Loaded (DCL). If both metrics are in the top one-third of their category, the page is considered fast.

* The Optimization score categorizes a page as being Good, Medium, or Low by estimating its performance headroom. The calculation assumes that a developer wants to keep the same appearance and functionality of the page.

* The Page Load Distributions section presents how this page's FCP and DCL events are distributed in the data set. These events are categorized as Fast (top third), Average (middle third), and Slow (bottom third) by comparing to all events in the Chrome User Experience Report.
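
A small sketch of the categorization logic described above (the threshold numbers are invented placeholders; Google derives the real one-third boundaries from the Chrome User Experience Report data set):

var FCP_FAST = 1.0, FCP_AVG = 2.5; // seconds, assumed tercile bounds
var DCL_FAST = 1.5, DCL_AVG = 3.5; // seconds, assumed tercile bounds

function speedCategory(medianFcp, medianDcl) {
  // Per the quoted description: Fast only if BOTH medians fall in the
  // fastest third of all pages in the data set.
  if (medianFcp <= FCP_FAST && medianDcl <= DCL_FAST) return 'Fast';
  // The Average/Slow split below the top third is an assumption here.
  if (medianFcp > FCP_AVG && medianDcl > DCL_AVG) return 'Slow';
  return 'Average';
}

console.log(speedCategory(0.9, 1.2)); // 'Fast'
console.log(speedCategory(0.9, 2.0)); // 'Average' - DCL outside the top third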


My takeaways:
* Google is waiting on real-world (Chrome UX Report) input from each site before categorising a page's speed.
--- this appears to be a page-level rather than a site-level ranking signal.

* My initial thought was that Google might dampen the 'Slow', aka bottom-third, sites; however, that seems at odds with "only affect a small percentage of queries". I also note that the intent of the search query is still a very strong signal, so a slow page may still rank highly if it has great, relevant content.

This leads me to wonder if, once again, we are seeing a Google private/public data differential, perhaps similar to TBPR, which was much less granular than the actual values. I suspect this is likely, and one reason is that it allows Google to warn the bottom 'Slow' third without giving any info on an actual threshold. They could, for instance, 'hit' one or two percent as an incentive and, if the response is not to their liking, hit 5% or 10% or whatever until they get their point across.

And they get to hold 'intent/relevance' as a trump card for slow sites they can't afford to exclude...

Hosting quality, ISP connection/service quality, and the geolocations of server and client may suddenly become matters of greater concern.