Forum Moderators: Robert Charlton & goodroi


Google "Core Web Vitals" replace 'Speed Report' in GSC

         

Robert Charlton

10:05 am on May 28, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



On May 5, 2020, Google announced, across several of its user channels, a set of user experience metrics it called "Web Vitals"...

- - In 'Chromium Blog'...
Introducing Web Vitals: essential metrics for a healthy site
Tuesday, May 5, 2020
[blog.chromium.org...]

My emphasis added...
Today we are introducing a new program, Web Vitals, an initiative by Google to provide unified guidance for quality signals that, we believe, are essential to delivering a great user experience on the web.

- - In 'Google Developers'...
Web Vitals
Essential metrics for a healthy site
https://web.dev/vitals/ [web.dev]


Both of the above articles focused on what Google called "Core Web Vitals", and emphasized their importance and universality. Again, I've bolded some spots in my quotes...

Core Web Vitals

Core Web Vitals are the subset of Web Vitals that apply to all web pages, should be measured by all site owners, and will be surfaced across all Google tools. Each of the Core Web Vitals represents a distinct facet of the user experience, is measurable in the field, and reflects the real-world experience of a critical user-centric outcome.

The metrics that make up Core Web Vitals will evolve over time. The current set for 2020 focuses on three aspects of the user experience — loading, interactivity, and visual stability — and includes the following metrics (and their respective thresholds):


Today, I've seen numerous announcements that these "Core Web Vitals" have gone live or appear to be rolling out in the GSC. Here's Google's summary of the three Core Web Vitals...


Largest Contentful Paint (LCP): measures loading performance. To provide a good user experience, LCP should occur within 2.5 seconds of when the page first starts loading.

First Input Delay (FID): measures interactivity. To provide a good user experience, pages should have a FID of less than 100 milliseconds.

Cumulative Layout Shift (CLS): measures visual stability. To provide a good user experience, pages should maintain a CLS of less than 0.1.
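Those three thresholds can be expressed as a simple classifier. A minimal sketch, using Google's published 2020 boundaries (poor is LCP above 4 s, FID above 300 ms, CLS above 0.25; the function name is my own, not anything in a Google tool):

```python
# Google's published Core Web Vitals thresholds (2020).
# metric: (good_max, poor_min) -- values between the two are "needs improvement"
THRESHOLDS = {
    "LCP": (2.5, 4.0),    # seconds
    "FID": (100, 300),    # milliseconds
    "CLS": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric: str, value: float) -> str:
    """Bucket a field measurement the way PSI / GSC report it."""
    good_max, poor_min = THRESHOLDS[metric]
    if value <= good_max:
        return "good"
    if value <= poor_min:
        return "needs improvement"
    return "poor"

print(rate("LCP", 2.4))   # good
print(rate("FID", 150))   # needs improvement
print(rate("CLS", 0.3))   # poor
```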


Many more details, graphics, and charts on the Google Developers page, as well as on the two major blogs that have been reporting these...

[seroundtable.com...]
[searchenginejournal.com...]

Additionally, WebmasterWorld Junior Member Steven29 posted the following, apparently coming upon it as it was rolling out. I will add his post following this one, with several details tweaked for accuracy or completeness.

I'm already seeing some members mentioning sites dropping in the serps, and these perhaps could be explained by the above.

As a personal note, I should add that the most annoying thing I've found about sites loading is the jumping around as you try to read them. I've complained about this for a great many years. I'm hoping this Google initiative causes site owners and advertisers to get it together and fix the problem, as it's perhaps the biggest PITA I encounter in normal web surfing... either that or disappear from the serps.

Steven29

12:52 am on May 28, 2020 (gmt 0)



< moved from another location >

Hi,

There appears to be an update going on with Google Page Speed Report: [developers.google.com...]

I believe this is related to the new "Core Web Vitals" report found in the Google Search Console: [search.google.com...]

It almost appears as if the Page Speed Report is being calculated using the new metrics found in the Core Web Vitals, but does not show them publicly.

CLS (Cumulative Layout Shift)
[web.dev...]

LCP (Largest Contentful Paint)
[web.dev...]

First Input Delay (FID) (Mods note - added this)
[web.dev...]
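The field metrics behind those three reports can also be pulled programmatically from the PageSpeed Insights v5 API, which returns Chrome UX Report field data under `loadingExperience.metrics`. A hedged sketch, no API call made here; the key names below match the v5 response format as I understand it, and the sample values are illustrative, so verify against a live response:

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url: str, strategy: str = "mobile") -> str:
    """Build the PageSpeed Insights v5 request URL for a page."""
    return PSI_ENDPOINT + "?" + urlencode({"url": page_url, "strategy": strategy})

def field_metrics(psi_response: dict) -> dict:
    """Extract the field-data percentiles from a PSI response body."""
    metrics = psi_response.get("loadingExperience", {}).get("metrics", {})
    return {name: m.get("percentile") for name, m in metrics.items()}

# Shape of the relevant slice of a response (values are made up):
sample = {
    "loadingExperience": {
        "metrics": {
            "LARGEST_CONTENTFUL_PAINT_MS": {"percentile": 2450, "category": "FAST"},
            "FIRST_INPUT_DELAY_MS": {"percentile": 38, "category": "FAST"},
            "CUMULATIVE_LAYOUT_SHIFT_SCORE": {"percentile": 12, "category": "AVERAGE"},
        }
    }
}

print(field_metrics(sample))
```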

Is anybody seeing the shift / changes?


[edited by: Robert_Charlton at 10:21 am (utc) on May 28, 2020]

Robert Charlton

11:05 am on May 28, 2020 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Steven29, are you seeing any of these "Core Web Vitals" quality changes having any obvious correlations with the May update which is also rolling out and causing great confusion? I haven't seen anything to indicate they're connected, but your post suggests you may have been looking at this. Please advise.

Steven29

3:44 pm on May 28, 2020 (gmt 0)



Hi Robert,

Yes and no; I do see lots of other factors in the rankings.

I have been seeing a trend the past few months and posted something in regards to it last month here: [webmasterworld.com...]

Here are my new updated scores with this new "Core Web Vitals" scoring:

Mobile: [i.ibb.co...]

Desktop: [i.ibb.co...]

This new report seems to fluctuate more than the previous one.

I still see websites like this that will outrank me for content I am posting first: [i.ibb.co...]

The main problem Google has is spam, and they seem to allow certain people to do whatever they want. The strange part is that these same websites have a free pass on Facebook too! My main competitor copies me immediately and posts asking followers to "Like" all of their posts 2 - 3 times a day to win bogus prizes they never award. ("Winner announced! Click on this page to see if you won - you have only 12 hours to claim your prize!") Sometimes one of their pages finally gets suspended, but usually only after 6 months, and then it's replaced within 24 hours by an exact copy with over 200,000 likes, and the posts linking to the same website continue.

"Content is King"?!?!

Musicarl

5:03 pm on May 28, 2020 (gmt 0)

10+ Year Member



Have yet to find a site that passes their "core web vitals assessment." If it's a ranking factor, it's not apparent yet - for one search term we have a score of 78 and rank fourth. Here are the scores for the three pages above us:

#1 - 10
#2 - 20
#3 - 17

aristotle

10:24 pm on May 28, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Just checked the core vitals reports for my sites in GSC. The core vitals results for a site have three categories:
-- poor URLs
-- URLs need improvement
-- good URLs

All the pages on all of my sites are "good URLs". This is in spite of the fact that my sites are still http.

So from this, it looks like a site still being http ("not secure") isn't one of the factors included in the core vitals analysis.

lucy24

2:18 am on May 29, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



You’re all still ahead of me. All I’m getting is “Not enough data for this device type” coupled with a recommendation to try Page Speed Insights ... which in turn tells me
Lighthouse returned error: ERRORED_DOCUMENT_REQUEST. Lighthouse was unable to reliably load the page you requested. Make sure you are testing the correct URL and that the server is properly responding to all requests. (Status code: 403)

Well, ###. I actually remember putting Chrome-Lighthouse on my block list not all that long ago, because their behavior seemed fishy. How the heck was I supposed to know it's a quasi-legitimate G function and not just another unidentifiable entity operating from a Googloid range (66.249.80-95, the one that’s next door to the crawl range)?
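For anyone else with Chrome-Lighthouse on a block list: PSI's Lighthouse fetches identify themselves with a user-agent string containing the token "Chrome-Lighthouse", so an allow-list check can be a simple substring match. A sketch (the surrounding firewall logic is up to you; the sample UA below is illustrative, not an exact Google string):

```python
def is_lighthouse(user_agent: str) -> bool:
    """True if the request claims to be a Lighthouse / PSI fetch.

    Note: the UA string is trivially spoofable, so if it matters,
    pair this with a check on the requesting IP range.
    """
    return "Chrome-Lighthouse" in user_agent

# Illustrative UA of the kind a Lighthouse mobile run might send:
ua = ("Mozilla/5.0 (Linux; Android) AppleWebKit/537.36 "
      "(KHTML, like Gecko) Chrome/84.0 Mobile Safari/537.36 Chrome-Lighthouse")

print(is_lighthouse(ua))                        # True
print(is_lighthouse("Mozilla/5.0 Firefox/77"))  # False
```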

But I digress.

:: idly wondering how they arrive at a PSI score of 100 [after removing the UA block] even while giving a long list of possible issues ::

jediviper

8:25 am on May 29, 2020 (gmt 0)

5+ Year Member Top Contributors Of The Month



These scores are not yet part of the ranking signals, so there's no reason to worry about who got what.
Of course it's always good to improve the page experience, but fighting to get these scores higher won't improve your search results.

aristotle

10:52 am on May 29, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Well, they could point to issues that ARE already affecting rankings, since bad user experiences could reduce time on site, eliminate repeat visits, reduce the chances of attracting backlinks, etc., thereby indirectly creating other negative algorithmic signals.

Lagonda

3:32 pm on May 29, 2020 (gmt 0)

10+ Year Member



Now Google Search team, send the memo to the AdSense team.

riccarbi

8:17 pm on May 29, 2020 (gmt 0)



Well, they could point to issues that ARE already affecting rankings, since bad user experiences could reduce time on site, eliminate repeat visits, reduce the chances of attracting backlinks, etc., thereby indirectly creating other negative algorithmic signals.


Is there any proof that time-on-site and/or repeat visits have ever been part of Google's ranking algorithm, directly or indirectly?
How on earth can Google be aware of your website's time-on-site if you don't explicitly tell them? Through Chrome? GA? Human review? Some exotic AI technique that scans millions of websites a day? This is techno-blah-blah. No one knows what signals Google's algo is really using, nor how it might collect the necessary data.
Furthermore, I'm pretty sure there are no such things as "bad user experience" and "good user experience", and no one at Google really gives a damn about them either. Since it depends on what users you are considering, your website's niche, your business targets, and so on, it is something a stupid algorithm can't generalize about. Too many variables for a search engine that, these days, has become unable to provide me the address of a really good pizza restaurant nearby. We are all chasing pavements trying to understand the supposed "intelligence" of Google's ranking algo.

lucy24

10:45 pm on May 29, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm pretty sure there are no such things as "bad user experience" and "good user experience".
Say what now?

It is not many hours since I sat in front of the screen twiddling my thumbs waiting for five grossly oversized image files (resized onscreen to 200px or so, but must have been multi-megabytes* in filesize) to load up because I needed to use the site. Had I not required the service, I would have closed the window in disgust.


* Moved by ’satiable curtiosity I fished the page out of my browser history and checked. Four of the five are SIX MEGABYTES and up, the largest weighing in at 9.3 MB. For, again, a responsive display of 200px or so. I consider this a Bad User Experience.

SweetPotato

11:32 am on May 30, 2020 (gmt 0)

5+ Year Member Top Contributors Of The Month



So excuse me, but this is incredibly hypocritical of Google.

How are we supposed to get a good score if Google AdSense and Google Analytics add whole seconds to the load time?

If I wanted a super fast site I'd have to remove Google. That's all.

All other optimizations are a waste of time that have no impact on score.

SweetPotato

1:52 pm on May 30, 2020 (gmt 0)

5+ Year Member Top Contributors Of The Month



7,100 ms Reduce JavaScript execution time [imgur.com...]
2,650 ms Third-party code blocked the main thread: [imgur.com...]
1,710 ms Remove unused JavaScript. [imgur.com...]
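Those three figures are what produce the total cited just below. Worth noting as a caveat: Lighthouse audit categories can overlap, so a straight sum is an upper-bound estimate of the time attributable to these scripts:

```python
# Main-thread time attributed to Google scripts, in milliseconds,
# taken from the three Lighthouse audits listed above
js_execution = 7_100   # "Reduce JavaScript execution time"
third_party  = 2_650   # "Third-party code blocked the main thread"
unused_js    = 1_710   # "Remove unused JavaScript"

total_ms = js_execution + third_party + unused_js
print(total_ms)  # 11460 ms, i.e. the roughly 11.4 seconds cited below
```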

11.4 seconds added by Google - the same guys complaining about the site being slow. Should we just ditch Google? Ditch ads? Close?
They want an internet run by a few giant white-listed sites that, no matter how horrible they are, penalties never seem to affect. They do everything users hate, their sites are 10 MB, yet it is always the small site getting hit.
<snip>

[edited by: Robert_Charlton at 12:32 am (utc) on May 31, 2020]
[edit reason] removed offtopic editorializing [/edit]

christianz

7:05 am on May 31, 2020 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



None of this matters for AdSense sites, because PageSpeed for them is already in the toilet, and there is no way to improve any of these metrics unless you completely remove AdSense.

dethfire

3:17 am on Jun 2, 2020 (gmt 0)

10+ Year Member Top Contributors Of The Month



Want to pass vitals? Remove AdSense. Google is shooting themselves in the foot.

JesterMagic

11:29 am on Jun 2, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google will give sites that display AdSense a pass (at least, they won't count the additional load time directly related to displaying AdSense), just like they give themselves a pass for all the ads their own web properties show, which would tank other sites in the rankings.

immrrobot

1:42 pm on Jun 3, 2020 (gmt 0)

5+ Year Member Top Contributors Of The Month



LoL, this is so flawed. All the lag in page speed on my site comes from Google AdSense. I don't use any extra plugins that would inject their own JS or CSS; being a developer who understands SEO, my site is all custom built. And all I can see is AdSense messing around with the speed. That's hilarious.

Swanny007

11:57 pm on Jun 7, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



All of my problems stem from using Google Analytics and AdSense. F*ck, it's frustrating. What should I do?

lucy24

2:04 am on Jun 8, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I can't say anything about AdSense, but if you don't like GA, why use it? There are alternatives.

Swanny007

3:32 pm on Jun 8, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



I'm going to experiment and see what has the most impact. AdSense is definitely worse than GA. Could I perhaps remove auto ads, leave just one or two ads on the page, and perform better? Hmmm..

EditorialGuy

7:28 pm on Jun 8, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



This is obviously a work in progress, and it won't affect search until next year. IMO, there's no need to panic (or chase a moving target).

tangor

2:50 am on Jun 9, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



As with most things on the bleeding edge of tech ... there will be some bleeding involved. Play or stay ... and six months or a year from now take another look.

Steven29

3:00 pm on Jun 10, 2020 (gmt 0)



AdSense can be a little tricky, but with a lot of work you should be able to optimize it to load properly. I am loading 4 ad units at once without any trouble: [ibb.co...]

ronin

12:13 pm on Jun 20, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



How are we supposed to get a good score if Google AdSense and Google analytics add whole seconds to the load time?

If I wanted a super fast site I'd have to remove Google. That's all.


None of this matters for AdSense sites, because PageSpeed for them is already in the toilet [...]


Want to pass vitals? Remove AdSense.


Agreed. This is why I removed Adsense and then Analytics from all of my sites five years ago.

The one thing Google PageSpeed complained about most consistently was Adsense.

In the end I decided that speed - a very significant part of UX, especially on mobile - had to be more important than supplementary revenue.

samwest

1:45 pm on Jun 20, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Amazon gets a 60, resulting in: "Field Data - Over the last 30 days, field data shows that this page does not pass the Core Web Vitals assessment."... I guess they set the bar so high that nobody will pass unless their site is simply "Hello World"... or consists of one search field and one image that says "Google".

My site gets a 10, while GTMetrix gives it an A (91) grade.

JorgeV

2:08 pm on Jun 20, 2020 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



Hello,

From my experience: avoid popular CMSes and frameworks. They are easy, fancy, and good-looking, but they are awful from a technical point of view; by wanting to do too many things, they became too big.

Those who write their own code - CMS, HTML, CSS, JS, etc. - have much lighter, more responsive sites.

samwest

6:39 pm on Jun 20, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Those who write their own code - CMS, HTML, CSS, JS, etc. - have much lighter, more responsive sites.

Some of us, who can't afford teams of developers, would rather spend our time writing content than building out functionality...and we should not be penalized for doing so even if it costs impatient users a few milliseconds.

A CMS / CF powered site that grades A/B on GTMetrix navigates perfectly fine and fast.

Raising the bar impractically high based on technicalities is just Gorg's way of weeding out webmasters and sites, I guess. However, it seems Google's own tools are not refined either... so...

Surprisingly, my site and google.com both grade A (91%) on GTMetrix... but Page Speed Insights won't even measure google.com... and I get a 10 (of 100), lol - what's up with that?

lucy24

10:20 pm on Jun 20, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



even if it costs impatient users a few milliseconds.
A badly coded page can take MUCH more than a few extra milliseconds. Shouldn’t search engines take things into account that genuinely affect the user experience? If the search engine sends me to a page, and I open the tab to find a blank screen with maybe a hint of a status bar somewhere, I am not likely to stick around. And it does not improve my opinion of the search engine.

samwest

2:43 am on Jun 21, 2020 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Shouldn’t search engines take things into account that genuinely affect the user experience?

They should, if those things indeed affect user experience... but these don't.

I pass multiple speed analytics, and like I said, my site and Google both rank the exact same 91%.
Matomo indicates a very fast response time, but apparently not fast enough for PSI. I have yet to see a site that is passing... even Amazon. Maybe an HTML brochure site.
How many speed analytics do you have to pass to please Google?

Page Speed Insights is bogus garbage and will become either another dead-end project or another excuse to de-rank decent sites. I've seen this show before.