Forum Moderators: Robert Charlton & goodroi

John Mueller on tracking recovery from a Google Core Quality-Update


Robert Charlton

11:18 am on Aug 26, 2021 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I've been noticing a number of current posts in the forum asking about recovery from Google Core Update(s). Recently, Google's John Mueller has made some comments about the process which have caught my attention.

John, in an Aug 13 2021 Google Office Hours exchange with SEO Julien Chiarenza, answered a two-part question about Google's necessarily long time-frame for re-evaluation, and how such a protracted recovery might be monitored. Barry Schwartz on SER and Glenn Gabe (on Twitter) have both posted about the discussion.

Barry links to the section of the Office Hours video where John and Julien's exchange begins, quotes John extensively from the YouTube transcript, and includes some excellent paraphrases of John's most salient points.

I'm skirting around a lot of specifics for now because I don't want to exceed fair use....

Google On Being On The Right Track To A Google Core Update Quality Recovery
by Barry Schwartz - Aug 17, 2021
[seroundtable.com...]

What particularly caught my eye was John's suggestion of finding "proxy metrics that you can use for recognizing the quality of your site". The discussion provides some insight into what it takes to get re-evaluated... and also into the kinds of quality signals Google is looking for, which I assume are specific to each individual site.

The changes and data are slow to accumulate. As John puts it... "because all of the quality signals that we collect, they just take a long time to be built up." Barry's paraphrase of part two of John's answer is so good at explaining John's point about using (noisy) engagement metrics to monitor a site's progress that I'll quote Barry's paragraph on that issue in full....

John added it might make sense for you to make "proxy metrics" where you can pretend to know how Google measures quality. Some of those metrics might be engagement metrics like time on site, conversions, etc. Not that Google uses these metrics in search, John said "it's not so much that we would use that user behavior directly in search," but these are "leading indicator for you to let you know that you're on the right track."

Additionally, John notes, Google would have a hard time observing change if you improved only a subset of your pages; Google needs to look at the site as a whole... and John therefore suggests that you tackle your entire site, not just a section of it.


For now, I'll just post a link to Glenn Gabe's Twitter thread, which is largely self-explanatory.

[twitter.com...]

After removing low-quality content, how does a quality evaluation work? -> Via @johnmu
- It can take months (6+ months) for G to reevaluate a site after improving quality overall. It's partially due to reindexing & partially due to collecting quality signals...

As Glenn mentions, John's suggestion that you improve your "site overall, and over the long-term" is consistent with what he's been recommending all along.

The suggested methodology for observing change requires building up data on a site's behavior before you begin making changes... not only so the data can guide you, but also because you're really measuring rates of change across all sections of the site... both the good-quality pages and those iffy-but-improvable pages you decide to keep and work on... so you need the old data to gauge where you've come from.
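To make that concrete, here's a minimal sketch of the baseline-then-compare idea. Everything in it is hypothetical: the section names, the numbers, and the choice of "average time on page" as the proxy metric are all illustrative stand-ins, not anything Google or John has specified.

```python
# Hypothetical proxy-metric data, per site section.
# "avg time on page" (seconds) stands in for whatever proxy
# metric you choose (conversions, scroll depth, etc.).
baseline = {"guides": 95.0, "news": 40.0, "archive": 22.0}   # before changes
current = {"guides": 110.0, "news": 42.0, "archive": 31.0}   # after changes


def rate_of_change(before, after):
    """Percent change per section, so strong and weak sections
    can be compared on an equal footing."""
    return {
        section: round(100.0 * (after[section] - before[section]) / before[section], 1)
        for section in before
    }


changes = rate_of_change(baseline, current)
for section, pct in sorted(changes.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{section}: {pct:+.1f}%")
```

The point of the percent-change framing is the one made above: without the baseline numbers, you can't tell whether an improved section is actually moving, so the "old data" has to be collected before the cleanup starts.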

engine

1:27 pm on Aug 26, 2021 (gmt 0)


It may be worth linking directly to the YouTube video with the Q&A.

Robert Charlton

5:39 pm on Aug 26, 2021 (gmt 0)


engine, thanks for linking directly to the YouTube video.

If viewers want to jump to the particular Q&A section we're discussing, with John and Julien Chiarenza... the start of that particular section, by my reckoning, is about 06:12 in from the beginning of the video.

As Barry noted, and I concur, Julien's questions were very well phrased... and I think John was extremely thoughtful in the way he handled them off the cuff, without spilling the "secret sauce" (as Matt Cutts so often described the sensitive parts of the algorithm).

The phrase "proxy for engagement", btw, has been used by many commentators, including Bill Slawski, in discussing Google patents concerning the so-called "first long click". I feel that the time Google spends using multiple quality signals to verify that a site is a good one is its way of "triangulating" user data: while Google doesn't trust that data on its own, because by itself it's unreliable, in the context of other signals it can be helpful. It was striking to me to see John use the phrase here... and it does, in a way, validate the terms "engagement" and "user experience", even though the precise signals behind them aren't public... and apparently they're not simple either.


Perhaps this is also a good time to reference a thread I've sometimes cited when discussing time-to-rank in Google, also with John Mueller, from nine years ago. It's remarkably consistent with what John's saying now, and I think it takes on some added dimension in light of the current discussion.

Search engines need time & other signals to confirm a site is "fantastic"
June, 2012
https://www.webmasterworld.com/google/4467831.htm [webmasterworld.com]