I've been noticing a number of recent posts in the forum asking about recovery from Google Core Updates. Recently, Google's John Mueller has made some comments about the process which have caught my attention.
John, in an Aug 13, 2021 Google Office Hours exchange with SEO Julien Chiarenza, answered a two-part question about Google's necessarily long time-frame for re-evaluation, and about how such a protracted recovery might be monitored. Barry Schwartz (on SER) and Glenn Gabe (on Twitter) have both posted about the discussion.
Barry links to the section of the Office Hours video where the discussion between John and Julien begins, quotes John extensively from the YouTube transcript, and includes some excellent paraphrases of John's most salient points.
I'm skirting around a lot of specifics for now because I don't want to exceed fair use....
Google On Being On The Right Track To A Google Core Update Quality Recovery by Barry Schwartz - Aug 17, 2021 [seroundtable.com...]
What particularly caught my eye was John's suggestion of finding "proxy metrics that you can use for recognizing the quality of your site". The discussion provides some insight into what it takes to get re-evaluated... and also into the kinds of quality signals Google is looking for, which I assume are specific to an individual site.
The changes and data are slow to accumulate. As John puts it... "because all of the quality signals that we collect, they just take a long time to be built up." Barry's paraphrase of part two of John's answer is so good at explaining John's point about using (noisy) engagement metrics to monitor a site's progress that I will quote Barry's paragraph about that issue in full....
John added it might make sense for you to make "proxy metrics" where you can pretend to know how Google measures quality. Some of those metrics might be engagement metrics like time on site, conversions, etc. Not that Google uses these metrics in search, John said "it's not so much that we would use that user behavior directly in search," but these are "leading indicator for you to let you know that you're on the right track."
Additionally, Google would have a hard time observing change, John notes, if you improve only a subset of your pages, so Google needs to look at the site as a whole... John therefore suggests that you tackle your entire site, not just a section of it.
For now, I'll just post a link to Glenn Gabe's Twitter thread, which is largely self-explanatory.
[twitter.com...]
After removing low-quality content, how does a quality evaluation work? -> Via @johnmu
- It can take months (6+ months) for G to reevaluate a site after improving quality overall. It's partially due to reindexing & partially due to collecting quality signals...
As Glenn mentions, John's suggestion that you improve your "site overall, and over the long-term", is consistent with what he's been recommending.
The suggested methodology for observing change requires building up some data on a site's behavior before you begin making changes... not only so the data can guide you, but also because you're really measuring rates of change across all sections of the site... both the good-quality pages, and those iffy-but-improvable pages you decide to keep and work on... so you need the old data to gauge where you've come from.
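To make that baseline-then-compare idea concrete, here's a minimal sketch of my own (not anything John or Barry prescribed) showing how you might track the rate of change of a couple of proxy metrics per site section. The section names, metric names, and numbers are all hypothetical placeholders:

```python
# Hypothetical sketch: compare proxy engagement metrics per site section
# before and after quality improvements. All names and numbers are made up.

baseline = {  # collected BEFORE any changes were made
    "blog":    {"avg_time_on_page": 45.0,  "conversion_rate": 0.010},
    "guides":  {"avg_time_on_page": 120.0, "conversion_rate": 0.025},
    "archive": {"avg_time_on_page": 20.0,  "conversion_rate": 0.002},
}

current = {  # collected months AFTER the cleanup
    "blog":    {"avg_time_on_page": 55.0,  "conversion_rate": 0.012},
    "guides":  {"avg_time_on_page": 130.0, "conversion_rate": 0.027},
    "archive": {"avg_time_on_page": 28.0,  "conversion_rate": 0.004},
}

def rate_of_change(before: dict, after: dict) -> dict:
    """Percent change for each proxy metric, per site section."""
    changes = {}
    for section, metrics in before.items():
        changes[section] = {
            name: round(100.0 * (after[section][name] - value) / value, 1)
            for name, value in metrics.items()
        }
    return changes

for section, deltas in rate_of_change(baseline, current).items():
    print(section, deltas)
```

The point of structuring it this way is that every section (kept, improved, or left alone) gets the same before/after comparison, which is exactly why you need the old data in hand before you touch anything.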