Forum Moderators: Robert Charlton & goodroi


Google Updates and SERP Changes - October 2018

         

broccoli

11:36 am on Oct 1, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



The following message was cut out of thread at: https://www.webmasterworld.com/google/4918232.htm [webmasterworld.com] by robert_charlton - 4:08 am on Oct 1, 2018, (PDT -8)


I seem to have recovered most of my rankings from before my suspected mobile-first Fred penalty, apart from the very highest-volume ones, where an annoying thin-content site is still pushing me down.

The traffic to my site has doubled to about 4K. I'm still well off the 10K figure I was at before the March update pushed up a bunch of low-quality sites in my niche.

No corresponding increase in AdSense earnings, though. As I run a viral site, I see weird, unnatural AdSense drops after traffic increases all the time. CPC is still the same, but CTR has halved. I hope it settles down. If not, my entire niche may no longer be financially viable.


[edited by: Robert_Charlton at 12:11 pm (utc) on Oct 1, 2018]
[edit reason] Cleanup after thread split to new month [/edit]

Mark_A

10:27 am on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi Milchan,

Have you tried using a retargeting service like AdRoll? I used one for quite a while (I might restart it again soon) and found it drove a fair bit of traffic at quite a low cost.


Thanks, I'd never come across "retargeting services" before. I will take a look.

ichthyous

2:47 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Comparing traffic from the 26th-3rd vs. the previous one-week period, the 19th-26th, in Google Analytics, I'm showing a 5% decline in users/sessions. However, in Google Search Console it's a 46% decline in clicks, a 37% decline in impressions and a 16% decline in CTR week over week. SEMrush shows my site falling off a cliff starting on the 27th, from 24% down to 17.7% today across 500 key phrases. Of my top 20 competitors, most dropped somewhat, but not nearly as much as my site, and only one climbed strongly. My site was at the top of the list of my competitors, so it had the farthest to fall. Even with this fall, it's still in 2nd place out of 20, by a hair. The one site that climbed strongly (to 3rd place) is a deep-pocketed multi-billion-dollar company that runs a ton of ads on Google and elsewhere. I don't run ads of any kind on any platform, nor are any found on my site.

Interestingly, even searches for my name, which is the biggest single search term visitors find me on, are down 45% since the 26th. Can anyone shed some light on how it's possible that GA users/sessions might only be down 5% while Search Console shows a massive drop in clicks/impressions/CTR?

I really hope that this is temporary, and in my 15 years of doing the tango with Google I feel that strong sudden reversals like this usually end up reverting on their own. I am not rushing to make changes to my site other than trying to improve mobile load times and get new IBLs.
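A possible explanation for the GA-vs-GSC gap asked about above: GA "users/sessions" spans every traffic channel, while Search Console counts Google Search clicks only, so a steep organic drop gets diluted in the all-channel total. A minimal sketch of the arithmetic (all figures below are hypothetical placeholders, not real analytics data):

```python
def pct_change(current, previous):
    """Week-over-week change in percent, e.g. -46.0 for a 46% decline."""
    return (current - previous) / previous * 100.0

# Hypothetical traffic mix: organic search is only one channel of several.
last_week = {"organic": 5000, "direct": 3000, "referral": 2000}  # total 10,000
this_week = {"organic": 2700, "direct": 3600, "referral": 3200}  # total 9,500

organic_delta = pct_change(this_week["organic"], last_week["organic"])
total_delta = pct_change(sum(this_week.values()), sum(last_week.values()))

print(round(organic_delta, 1))  # -46.0  (the GSC-style, search-only view)
print(round(total_delta, 1))    # -5.0   (the milder all-channel GA view)
```

The takeaway is to segment GA down to google/organic before comparing it with Search Console numbers.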

thejimster

2:54 pm on Oct 4, 2018 (gmt 0)

10+ Year Member



@ichthyous in Google Analytics you're comparing Google/organic traffic to Search Console traffic right?

Cralamarre

2:57 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



@ichthyous,
Just wanted to say that my traffic, according to Google Analytics, has also been down 5% beginning on the 27th. Seems like many sites were hit with a 5-10% drop. But I certainly won't be wasting time making changes to my site. Not for Google's sake.

thejimster

3:56 pm on Oct 4, 2018 (gmt 0)

10+ Year Member



Regarding the latest update, I'm finding a lot of people saying that random/old content pages on sites are ranking, which I can confirm is happening with one of our sites that was impacted by the Sept 27th update. Here's my thought as to what's possibly happened:

<theory not fact>
Google has turned up the dial on how much the Quality Raters' input affects search results, and/or they have increased the volume of sites reviewed by quality raters, starting (mostly) with health/medical/YMYL websites. Depending on how your page was rated for a particular search query, the page could be demoted/devalued. Even if one or two sites per high-volume query were completely removed from the top few positions, that would explain those who are seeing +/- 10-30% traffic as things shake out, and also those who have lost virtually everything.

In regards to the random/old content pages that people are suddenly seeing higher in the SERPs, my guess is that those pages were not reviewed by the quality raters (because they were most likely not found for high-volume search queries) prior to the August 1st or Sept 27th updates. So if you had strong rankings in the past, you most likely have a strong domain, history, links, etc., and your site is still perceived as an authority in that niche. Those strong domain signals that are still intact are the algorithm's way of trying to show a page from a strong website that the quality raters have not devalued.
</theory not fact>

I'd like to hear people's thoughts for and against this nugget of a theory.

"So that with the next updates, when Google makes some changes, things will look better again. And this is something I've seen for some of the sites over the last couple of months, where one month it's 'oh, everything is terrible', and then a month later it's 'yes, we made it, we got things back on track again', and these things can change over time." - Mueller

^^Possibly this is saying to fix/improve things as suggested in the QRG, and that the next time your page/site is evaluated, the demotion you're seeing will be removed. (I know... that seems like it should be a manual action?)

Also, another guess as to why some queries are showing strong rankings for websites that have previously never been present in your niche's SERP is that they have not been evaluated yet for your niche.

I realize that this totally contradicts the idea of G moving towards more AI, which is why this is a nugget of a theory at this point.


Cralamarre

4:07 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



Wouldn't it be nearly impossible at this point, with so many websites out there, for humans to go through them all, checking each one for quality and then ranking each one in order? And how many personal opinions are factored into the rankings?

And what are the qualifications of the people ranking websites for quality? Are health care professionals and medical researchers being hired to rate health-related sites (because they have nothing better to do)? Or just random people with no qualifications? It just seems like it would be an overwhelming task at this point for actual humans to rate and rank websites. But that's just my thoughts. I don't know how it works.

ichthyous

4:14 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@thejimster - Good point, I'm glad you mentioned it... I went back and ran the comparison again with just Google/organic traffic, and it jumped from a 5% to a 25% decline in users. That's bad, and I haven't seen a big drop like that in years, but it still doesn't match the 46% decline in clicks that GSC is showing.

Gregorich SEO

4:21 pm on Oct 4, 2018 (gmt 0)

5+ Year Member



@thejimster

"...my guess is that those pages were not reviewed by the quality raters..."

Yes, maybe that's why we're seeing sites with inferior SEO and low E.A.T. ranking higher (even after an update meant to reward E.A.T.). Maybe they just haven't been evaluated yet, so they're benefiting from the ranking drops of better sites.

Noticing that many of the sites that moved up for big keywords are on the Shopify platform (good mobile optimization/page speed). We're on the Magento CMS with pretty slow page speed, which probably has something to do with our recent SERP dive.

thejimster

4:27 pm on Oct 4, 2018 (gmt 0)

10+ Year Member



@Cralamarre

I've seen the 10,000 number floated around for quality raters, so 10,000 people reviewing 20ish pages for a given query (and close variants) doesn't seem out of the realm of possibility, in my opinion, and could knock out a TON of sites. It could certainly be a large chunk of a niche, and I would assume it would be done in chunks from time to time.

It’s my understanding that they do not RANK the pages. They just rate different factors of those pages according to relevance, safety, EAT, etc. The algorithm ranks sites and quality raters devalue pages with low scores according to the above factors. If say 100 quality raters rate a page as low EAT, the algorithm would demote the page until EAT is improved above a certain threshold.

The guidelines give specific examples of how raters are to cross-check facts, look for reviews, etc off of the website being evaluated.

Again this is all just a guess/theory...

Cralamarre

4:37 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



@thejimster,
That makes sense. I would certainly welcome at least some human interaction when it comes to the SERPs. What would be nice, though (and maybe this exists and I'm just unaware of it) is a Google page where we can actually see the quality score of our site/pages as rated by the Quality Raters, along with reasons for the score.

Of course, sites that just copy and paste content from other sites, or make up nonsense, should expect their quality score to be low. But it would still be helpful for the rest of us who are honestly trying to offer something of quality.

NickMNS

4:49 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Google has turned up the dial on how much the Quality Raters' input affects search results,

Let's get something straight: Quality Raters have no input on search results. There is no room full of monkeys on typewriters rating websites.

The role of the quality raters is to evaluate the output of the search algorithm, to ensure that any changes made have had the intended impact. Even if a site is selected within a sample and rated as "poor", that rating will not have any impact on the site's actual ranking in the SERPs. Now, if changes are made to the algorithm, the raters determine that the impact is as intended when it really isn't, and Google then decides to push the change based on the flawed evaluation, then yes, the raters may have an impact on rankings. It could be argued from the descriptions reported here (which are certainly not sufficient in number, nor sufficiently free of bias, to be representative) that something like a flawed evaluation could explain the situation. But that certainly does mean that suddenly raters are evaluating websites and checking a radio box "include" or "do not include" in SERPs.

Cralamarre

5:19 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



@NickMNS,
So that still raises the possibility that quality raters, while not directly ranking websites, can affect algorithm changes, and affect them in a negative way, if their evaluations of the sites they sample are not accurate. But as I think you were pointing out, just because everyone whose site was hit by an update claims that their content is better than everyone else's, that's not proof of anything (other than a possible need for a second opinion).

Gregorich SEO

5:29 pm on Oct 4, 2018 (gmt 0)

5+ Year Member



@NickMNS

"Even if a site is selected within a sample and rated as "poor", that rating will not have any impact on the site's actual ranking in the SERPs."

Thank you for clearing this up. But just out of curiosity, what if the roadmap goes...

Update>Quality Rater's Evaluation of SERPs Post-Update>2nd Update Based on Quality Rater's Evaluation?

In that case, a poor QSR rating would have some impact on rankings, yes?

I've noticed competitors benefitted from Aug 1 update, then tanked after Sept 26ish update. Almost like QSRs evaluated the first update and didn't like what they saw?

thejimster

6:10 pm on Oct 4, 2018 (gmt 0)

10+ Year Member



SEL quote: "Google contracts with over 10,000 search quality raters worldwide to evaluate its search results."

So... there are ~10,000+ people evaluating search results.

@NickMNS can you link to where you've found the exact role for quality raters?

a flawed evaluation could explain the situation


Or it's not actually flawed, based on what the quality rater (or many raters) sees on a website, or what they found (or didn't find) on 3rd-party websites. I don't think one or two or three raters will hurt a website, but if a significant portion of raters rate a website as low EAT, maybe that website could be hurt (filtered) in future updates.

But that certainly does mean that suddenly raters are evaluating websites and checking a radio box "include" or "do not include" in SERPs.


I assume that was a typo and you meant to say "doesn't mean"..

Nobody's suggested that specifically, or if I gave that impression, it was not intentional. Look at page 21 of the SQRG, in the PQ Rating and Explanation column, where pages are rated on a sliding scale. I would guess that search queries possibly have different thresholds for these ratings, which pages must meet or exceed; if a page (or website) is significantly below that threshold, it would be held back or filtered for that query after an update.

I may have given the impression that a rater's feedback on a page affects the search results directly. That was a mistake. I think that a given set of search results could be rated, and that batch of data collection is used for future core algorithm updates, tweaks, etc. and result in pages or websites being suppressed or filtered.

Cralamarre

6:17 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



@thejimster,
Sounds like everyone is saying the same thing. Quality Raters rate search results, not websites directly, to give Google a way to make sure that algorithm changes are having the intended impact.
And like everything else related to Google and its algorithms, there is nothing that you or I or anyone can do about it other than focus on the quality of our sites and hope for the best.

thejimster

6:20 pm on Oct 4, 2018 (gmt 0)

10+ Year Member



Update>Quality Rater's Evaluation of SERPs Post-Update>2nd Update Based on Quality Rater's Evaluation?


^this
I've noticed competitors benefitted from Aug 1 update, then tanked after Sept 26ish update. Almost like QSRs evaluated the first update and didn't like what they saw?


That would be one of our websites. And I'll be the first to admit that while focusing on our UI and removing lots of graphics and things, we have done a poor job at conveying our actual Expertise, Authority, and Trustworthiness in the industry on our actual website. We've done all the standard on-page factors very well in my opinion, used Schema everywhere possible, and it worked for a while. Now, I think we need to do a better job at showing our expertise instead of just the above mentioned items.

I would have never expected to have ALL of our major landing pages filtered from the results because of this though. It seems... much too drastic in my opinion.

thejimster

6:55 pm on Oct 4, 2018 (gmt 0)

10+ Year Member



@Cralamarre

I'm currently looking at page 13 of the guidelines (Section 2.5.3) that reads: "Finding About Us, Contact Information, and Customer Service Information"..... :)

Sorry for hijacking this thread, but I've yet to see a decent argument against my post above. From my reading of the document, Quality Raters are used for a variety of tasks, which include rating search results, pages, websites, reputation, etc. QRs could be used in whichever way G wants. There are specific instructions in the guidelines that tell raters how to evaluate websites, and examples are given.

I understand that some assume the raters are only used to make sure algorithm updates produce the desired effect, but assuming that the information collected from raters isn't used for future updates seems... unbelievable. But that's my opinion.

Alright.. I'll go back to my usual lurking and see where it goes :)

Cralamarre

7:02 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 5+ Year Member Top Contributors Of The Month



@thejimster
The only thing I would really care about is that if Google is using people to rate the quality of my site, for whatever reason, I would like access to that rating and the reasons for it. Even if it's not affecting the SERPs directly, it's still useful information.

NickMNS

8:40 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Let's do some math:
number of raters = 10,000
number of webpages = 50,000,000,000
time required to review a webpage = 1 minute
working minutes per rater per year = 60 min/hr * 40 hr/week * 50 work weeks = 120,000
number of webpages reviewed per year = 10,000 * 120,000 = 1,200,000,000
That is roughly 50x short of being able to review every webpage even once a year.

The point is that it is mathematically impossible for the raters to evaluate webpages at the scale of the web. Forget about "quality raters"; assume they don't exist, because they have no impact on your website. This does not mean that one should ignore the QR guidelines. It is a good document that provides a solid explanation of, and insight into, how Google evaluates sites. But here again, following the guidelines is necessary but not sufficient.
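The back-of-envelope estimate above can be checked in a few lines, under the same assumptions (10,000 raters, one minute per page, a 40-hour week, 50 working weeks per year):

```python
RATERS = 10_000
WEBPAGES = 50_000_000_000
MINUTES_PER_PAGE = 1
MINUTES_PER_RATER_PER_YEAR = 60 * 40 * 50  # 60 min/hr * 40 hr/wk * 50 wk = 120,000

# Total pages the whole rater pool can review in one year.
pages_reviewed_per_year = RATERS * MINUTES_PER_RATER_PER_YEAR // MINUTES_PER_PAGE
shortfall = WEBPAGES / pages_reviewed_per_year

print(pages_reviewed_per_year)  # 1200000000
print(round(shortfall, 1))      # 41.7 -- the "roughly 50x short" figure cited above
```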

The workflow is most likely the following (this is not based on any specific knowledge of the process, but on typical industry practices):
Choose a feature to target with a change in the algo -> make the change -> test the change on a small test set of websites (offline) -> review results -> if accepted, test on a larger test set (offline) -> send for evaluation by raters -> review the rating report and decide whether to proceed -> test the change on a small set of live results -> rate again -> go/no-go decision -> push the change to production (live).

Bottom line is, I doubt that Google would push any change without thorough testing. That is not to say that they never make mistakes or roll things back; there have been enough examples flagged over the years, but they are rare, and I would not count on any algo change being rolled back even if the perceived result seems worse than the previous state.

I assume that was a typo and you meant to say "doesn't mean"..

Yes, typo, sorry. I am rushing in and out today and I have to run out again right now...

Gregorich SEO

9:04 pm on Oct 4, 2018 (gmt 0)

5+ Year Member



@NickMNS

But they theoretically only need to rate sites in the top 10. They're also particularly focused on niches whose content can affect the health, money, etc. of readers, which narrows down the number of sites QSRs need to rate.

[edited by: Gregorich_SEO at 9:29 pm (utc) on Oct 4, 2018]

NickMNS

9:29 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@Gregorich SEO
Sure, top ten = 10%, and you're 50x off, so by taking only 10% you are now only 5x off. That doesn't account for the fact that new pages are added every day and existing pages are continuously updated. My example was based on "review once a year"; at 5x, that is once every 5 years. The bottom line is, it is not feasible.

No human review is what made Google. Yahoo, Lycos, DMOZ and others would still be in the game if human review were feasible; it never was, and I doubt it ever will be.

@thejimster
@NickMNS can you link to where you've found the exact role for quality raters?

None specifically that I can think of now. @Martinibuster is pretty good with stuff relating to how Google works and the Google patents maybe he can chime in.

There is this thread here going on that provides insight (high level) into the inner workings:
[webmasterworld.com...]

The video posted by Robzilla is pretty interesting.

NickMNS

9:32 pm on Oct 4, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



@thejimster
Actually, the CNBC story that is the basis of the linked thread has some explanation of how the quality raters are used.

Gregorich SEO

9:39 pm on Oct 4, 2018 (gmt 0)

5+ Year Member



@NickMNS

@thejimster

From the CNBC article:

Google contracts about 10,000 of these raters around the world, and while they cannot directly affect search results, their opinions help Google’s search team evaluate whether a given tweak should go through or not. Raters typically see old and new results side by side, and determine which are better.

“Better” is not a purely subjective term. It’s defined by a published document of search quality rater guidelines, which describe how raters should judge a page that shows up in their results. Particular attention is paid to a page’s expertise, authoritativeness and trustworthiness.

“You can view the rater guidelines as where we want the search algorithm to go,” Ben Gomes, Google’s vice president of search, assistant and news, told CNBC. “They don’t tell you how the algorithm is ranking results, but they fundamentally show what the algorithm should do. ”

Milchan

11:54 pm on Oct 4, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



I'm still having trouble getting a handle on traffic patterns and results. It seems like I've had an upturn in traffic of about 10 to 20% since Sept 27th and an uptick in conversions; nothing major, but welcome all the same. Today, traffic seems along those lines but there's not a single conversion, which is just strange. It could be an anomaly, but zero is very strange indeed.
I've learnt not to worry too much, though, and to just see what happens tomorrow (or later tonight).

massimodefilippo

2:05 am on Oct 5, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



Well, there is another explanation.
Quality raters cannot check every website in the world, but they can check the top 100 medical websites in every country.
That's why every good medical website dropped.
They didn't push down websites that aren't 100% medical.
So, if you have many pages about food, personal training, legal aspects of medicine, green topics or other subjects, you grow.

This update affected whole domains, not only single pages.
So newspapers, magazines, pages on the legal aspects of medicine, pharma companies etc. rank better.
Even hospitals, because they don't talk only about medicine.
They have many other subjects on their websites, the structure, etc.

Little websites were not impacted, but they grow because the best websites went down.

Steven29

2:21 am on Oct 5, 2018 (gmt 0)



number of raters = 10,000
number of webpages = 50,000,000,000
time required to review a webpage = 1 minute
working minutes per rater per year = 60 min/hr * 40 hr/week * 50 work weeks = 120,000
number of webpages reviewed per year = 1,200,000,000


What about something like this (lol, I don't think these numbers are an exact science):
50,000,000,000 - 45,000,000,000 websites that receive less than 1,000 visitors monthly from search

5,000,000,000 - 4,000,000,000 with a domain age of 5 years or less

1,000,000,000 - 500,000,000 websites where the domain owner isn't logged in via a mobile device with a working number

500,000,000 - 450,000,000 owners who have never given Google a credit card for a service

50,000,000 - 25,000,000 records not found with ID validation against some database

...and I'm sure there are many others.
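Taken at face value (and the post admits these numbers are not an exact science), the funnel above is just successive subtraction; a quick sketch with the same hypothetical figures:

```python
# Hypothetical filtering funnel from the post above -- made-up numbers, not data.
start = 50_000_000_000
exclusions = [
    45_000_000_000,  # pages with < 1,000 monthly search visitors
    4_000_000_000,   # domain age of 5 years or less
    500_000_000,     # owner not verified via a mobile device
    450_000_000,     # no credit card ever given for a service
    25_000_000,      # failed ID validation against some database
]

remaining = start
for step in exclusions:
    remaining -= step

print(remaining)  # 25000000 -- a pool small enough for 10,000 raters to cover
```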

sk7411

8:45 am on Oct 5, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



And... the last Sept 27 update rolled back, or some tweaks happened.

The trend seems to be one week / a month / holidays... etc.

I will probably feed this data into ML and create a future prediction, lol.

Shepherd

10:02 am on Oct 5, 2018 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



And... the last Sept 27 update rolled back

We are seeing results back to the pre-September 27 data set this morning also. Odd.

BushyTop

10:14 am on Oct 5, 2018 (gmt 0)

5+ Year Member Top Contributors Of The Month



Looks like the SERP sensors are up again...

Anyone seeing any movement?