Forum Moderators: Robert Charlton & goodroi


How can EAT be a Google metric?


anallawalla

9:30 am on Feb 17, 2024 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



I am painfully aware that EAT (and EEAT) are not Google ranking factors, not used in Google systems etc. This is something else.

Back in November 2022, a Googler was quoted in SERT (with a link to a video in which he made the remarks):

"E-A-T is a core part of our metrics," he added, explaining that it is to "ensure the content that people consume is going to be, is not going to be harmful and it is going to be useful to the user." Google lives by the principles of E-A-T every single day, he said. "We do it to every single query and every single result," he added. Kim said, "so it is actually pretty pervasive throughout everything we do."


What is your interpretation of how EAT can be a metric?

Mine is that it must have a numeric value known only to Google, and that an automated process compares every SERP entry against its computed EAT value as a QA process.

not2easy

12:36 pm on Feb 17, 2024 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



My take is similar, if vague. I see it as a value a site earns over time through the evolution of its content: a sliding scale that considers the query and the user. It would be a scoring system, and I would not think it is a static value. And I'm probably wrong.

Brett_Tabke

4:46 pm on Feb 17, 2024 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



At the high point, it was reported that Google had nearly 2,000 employees and contractors working as "quality raters". It was posted on Reddit around 2010 that a rater needed to average one SERP-to-webpage interaction every minute.

60 pages/hr * 8 hrs/day * 5 days/week * 52 weeks/year * 20 years * 2,000 people = a maximum of 4,992,000,000 webpages reviewed in 20 years.

If 10,000 websites make up the vast majority of the SERPs Google produces in any given language market, that means raters can have eyes on every major SERP within a couple of days.
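That back-of-envelope arithmetic can be checked in a few lines (the one-page-per-minute rate, 8-hour days, and 2,000-rater headcount are this post's assumptions, not confirmed figures):

```python
# Rater-throughput estimate, using the assumptions stated above.
pages_per_rater_per_day = 60 * 8                       # 1 page/min over an 8-hour day
pages_per_rater_per_year = pages_per_rater_per_day * 5 * 52   # 5-day weeks, 52 weeks
total_pages = pages_per_rater_per_year * 20 * 2000     # 20 years, 2,000 raters

print(f"{total_pages:,}")  # 4,992,000,000
```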

Not only are they judging the value of the SERP response to the queries, they are also making judgements about the webpages linked to. What are they doing with that trove of data?

anallawalla

9:52 pm on Feb 17, 2024 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



My thinking is along these lines:
  • An AI process looks for evidence of Expertise, Authoritativeness, and Trustworthiness. Perhaps they have added Experience by now.
  • This assessment is conducted continuously during crawls of the site and of other sites that cite or link to it.
  • Each of these factors is given a numeric score that can change with each crawl.
  • Each element of a SERP (snippet, carousel image, URL result) would have such a score, which should be high relative to the results that follow below it and on subsequent "pages" of the continuous scroll.
  • SERP results are obviously selected by separate ranking systems, so I contend that E/EAT serves as a quality check after the fact.
  • As this is done after the fact, it does not slow down the delivery of the SERPs. If an adverse trend is noted, it is analysed in detail and a ranking system is amended, or the E/EAT factors are tweaked.

That is how I interpret the Googler's remarks that E/EAT is applied to "every single query".
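A minimal sketch of that after-the-fact QA idea, purely speculative: the field names, scores, and threshold here are invented for illustration and do not reflect any actual Google system.

```python
from dataclasses import dataclass

@dataclass
class SerpResult:
    url: str
    position: int
    eat_score: float  # hypothetical 0-1 score, recomputed at each crawl

def flag_low_eat(serp, threshold=0.5):
    """Post-hoc quality check: flag already-served results whose
    (hypothetical) E/EAT score falls below the threshold, without
    blocking or reordering the SERP itself."""
    return [r.url for r in serp if r.eat_score < threshold]

serp = [SerpResult("example.com/a", 1, 0.9),
        SerpResult("example.com/b", 2, 0.3)]

print(flag_low_eat(serp))  # ['example.com/b']
```

Because the check runs after the SERP is assembled, it adds no latency to serving; flagged results would feed an analysis pipeline rather than the live ranking.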

engine

9:25 am on Feb 18, 2024 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



Metric, but not ranking.

It used to be that you'd need to get into the Yahoo directory or DMOZ to have a good chance of playing along in the SERPs, mainly because inclusion added a level of authority. Of course, that all changed as spam skewed those databases.
One way of achieving the A part was for Google to build its own system: E/E-A-T. It supplies its own measure of authority, meaning that once a site is accepted, the ranking algo can take over.
IMHO

aristotle

8:52 pm on Feb 22, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



The real question is how Google's algorithm determines which sites have the most expertise, authority and trust.

Does anyone remember the old theory that big well-known companies and organizations gradually climbed to the top of the search results because their listings got more clicks from searchers than the sites that were originally ranking above them? I've always believed that this theory has merit.

This theory suggests that google could be letting the searchers themselves vote (by their clicks) which sites have the highest EAT scores.

For example, in the healthcare sector, sites such as the Mayo Clinic, Harvard Medical School, the National Institutes of Health, etc., would be given the highest EAT scores because they get more clicks than lesser-known sites.
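The click-vote theory amounts to a simple aggregation; the sketch below uses an invented click log for one query, and the idea that Google scores sites exactly this way is speculation.

```python
from collections import Counter

# Hypothetical click log for a single health query; sites and counts are invented.
clicks = ["mayoclinic.org", "nih.gov", "mayoclinic.org",
          "smallblog.example", "mayoclinic.org", "nih.gov"]

# Each site's share of total clicks, treated as a crude trust/EAT signal.
share = {site: n / len(clicks) for site, n in Counter(clicks).items()}
best = max(share, key=share.get)

print(best, share[best])  # mayoclinic.org 0.5
```

In practice any such signal would need heavy smoothing and position-bias correction, since higher-ranked results get more clicks simply by being higher.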

EditorialGuy

9:03 pm on Feb 22, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Does anyone remember the old theory that big well-known companies and organizations gradually climbed to the top of the search results because their listings got more clicks from searchers than the sites that were originally ranking above them? I've always believed that this theory has merit.

Or maybe it's a combination of factors, like sheer scale combined with backlinks from large sites that are already trusted.

Side note: I've often wondered how massive UGC-based sites like TripAdvisor got so big so fast 15 or 20 years ago. I can remember when many (most?) TripAdvisor pages were empty template-based, keyword-driven pages with an invitation to "Write a review." Google indexed and ranked those pages despite their lack of content, and eventually (thanks to Google) the pages were found by searchers and populated with UGC. The current situation with Reddit reminds me of those olden days.

anallawalla

11:22 pm on Feb 22, 2024 (gmt 0)

WebmasterWorld Administrator 10+ Year Member Top Contributors Of The Month



The real question is how does google's algorithm determine which sites have the most expertise, authority and trust.


My guess is that AI can help to calculate a numeric score based on some of the hints we can see in the Raters Guidelines. All content can be allocated a category and micro-category (the precise word escapes me, but when I worked at the Yellow Pages, there was a small team with full-time jobs working on it).

So, how would "Orlando" have an "expert"? It needs an adjective or qualifier, e.g. "Orlando motels"; otherwise the expertise would be in the word itself (place name, personal name, etc.). Searching for Orlando is revealing. The top featured snippet (in Google AU) is the geographical location, "City in Florida", and most of the SERP has a tourism slant. There are some other results, e.g. real estate, news items, etc.

The SERP is based on numerous other relevance ranking factors, not just Expertise, but let's assume that Expertise needs to be high for each result. The rater guidelines define it, so each site crawl is evaluated for expertise and a number is added to the page hash. Expertise might be a site-wide number applied to each page, because the signals of expertise may be stronger on certain pages and in citations on other domains.

As EAT is treated as a metric, the validation is done after a SERP is generated, not before assembling it.

That's how I am processing it.

Whitey

7:13 am on Mar 11, 2024 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



My sense is that E-A-T runs through a series of metrics, no different from Google's intent over the last decade or so. Validation may come in via quality raters, before and after, as part of rolling human audits across categories.

If an article is published and nobody competes with it, it gets elevated straight away; E-A-T is less likely to apply.

If the article has competition, then scores associated with brand reputation kick in. Newcomers will struggle if they don't have something new and robust to say: high versus low E-A-T metrics.

Authority, I feel, plays a big part. So if you are strongly known in a vertical, e.g. medical, financial, travel, etc., this will overtake authorship authority. Niche specialisations with notoriety may let small players in.

Lesser factors like UI/UX may play a part, depending on the nature of the page, sending validation signals to Google.

I feel that common sense, plus familiarity with the thinking of the people who engineer the algorithms, leads observant SEOs to similar conclusions: focus on the common outcomes and "what's good for users".

But then again, it’s just my view.