
E-A-T-ing well now even more critical

Expertise, Authoritativeness, and Trustworthiness

     
11:26 pm on May 20, 2019 (gmt 0)

Senior Member from CA 

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month

joined:Nov 25, 2003
posts:1268
votes: 393


I expect everyone has heard of Google's E-A-T (Expertise, Authoritativeness, Trustworthiness) signal(s).
Note: General Guidelines [static.googleusercontent.com] dated 16-May-2019 [PDF file 17.4MB]
Note: see also engine's post Google Updates Search Quality Rater Guidelines, May 2019 [webmasterworld.com]

Typically these are considered page/site-level values, as the examples all appear to be internally derived. I have a serious problem with that perception as-is, simply because most (if not all) SE values tend to flow, aka be passed, however weighted, via links, and I would expect that to include E-A-T. It would work much the way PR was/is seen to be derived: accumulated, weighted, and passed on. Especially as Google search is based upon popularity, aka value flows. A rough sketch of that flow follows.
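For the skeptical, here is a minimal power-iteration sketch (in Python) of how a score can flow, accumulate, and be passed on, weighted, across links. The link graph and damping factor are hypothetical toy values, emphatically not Google's actual implementation:

# Toy power-iteration PageRank: a sketch of value flowing through links.
# The graph and damping factor below are hypothetical examples.
links = {  # page -> pages it links out to
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
pages = list(links)
damping = 0.85
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the scores settle
    new = {p: (1.0 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)
        for target in outlinks:
            new[target] += share  # value passed on, weighted by out-degree
    rank = new

print(sorted(rank.items(), key=lambda kv: -kv[1]))  # "c" accumulates the most

Swap any other flowable value in for the rank and the mechanics are the same; that is the point being made about E-A-T.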

There is a long-standing network theory: Stanley Milgram's phrase 'six degrees of separation' (a maximum of five intermediaries between any two nodes), from his 1967 experiments. It was popularised by a 90s play, which in turn generated the 1996 online game (and subsequent meme) 'The Oracle of Bacon at Virginia': guessing the number of connections between the actor Kevin Bacon and another actor. A toy breadth-first-search version follows the notes below.
Note: in 2008, Microsoft found that the average connection between internet users was 6.6 hops.
Note: in 2016, Facebook found that, within their network, the average was 3.5 degrees of separation.
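Measuring degrees of separation is just a shortest-path hop count. A minimal Python sketch, over a hypothetical cast graph:

from collections import deque

# Shortest hop count (degrees of separation) between two nodes via
# breadth-first search. The undirected graph below is hypothetical.
graph = {
    "bacon":  {"actor1", "actor2"},
    "actor1": {"bacon", "actor3"},
    "actor2": {"bacon"},
    "actor3": {"actor1", "target"},
    "target": {"actor3"},
}

def degrees_of_separation(graph, start, goal):
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, hops = queue.popleft()
        if node == goal:
            return hops
        for neighbour in graph[node]:
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, hops + 1))
    return None  # the two nodes are not connected

print(degrees_of_separation(graph, "bacon", "target"))  # -> 3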

If one builds a link graph [en.wikipedia.org], whether of the internet/web entirely or of a vertical or niche of interest, one typically finds that six to a dozen large nodes stand out, plus multiple smaller nodes that are still distinguishable at an overview level. A toy illustration of surfacing such nodes follows.
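At its crudest, surfacing the large nodes is an in-degree count. A Python sketch over a hypothetical edge list (a real crawl would use PageRank, HITS, or similar rather than raw counts):

from collections import Counter

# Count inbound links to surface the handful of large "aggregator"
# nodes that stand out in a link graph. The edge list is hypothetical.
edges = [  # (linking_site, linked_site)
    ("blog1", "hub1"), ("blog2", "hub1"), ("blog3", "hub1"),
    ("blog1", "hub2"), ("blog4", "hub2"),
    ("blog2", "blog3"),
]
in_degree = Counter(target for _, target in edges)
for site, count in in_degree.most_common(3):
    print(site, count)  # the hubs float to the top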

In a practical sense these are the 'popular' aggregators of authority. This does not mean they are 'true' or 'factual', just that they have accumulated authority by being popular with other sites. In this 'reality' there may be equally authoritative-popular sites with directly opposing views; yet each site's visitors accept theirs as an authority and the other not. And as an SE is unable to distinguish fact from fiction, it gives both a free ride.

A hybrid route (there are Google patents) to E-A-T is to initially seed, aka manually select, high-value sites, and then combine degree of separation from those seeds with the popularity-derived results. A sketch of that seeded propagation follows.
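In the spirit of the seeded approach (TrustRank-flavoured, to name the published technique plainly; whatever Google actually does is not public), here is a Python sketch where the restart mass lands only on hand-picked seeds, so scores decay with every hop away from them. Graph and seed set are hypothetical:

# Seeded trust propagation sketch: value restarts only at the manually
# selected seed sites, so trust decays with distance from the seeds.
# This is an illustration, not Google's actual method.
links = {
    "seed": ["a", "b"],
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
seeds = {"seed"}
damping = 0.85
trust = {p: (1.0 if p in seeds else 0.0) for p in links}

for _ in range(50):
    new = {p: ((1.0 - damping) if p in seeds else 0.0) for p in links}
    for page, outlinks in links.items():
        share = damping * trust[page] / len(outlinks)
        for target in outlinks:
            new[target] += share
    trust = new

print(sorted(trust.items(), key=lambda kv: -kv[1]))  # nearer the seed, higher the trust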

IMO if there is E-A-T seeding it is being overwhelmed by popularity. As is often mentioned here at WebmasterWorld, crap sites frequently outrank actual expert, authoritative, trustworthy sites. Of course, there are many other inputs in a query ranking determination; however, that popularity can overwhelm an explicit signal with an implicit one, or that the supposedly explicit signal is so implicitly derived as to be unfit for purpose, is regrettable.

There are three Google/SEO findings from this thought exercise:
1. It is best to include appropriate E-A-T signals throughout a site, on as many pages and as fully as practicable. As is often the case, these signals are mostly accepted best practice***, so the SE should simply be along for the ride.

2. It is best to identify high-value E-A-T aggregator nodes and have them link out to you for the best link ROI, especially those within a site's niche/vertical/area of expertise.

3. (Based on the underlying network theory) if such a node is cancelled/dampened it has a huge effect; in experiments on various systems, the loss of one to three dozen nodes (varying by network size) breaks the entire network. Networks, in reality, are rather fragile, and this holds true for the passing of various SE values, including E-A-T. Therefore the more 'false' or 'fake' an aggregator, the greater the risk that at some point it will be discounted. Risk assessment is a best business practice. (A toy fragility simulation follows this list.)
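To make point 3 concrete, a Python sketch of the fragility claim: remove the hub from a hypothetical hub-and-spoke network and watch the largest connected component collapse.

from collections import deque

# Remove nodes from a toy undirected network and measure the largest
# surviving connected component: hub loss fragments the whole graph.
# The edge list is hypothetical.
edges = [("hub", spoke) for spoke in "abcdefgh"] + [("a", "b"), ("c", "d")]

def largest_component(edges, removed):
    graph = {}
    for u, v in edges:
        if u in removed or v in removed:
            continue
        graph.setdefault(u, set()).add(v)
        graph.setdefault(v, set()).add(u)
    best, seen = 0, set()
    for start in graph:
        if start in seen:
            continue
        comp, queue = {start}, deque([start])
        while queue:
            for n in graph[queue.popleft()]:
                if n not in comp:
                    comp.add(n)
                    queue.append(n)
        seen |= comp
        best = max(best, len(comp))
    return best

print(largest_component(edges, removed=set()))    # -> 9, one intact network
print(largest_component(edges, removed={"hub"}))  # -> 2, only fragments remain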


***See [1.] above.
Accepted Best Practices also used as search quality/E-A-T signals

4.5 A High Level of Expertise/Authoritativeness/Trustworthiness (E-A-T)
High quality pages and websites need enough expertise to be authoritative and trustworthy on their topic.

.................

4.6 Examples of High Quality Pages
High Quality Characteristics
* a satisfying or comprehensive amount of high or very high quality MC (Main Content)
* positive reputation (website)
* high E-A-T of the publisher and/or author
* high E-A-T for the purpose of the page
* high E-A-T for the article
* high E-A-T (everyday expertise)

..................

5.0 Highest Quality Pages
Highest quality pages are created to serve a beneficial purpose and achieve their purpose very well. The distinction between High and Highest is based on the quality and quantity of MC, as well as the level of reputation and E-A-T.

What makes a page Highest quality? In addition to the attributes of a High quality page, a Highest quality page must have at least one of the following characteristics:
* very high level of Expertise, Authoritativeness, and Trustworthiness (E-A-T).
* very satisfying amount of high or highest quality MC.
* very positive website reputation for a website that is responsible for the MC on the page; very positive reputation of the creator of the MC, if different from that of the website.


In summation:
To do good, better, best in Google search query results one should, among other best practices:
1. Provide consistent, sufficiently high-quality main content with high to very high E-A-T of author/publisher, expertise, article, and page purpose, as well as build an overall positive site reputation.

2. Acquire links from other high-value sites, especially from E-A-T aggregator nodes, particularly those within one's niche/vertical.

D'oh.
2:05 pm on May 31, 2019 (gmt 0)

New User

joined:Oct 19, 2017
posts: 5
votes: 1


This is an interesting set of considerations and does indeed raise some good points.

Acquiring links from nodes within one's niche or vertical can be problematic if it is an area where all the main nodes are in fact rivals. It is hard to be seen as authoritative in a field when your direct competitors would obviously not want to link to you. This could leave all possible E-A-T factors on-page or on-site only. How can you demonstrate high E-A-T for an author when that author only or primarily writes for that one site and therefore has no external validation? Surely this is quite common.

Similarly, if there are a small number of popular aggregator sites, or nodes, in the whole network, then one might safely assume that for a narrow vertical there would be an even smaller number of key aggregators suited to that vertical. In the absence of links from rivals, the competitors would all be fighting to get a "stamp of approval" (links) from these aggregators, probably all following identical SEO methods and techniques. In many verticals I see rivals posting similarly themed blog posts, product category content, and so on. There is only so much one can write about in a given vertical, especially a narrow one, so what method is then used to distinguish A from B from C when all cover everything there is to say on the subject and all recommended best practices have been followed?
 
