Are You Looking at Your SEO from the Wrong Direction? 


Google’s E-E-A-T Algorithm: A Byte of SEO Goodness for Lunch

You probably know that Google’s search approach “E-E-A-T” now uses AI to evaluate websites on their Experience, Expertise, Authoritativeness and Trust. Understanding all the factors and weightings involved in scoring your site is an incredibly complex task. As a website owner, it’s your job to work out how to increase that score, using SEO as part of your toolkit to fine-tune the different levers at your disposal.

But if the AI is a black box, how do you know what to focus on first? An SEO tech checklist? Optimise bounce rate, or domain rank, or page speed? To meet the paradigm shift Google is driving, we need to change the way we think about SEO.

A Thought Experiment

To help us understand the problem, let’s pretend we are starting our own search company, ‘Gorgulon Search Inc’ (pretty confident this doesn’t exist at the time of writing), and developing our own AI model to evaluate and rank pages. We’ll try to do as Google would do and assume it wants the best possible experience for users. That means delivering the most trustworthy and relevant results back to our users, helping to ensure they come back again and again.

A Little Bit of Data Science

When we dive into a machine learning challenge, the first task is to set a clear goal for the outcome. In our case, we aim for a single score – let’s call it an ‘experigagement’ score – ranging from 0 to 100, combining both user experience and engagement factors. Imagine zero as the lowest point: a terrible, irrelevant, spam-filled website that takes an eternity to load. Picture one hundred as the highest: a website that not only precisely addresses your needs but also educates, informs and entertains you, all while seamlessly guiding you to your next steps.

As data scientists we also want ‘training data’ – a set of data to use to optimise and test our algorithm. We might come up with a list of search terms, then hire a team to evaluate a set of pages against each term, scoring on:

  • How well did the page meet the intent of the search term?
  • How easy was it to read?
  • Did it demonstrate a level of expertise?
  • Was it compelling in the way it presented its information?
  • Was it concise?
  • Was the site spammy?
  • Would the information do good rather than harm?
  • Did it load quickly?

Does this seem fanciful? Google has a document explaining how they actually do it:

Search Quality Rater Guidelines: An Overview 

The list could go on, but at some point we would gather up all the scores, weight each factor by its importance, and sum everything to give our experigagement score. This is our target.
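As a minimal sketch of that weighting-and-summing step: the factor names and weights below are illustrative assumptions for our Gorgulon thought experiment, not Google’s actual rubric.

```python
# Hypothetical rater factors and weights (assumptions, not Google's rubric).
RATER_WEIGHTS = {
    "intent_match": 0.30,
    "readability": 0.10,
    "expertise": 0.20,
    "conciseness": 0.10,
    "spam_free": 0.15,
    "helpfulness": 0.10,
    "load_speed": 0.05,
}

def experigagement(scores: dict[str, float]) -> float:
    """Weighted sum of 0-100 rater scores, clipped to the 0-100 range."""
    total = sum(RATER_WEIGHTS[k] * scores.get(k, 0.0) for k in RATER_WEIGHTS)
    return max(0.0, min(100.0, total))

# One page's rater scores, combined into a single target value
page_scores = {
    "intent_match": 90, "readability": 80, "expertise": 70,
    "conciseness": 60, "spam_free": 100, "helpfulness": 85, "load_speed": 95,
}
print(experigagement(page_scores))
```

Because the weights sum to 1 and each factor is scored 0–100, the combined target stays on the same 0–100 scale.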

The next part of the task is to find a bunch of ‘features’ that could help us predict the experigagement score. Maybe:

  • Page load speed timings
  • Mobile friendliness
  • The title’s relevance to the search term
  • The headings’ relevance to the search term
  • The body copy’s relevance to the search term
  • How many people link to this page
  • Is the author known and respected across the web?
  • Do the other pages on this site match the topic?

Again, this list could go on and on – you could end up with over 200 of these engineered features, as Google reportedly has.
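To make “engineered features” concrete, here is a toy extractor covering a few of the bullets above. The word-overlap helper and the page fields are assumptions invented for this sketch; real relevance signals would be far more sophisticated.

```python
def overlap(text: str, query: str) -> float:
    """Fraction of query words that appear in the text (a crude relevance proxy)."""
    q = set(query.lower().split())
    t = set(text.lower().split())
    return len(q & t) / len(q) if q else 0.0

def extract_features(page: dict, query: str) -> list[float]:
    """Turn one page into a numeric feature vector for the model."""
    return [
        page["load_ms"] / 1000.0,              # page load time in seconds
        1.0 if page["mobile_friendly"] else 0.0,
        overlap(page["title"], query),          # title relevance
        overlap(page["headings"], query),       # heading relevance
        overlap(page["body"], query),           # body copy relevance
        float(page["inbound_links"]),           # link popularity
    ]

# An illustrative page record (all values made up)
page = {
    "load_ms": 850, "mobile_friendly": True,
    "title": "Best hiking boots reviewed",
    "headings": "Top boots for winter hiking",
    "body": "We tested hiking boots on wet trails",
    "inbound_links": 42,
}
print(extract_features(page, "best hiking boots"))
```

Each page becomes a fixed-length row of numbers, which is exactly the shape a model expects as input.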

As data scientists we would prepare this data and use it to ‘fit’ (train) our model to predict the target. Once fitted, we can extract the same features for new, unseen pages and have our algorithm estimate an experigagement score for each of them, ranking them best to worst. Matching pages to search terms is a slightly different problem that we will solve another day (if you are unsatisfied with this, you can use tf-idf for now). Ta-da – we’re now all Gorgulon Search Inc billionaires.
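The fit-then-rank loop can be sketched in a few lines. Here a one-feature least-squares fit stands in for the real model, and the training pairs (relevance feature, rater score) are made up for illustration.

```python
def fit_linear(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least squares for y = a*x + b with a single feature."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return a, my - a * mx

# Training data: (relevance feature, rater-assigned experigagement score)
train_x = [0.1, 0.4, 0.5, 0.8, 0.9]
train_y = [12, 35, 48, 74, 88]
a, b = fit_linear(train_x, train_y)

# Score new, unseen pages with the fitted model and rank them best-first
new_pages = {"page_a": 0.3, "page_b": 0.95, "page_c": 0.6}
ranked = sorted(new_pages, key=lambda p: a * new_pages[p] + b, reverse=True)
print(ranked)
```

The fitted slope is positive, so pages with higher relevance features get higher predicted scores and rank first – which is the whole game: the model generalises from rated pages to pages no human has ever scored.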

So what’s the point of this? 

Well, the engineered features will change over time; some of them will be really good at predicting the experigagement score and some will be weak predictors. It’s easy to get caught up worrying about fixing the features, when we should be looking at the target – where we want to be – which is to provide the best experigagement for our customers.

Concentrate on creating content that is different from what is already readily available on the internet and that demonstrates expertise nobody else can offer. The user’s intent should be top of mind when writing for the web, and the writing should be specific, informative and entertaining.


We must focus on what makes Google a useful tool: relevant, practical, unique, engaging and trustworthy content is what keeps people using Google. If we provide content that supports Google’s aim, we will be rewarded.

Keeping the user experience in mind should keep us focused on what’s actually important and where we should spend our efforts. Don’t prioritise tweaking technical issues on the site to squeeze out a few extra milliseconds of load time if there are opportunities to add citations, an author’s bio, or some novel research charts signalling your expertise.

Read the full Search Quality Rater Guidelines from Google and start thinking like a quality rater rather than focusing on the levers.

At Coorie Dug we are experts in Data Driven SEO. We have built a whole approach to intent and purpose analysis and tooling to allow us to understand what your customers really want to achieve when they visit your site. If you would like to learn more give us a shout and we would be happy to give you a demo.
