Feature image for 27.04.2018



Brighton SEO: Chris Liversidge – Using Machine Learning Technology To Build Audience-Led Analytics

This article was updated on: 07.02.2022

Understanding The Customer Journey

Chris began by discussing the importance of truly understanding the journey which consumers go on before deciding to make a purchase. Better understanding is key to improving your targeting, which ultimately leads to increased revenue and a much reduced cost per sale.

One of Chris’ key takeaways was the idea that on this journey, people move through conversion steps rather than sessions. The challenge for marketers, then, is to transition away from relying on session data to drive strategy and instead start talking about people. Machine learning provides a powerful solution to the problem of using otherwise disparate data sets, transforming them into a meaningful tool for predicting statistically significant outcomes.

Before going any further, it’s crucial to emphasise the importance of storing data in a GDPR-compliant way. The way in which machine learning data is processed can risk identifying individuals, so it’s key to make sure your data collection and processing practices are fully compliant with the new data protection laws. Businesses found to be in breach of the legislation risk a fine of up to €20 million.

Using Machine Learning To Improve Attribution

Session-led event data is a great place to start, but this data then needs to be mapped to individual customers, typically by matching it against CRM data; machine learning is the way to do this.
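As a rough sketch of what that mapping might look like in code – assuming session events carry a hashed email that can be joined against CRM records, with all field names being illustrative rather than from the talk:

```python
# Minimal sketch: attach a CRM customer_id to session-level events.
# Field names (hashed_email, customer_id) are illustrative assumptions.

def map_sessions_to_customers(session_events, crm_records):
    """Join session events to known customers, keyed on a hashed email."""
    crm_index = {rec["hashed_email"]: rec["customer_id"] for rec in crm_records}
    mapped = []
    for event in session_events:
        customer_id = crm_index.get(event.get("hashed_email"))
        if customer_id is not None:
            mapped.append({**event, "customer_id": customer_id})
    return mapped

sessions = [
    {"session_id": "s1", "hashed_email": "abc123", "action": "view"},
    {"session_id": "s2", "hashed_email": "zzz999", "action": "view"},
]
crm = [{"hashed_email": "abc123", "customer_id": "c42"}]
print(map_sessions_to_customers(sessions, crm))
# Only s1 maps to a known customer (c42); s2 has no CRM match.
```

In practice this join is fuzzier than a dictionary lookup – which is exactly where machine learning earns its keep – but the principle of resolving sessions to people is the same.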

When attributing users, it is important to account only for factors proven to be statistically relevant in the user journey. Even though what Chris called the ‘re-engagement factor’ may not be the first visit a user has made to a site, more can be gained by looking at this event than at an original visit which happened two months ago. The statistical significance of each of these engagements differs greatly, and this needs to be accounted for in your attribution model.
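One common way to encode this difference in a model is a time-decay weight, where older touchpoints contribute less than recent re-engagements. A minimal sketch (exponential decay and the half-life value are illustrative assumptions, not something Chris prescribed):

```python
def time_decay_weight(days_before_conversion, half_life_days=7.0):
    """Exponential decay: a touchpoint `half_life_days` old gets half the
    weight of one at the moment of conversion. Half-life is illustrative."""
    return 0.5 ** (days_before_conversion / half_life_days)

# A re-engagement yesterday vs an original visit roughly two months ago:
recent = time_decay_weight(1)    # ~0.91
old = time_decay_weight(60)      # ~0.003
```

The exact weighting function should itself be derived from the data rather than chosen by hand – the point is only that the two visits cannot be credited equally.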

As a result, Chris advocates developing custom machine learning models for your data, rather than relying on the solutions offered by e.g. Google. In doing so, you will achieve a level of granularity which he says just isn’t available in other tools, even those using Beta features. He also discussed the need for a custom solution for each client, as the customer journey will change across different industries and audiences.

How To Develop An Attribution Model

So how can we go about building an attribution model? Chris recommended the following process:

  • Obtain your raw event data
  • De-duplicate and cleanse this data
  • Partition this cleansed data
  • Begin to identify your user groups
  • Absorb this data into your attribution model
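The steps above can be sketched as a small pipeline. A hedged Python illustration, where the field names and the grouping rule are invented for the example:

```python
def build_model_input(raw_events):
    """Walk the steps above: de-duplicate, cleanse, partition, group."""
    # 1. De-duplicate on event_id.
    seen, deduped = set(), []
    for e in raw_events:
        if e["event_id"] not in seen:
            seen.add(e["event_id"])
            deduped.append(e)
    # 2. Cleanse: drop events missing a user reference.
    cleansed = [e for e in deduped if e.get("user_id")]
    # 3. Partition the cleansed data into per-user event lists.
    partitions = {}
    for e in cleansed:
        partitions.setdefault(e["user_id"], []).append(e)
    # 4. Identify user groups by activity level (illustrative rule).
    groups = {
        uid: ("engaged" if len(events) >= 2 else "casual")
        for uid, events in partitions.items()
    }
    return partitions, groups

raw = [
    {"event_id": 1, "user_id": "u1"},
    {"event_id": 1, "user_id": "u1"},   # duplicate, removed
    {"event_id": 2, "user_id": "u1"},
    {"event_id": 3, "user_id": None},   # cleansed out: no user reference
    {"event_id": 4, "user_id": "u2"},
]
partitions, groups = build_model_input(raw)
print(groups)   # {'u1': 'engaged', 'u2': 'casual'}
```

The final step – absorbing these partitions into the attribution model itself – is where the custom machine learning work Chris describes actually happens.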

He compared using machine learning to the technique used in weather forecasting – both processes rely on collecting historical data and making predictions based on it. But rather than predicting temperature or precipitation, marketers can instead use data to predict “purchase proximity” and revenue. These predictions should be statistically derived from the data provided by machine learning, rather than marketers predicting this themselves.
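In the spirit of that forecasting analogy, here is a minimal sketch of deriving conversion probabilities from historical journey data rather than guessing them; the data shape (touch count, converted flag) is an illustrative assumption:

```python
def conversion_rate_by_touch_count(history):
    """Estimate P(convert | touch count) from historical journeys.
    `history` is a list of (touch_count, converted) pairs."""
    totals, converts = {}, {}
    for touches, converted in history:
        totals[touches] = totals.get(touches, 0) + 1
        converts[touches] = converts.get(touches, 0) + (1 if converted else 0)
    return {t: converts[t] / totals[t] for t in totals}

history = [(1, False), (1, False), (2, True), (2, False), (3, True), (3, True)]
print(conversion_rate_by_touch_count(history))
# {1: 0.0, 2: 0.5, 3: 1.0}
```

A production model would use far richer features than touch count, but the shape is the same: the prediction falls out of the historical data, not out of a marketer's intuition.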

Once an attribution model is developed, its accuracy can be established by comparing its predicted revenue to the actual revenue delivered over a set time period. When the two match up, you know your model is working. Over time, you can then start to identify the core audiences which are crucial for growth. In the example Chris used, just 10% of customers were shown to drive 50% of conversions over a 3-day period – immediately demonstrating the impact of accurate attribution.
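That comparison can be made concrete with a simple check of predicted against actual revenue; the 5% tolerance here is an illustrative threshold, not one from the talk:

```python
def model_accuracy(predicted_revenue, actual_revenue, tolerance=0.05):
    """Return (relative_error, within_tolerance). Tolerance is illustrative."""
    error = abs(predicted_revenue - actual_revenue) / actual_revenue
    return error, error <= tolerance

error, ok = model_accuracy(predicted_revenue=9_800, actual_revenue=10_000)
print(round(error, 3), ok)   # 0.02 True
```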

Audience Segmentation

Another benefit of a custom-built attribution model is that user groups can be separated out by custom values, for example by lifetime value. These user cohorts can then be treated differently based on their predicted likelihood of conversion – for example with bid adjustments and customised ad copy – at each stage in the “See, Think, Do, Care” model. Running paid campaigns against an audience predicted to have a high conversion probability should deliver a strong return, while programmatic ads can be reserved for user groups in the See and Think stages. In this way you can improve engagement with users higher up the funnel by addressing them in a way that best suits their current state of mind, which can significantly reduce the cost per sale.
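A hedged sketch of how such cohort treatment might be wired up, with the probability thresholds and bid multipliers invented purely for illustration:

```python
def stage_and_bid(conversion_probability, is_existing_customer=False):
    """Map a user cohort to a "See, Think, Do, Care" stage and a bid
    multiplier. Thresholds and multipliers are illustrative assumptions."""
    if is_existing_customer:
        return "Care", 0.8      # retention messaging, lower acquisition spend
    if conversion_probability >= 0.6:
        return "Do", 1.5        # bid up on likely converters
    if conversion_probability >= 0.3:
        return "Think", 1.0
    return "See", 0.7           # cheaper upper-funnel programmatic reach

print(stage_and_bid(0.8))   # ('Do', 1.5)
print(stage_and_bid(0.1))   # ('See', 0.7)
```

In a real setup the probabilities would come from the attribution model itself, and the multipliers would be tuned against observed cost per sale rather than fixed by hand.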

Key Takeaways

“If analytics isn’t working with ‘people’, attribution is futile.”

Default attribution models are becoming less and less fit for purpose, and machine learning is here to stay. To fully understand how customers come to buy a product, marketers need to be proactive in developing their own machine learning tools. Using this vastly improved data will significantly improve your targeting capabilities, ultimately delivering a much improved cost per sale and return on investment.