There’s a huge buzz around media measurement and marketing effectiveness in the industry right now, and for good reason. The phase-out of the third-party cookie presents advertisers with attribution challenges that not only make it more difficult to make strategic decisions but also to achieve stakeholder buy-in for marketing investment.
To help you improve your knowledge of the modern measurement landscape and understand the best solutions to embrace user privacy without hindering marketing performance, Impression has launched a series of short videos.
Watch the full series here and gain insights into:
- The current technology and privacy headwinds challenging digital attribution
- How to respect user privacy online and implement compliant solutions
- How to more accurately measure the impact of your efforts through different solutions e.g. multi-touch attribution, incrementality testing and media mix modelling
- How to communicate the value of marketing measurement to key stakeholders
In part six, Technology Director Aaron Dicks and Media Effectiveness Executive Jake Piekarski explore:
- The importance of media mix modelling
- The benefits of a bespoke approach
- What a brand should look out for in a well-run media mix model
See the transcript of the video below the recording.
Aaron Dicks: Why is Media Mix Modelling so important?
Jake Piekarski: Well, first of all, it provides a holistic view of a client’s media mix, and its outputs allow us to make data-driven decisions. For instance, one of the key outputs is the budget optimiser, a scenario tool that shows you where certain channels can grow and where your money is best spent if you want to optimise your KPIs and ROAS.
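As an illustration only, and not Impression’s actual optimiser: a budget scenario tool of this kind can be sketched as a greedy allocator over diminishing-returns response curves. The channel names and square-root curves below are hypothetical stand-ins for the curves a fitted model would provide.

```python
import math

# Hypothetical diminishing-returns response curves: revenue ≈ coef * sqrt(spend).
# In a real model these curves would come from the fitted Media Mix Model.
channels = {"search": 5.0, "social": 3.0, "display": 2.0}

def allocate(total_budget, step=100.0):
    """Greedily hand each step of budget to the channel with the best marginal return."""
    spend = {c: 0.0 for c in channels}
    allocated = 0.0
    while allocated < total_budget:
        def marginal(c):
            return channels[c] * (math.sqrt(spend[c] + step) - math.sqrt(spend[c]))
        best = max(channels, key=marginal)
        spend[best] += step
        allocated += step
    return spend

print(allocate(1000.0))
```

Because marginal returns shrink as a channel’s spend grows, the allocator naturally spreads budget across channels rather than dumping it all into the strongest one.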
Aaron Dicks: I know certainly when we’re talking about the sort of the benefits of the outputs and the benefits of our methodology, one of the most interesting aspects is the ability for these models to take into account the delayed effects of advertising. Could you talk a little bit about how we do that?
Jake Piekarski: Yeah, it’s very important to take into account the delayed effects of advertising, because a consumer won’t necessarily act, or even see your advertising, immediately. They might a few days after launch, so being able to estimate the contribution over a longer stretch of time, rather than just the immediate response, is very important. Within the model itself, we do that by applying an adstock transform to the metric describing that channel. But the model alone is only theoretical; it’s based on associations and correlations. The other side of media effectiveness is incrementality testing, and the way we infer adstock there is by finding the point at which the confidence interval for whatever we’re measuring moves away from zero. When that happens, we know roughly how long it takes to see the effects of the advertising, and we feed that causal result into the model, which lets us provide a suitable, defensible estimate.
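The transcript doesn’t specify which transform is used; a common textbook choice is geometric adstock, where a fixed fraction of each period’s effect carries over into later periods. A minimal sketch with made-up weekly spend:

```python
def geometric_adstock(spend, decay=0.5, max_lag=8):
    """Carry a fraction of each period's spend forward into later periods."""
    adstocked = []
    for t in range(len(spend)):
        total = 0.0
        for lag in range(min(t + 1, max_lag + 1)):
            total += spend[t - lag] * (decay ** lag)
        adstocked.append(total)
    return adstocked

# One burst of spend in week 0 keeps contributing in later weeks
weekly_spend = [100, 0, 0, 0]
print(geometric_adstock(weekly_spend, decay=0.5))  # [100.0, 50.0, 25.0, 12.5]
```

The `decay` parameter controls how long the delayed effect persists; an incrementality test can inform how it is set.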
Aaron Dicks: I wanted to talk to you a little bit about data quality because I know data quality and certainly the data requirements of the model can have some impact on that level of uncertainty in the outputs. Can you talk me through what sort of data is required and maybe any particular issues we may see frequently?
Jake Piekarski: Data quality is really important in Media Mix Modelling and media effectiveness as a whole, because what we’re measuring depends on data: historical trends and associations over time. So if we’ve got missing data, or data that’s not clean or formatted the right way, that could sway the conclusions and results of the model in ways we wouldn’t want. Data quantity is also something we look at a lot when building a Media Mix Model. We want to generate insights over the long term, and if we haven’t got enough data, the model really struggles to pick up the seasonality and trends within the dataset.
Aaron Dicks: How much data look-back are we talking about here?
Jake Piekarski: Roughly 2 to 3 years, there’s no exact right figure. Usually, the longer you’ve got, the better. Conversely, if you have too much data, you’ll be capturing trends from a very long time ago and assuming that they may continue in the future. So a healthy in-between is roughly 2 to 3 years, that’s what I would recommend.
Aaron Dicks: What are some of the interesting data points we can feed into a Media Mix Model? I guess we’d call them context variables.
Jake Piekarski: Context variables will refer to things that aren’t direct marketing activity.
So that includes variables such as a general trend, measuring whether there’s an overall increase or decrease, and seasonality. We can also take into account average order value to see how your sales are evolving over time. My personal favourite is comparing against the market price: if we take the client’s price for a particular product and divide it by the market price, then whenever that ratio goes above one, the client is selling the product at a premium, so you’d expect a drop in sales as consumers go elsewhere, and vice versa when it falls below one.
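That price-ratio variable can be illustrated in a few lines; the weekly figures here are made up:

```python
# Client price divided by market price, week by week (hypothetical figures)
client_prices = [10.0, 12.0, 9.0]
market_prices = [10.0, 10.0, 10.0]
ratios = [c / m for c, m in zip(client_prices, market_prices)]

# Weeks where the ratio exceeds 1.0 are weeks the client is priced above market,
# so the model would expect some sales to drift to competitors
expensive_weeks = [i for i, r in enumerate(ratios) if r > 1.0]
print(ratios)           # [1.0, 1.2, 0.9]
print(expensive_weeks)  # [1]
```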
Aaron Dicks: I know certainly when we’ve been talking in the team, there are challenges around certain datasets giving the right amount of signal, because ultimately we are comparing cause and effect and correlating those two things with our delayed effects over time.
Jake Piekarski: One thing we encounter a lot is multicollinearity. That’s when two of the channels we’re measuring, the independent variables, are correlated with each other. When that happens, the results can be thrown off and swayed in ways we wouldn’t want. To fix that, we usually conduct what we call Model-On-Model, which is essentially building a model that predicts how one channel affects the other, then feeding that into our main model so it’s all accounted for. Without that, you’d end up with inaccurate contributions and possibly model overfitting, which occurs when the model fits the noise rather than the signal. It’s too good a fit, to the point where it no longer paints the right picture.
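A simple first check for the multicollinearity Jake describes is pairwise correlation between channel spend series (practitioners often also compute variance inflation factors). A stdlib-only sketch with made-up spend data:

```python
def pearson_corr(x, y):
    """Pearson correlation coefficient between two spend series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Two channels whose budgets were always scaled up together: a red flag,
# because the model can't separate their individual contributions
paid_search = [10, 20, 30, 40, 50]
paid_social = [12, 22, 33, 41, 52]
print(round(pearson_corr(paid_search, paid_social), 3))  # 0.999
```

A correlation this close to 1.0 is the cue for the model-on-model treatment described above, or for designing an experiment that varies the channels independently.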
Aaron Dicks: What are the benefits you see in our approach, both the statistics we use and the bespoke element?
Jake Piekarski: Our approach is generally a lot more flexible. Because we have relationships with clients across various markets, a one-model-fits-all approach won’t capture and account for the specific business needs of each client. Conducting Media Mix Modelling the way we do, where we can account for those needs before building the model, is important.
Aaron Dicks: What should a brand be looking out for then, in a well-run, well-conducted Media Mix Model?
Jake Piekarski: Really good quality, clean data. Incrementality testing, because a Media Mix Model alone isn’t going to explain causation, so we need testing to back the model up. And then hold-out testing: once the model’s fitted over a long period of time, comparing the difference between its predictions and what actually happened.
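Hold-out testing of that kind is straightforward to sketch: fit on most of the history, keep the final weeks back, then score the model’s forecast against what actually happened, for example with mean absolute percentage error. The revenue figures below are hypothetical:

```python
def mape(actual, predicted):
    """Mean absolute percentage error between hold-out actuals and forecasts."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual) * 100

# Hypothetical final four hold-out weeks: actual revenue vs the model's forecast
actual_revenue = [100.0, 110.0, 95.0, 105.0]
model_forecast = [98.0, 112.0, 100.0, 101.0]
print(round(mape(actual_revenue, model_forecast), 2))  # 3.22
```

A low error on weeks the model never saw is the kind of evidence a brand can ask for when judging whether a Media Mix Model is well run.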