The Advertising Research Foundation’s (ARF) “Attribution & Analytics Accelerator” conference concluded Thursday with a focus on attribution models.
Alice Sylvester, partner at conference co-host Sequent Partners, opened the day with a pointed challenge: “Attribution is not dead, but what is it morphing into?”
The challenge stems from the increasing difficulty of obtaining consumer data in a privacy-compliant world, and from the recognition that such data is attribution’s lifeblood.
The key deliverable of attribution is understanding what elements in an ad campaign worked – or did not – and by how much.
At Microsoft, this is typically done by comparing consumers exposed to a campaign against those not exposed. But what happens when there is insufficient consumer data to “match” the detailed profiles of those exposed, including confirming non-exposure to the campaign in question? How, then, can a control group be created?
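The exposed-versus-non-exposed comparison can be reduced to a simple incremental-lift calculation. The sketch below is illustrative only; the function name and conversion counts are invented, and real analyses add statistical significance testing on top.

```python
# Hypothetical illustration of exposed-vs-control comparison.
# All group sizes and conversion counts below are invented.

def incremental_lift(exposed_conversions, exposed_total,
                     control_conversions, control_total):
    """Return (exposed rate, control rate, absolute lift)."""
    exposed_rate = exposed_conversions / exposed_total
    control_rate = control_conversions / control_total
    return exposed_rate, control_rate, exposed_rate - control_rate

# e.g. 300 of 10,000 exposed consumers converted vs. 200 of 10,000 unexposed
exp_rate, ctl_rate, lift = incremental_lift(300, 10_000, 200, 10_000)
print(f"exposed {exp_rate:.1%}, control {ctl_rate:.1%}, lift {lift:.1%}")
```

The lift figure, 1.0 percentage point here, is the “what worked, and by how much” deliverable the attribution exercise is after.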
Caroline Iurillo, senior data scientist, and Megan Lau, director of consumer data science, at Microsoft, explained how the non-exposed comparison group is simulated in aggregate via sophisticated “look-alike” techniques.
They made it sound easy, but it is far from straightforward. Because ad “exposure” is the key to the analysis, it matters whether Microsoft uses viewable impressions, a device-level measure, as a surrogate for “exposed,” or the more meaningful audience eyes/ears-on contact data; this was not made clear.
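One simple way to picture the look-alike idea: for each exposed consumer, pull the unexposed consumer whose profile sits closest in feature space. This is a minimal, hypothetical sketch; the features, IDs, and nearest-neighbor matching here are assumptions, and production systems use far richer profiles and model-based propensity scores rather than raw Euclidean distance.

```python
# Toy "look-alike" control construction: match each exposed consumer to
# the nearest unexposed consumer in feature space. Data are invented.
import math

def nearest_lookalike(exposed_profile, unexposed_pool):
    """Pick the unexposed consumer whose features are closest (Euclidean)."""
    return min(unexposed_pool,
               key=lambda c: math.dist(exposed_profile, c["features"]))

unexposed = [
    {"id": "u1", "features": (34, 1.2)},   # (age, spend index)
    {"id": "u2", "features": (52, 0.8)},
    {"id": "u3", "features": (29, 1.1)},
]
match = nearest_lookalike((33, 1.15), unexposed)
print(match["id"])  # prints "u1", the closest profile in this toy pool
```

Aggregating such matches yields a simulated non-exposed comparison group when a true randomized holdout is unavailable.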
Recognizing that attribution and marketing-mix models each have strengths and weaknesses, depending on the category and goals, Spotify data scientist Jay Habib shared that the company is now using a Markov chain model. The model offers a better understanding of how the sequence of events in a consumer’s journey, across various advertising inputs, shapes outcomes, and it can provide both short- and long-term insights.
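The intuition behind Markov-chain attribution is the “removal effect”: a channel’s credit reflects the conversions that would be lost if journeys through it were broken. The sketch below is a heavily simplified, path-counting approximation of that idea, not Spotify’s method; real Markov models estimate removal effects from fitted transition probabilities, and the journeys here are invented.

```python
# Simplified path-based approximation of the Markov "removal effect":
# a channel's score is the share of converting journeys that touch it.
# Journey data below are hypothetical.

def removal_effects(journeys):
    """journeys: list of (channel_path, converted) tuples."""
    converted = [set(path) for path, conv in journeys if conv]
    total = len(converted)
    channels = set().union(*converted) if converted else set()
    return {ch: sum(ch in path for path in converted) / total
            for ch in channels}

journeys = [
    (["search", "social"], True),
    (["tv", "search"], True),
    (["social"], False),
    (["tv"], True),
]
effects = removal_effects(journeys)
print(effects)  # search and tv each touch 2 of 3 conversions, social 1 of 3
```

Because the scores are sequence-aware at the journey level, this style of model can weigh the order of touchpoints rather than crediting only the last click.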
Rex Du, professor at the University of Texas at Austin, delivered a comprehensive summary of all the key aspects and considerations required to select an appropriate model and accompanying data sets to underpin an understanding of marketing influences on the consumer lifecycle. He stressed the importance of understanding the difference in incremental effects – both short and long term – of ad exposure.
His research also revealed the relative importance of reaching recent brand purchasers and the potential waste of focusing campaigns on immediate response.
Successfully measuring local-market foot traffic and ad attribution, notably for streaming (OTT) TV ads, is complex and tricky. Traci Will, vice president of analytics at Gamut, and Stu Schwartzapfel, senior vice president of media partnerships at iSpot.tv, partnered to offer advertisers an effective, efficient way to reach and convert new customers. Their presentation, however, raised a question in light of iSpot’s recent investment in TVision: perhaps the viewable-impression, device-only TV data currently in use will be made much more rigorous by TVision’s person-level data on how people watch TV?
With data accessibility and quality under increasing threat every day, this lifeblood of modeling becomes an ever greater concern for the validity of the results.
The conference’s four-day series of presentations underlined the relative importance of targeting and creative versus media effects and their execution.
It also stressed the need to balance marketing investments between long-term brand-equity building and short-term sales results. The former appears to be best addressed by marketing-mix models; the latter by attribution models.