Commentary

3.know: A Conversation With Mix Modeling Pioneer Ed Dittus

All those moments will be lost in time like tears in rain.

I can't think of a better way to kick off an interview I had with Ed Dittus than by the final line uttered by the character Roy Batty in Ridley Scott's 1982 science fiction classic "Blade Runner." And it's not just because we are on the cusp of actually having AI-powered synthetic humans in the not-too-distant future, but because Dittus kept repeating the line to me during a couple of conversations we had after reconnecting.

It's beautiful, wistful dialogue and it seemed to sum up some of Dittus' sentiment about an advertising, media and marketing world he helped create when he pioneered the field of marketing mix modeling many decades ago.

Covering Dittus' launch of MMA (Media Marketing Assessment) back then and watching as it jumpstarted a cottage industry of mix modeling -- and ultimately attribution modeling -- was one of the most interesting experiences in my time as a trade journalist, and I was reminded of it -- and Dittus -- by two recent events.

One was listening to long-time Nielsen and NBCU exec Kelly Abcarian wax poetic during April's CIMM East Summit about the fact that virtually every number we use in the industry today is now "modeled."

Another was a recent briefing I had with Henry Innis, CEO and founder of modern day marketing mix modeling platform Mutinex, and the fact that when I mentioned Dittus, Innis said he had never heard of him.

That was ironic for a number of reasons, especially the fact that both Dittus and Innis got their inspirations while toiling inside venerable agency Y&R -- Dittus in the agency's 1970s-80s media department, and Innis in VMLY&R's Australian operations.

My point is that Innis should have known not just about a former pioneering Y&R colleague, but that Dittus was the guy who jumpstarted a revolution in marketing and media analytics, planning and buying.

And to me, that was also a bit like an important industry moment being lost like tears in rain.

As luck would have it, Dittus coincidentally reached out and asked me what was on my mind.

"Funny you should ask," I replied. "Are you game for an interview?

Watch it and you'll learn not just about the moment mix modeling took off, and why, but also what its founder thinks of the current and not-too-distant variants of AI-enabled models and the impact they will have on advertising, media and marketing.

13 comments about "3.know: A Conversation With Mix Modeling Pioneer Ed Dittus".
  1. Ed Papazian from Media Dynamics Inc, August 25, 2025 at 5:29 p.m.

    Joe, do you believe that virtually every number we use is modelled? So if Nielsen rates a TV show's episode and estimates its average minute "audience" as 2.1 million, or its PPM says that a radio station in Podunk reached 2,000 "listeners" per quarter hour, or an ad awareness study found that 34% of a brand's target group was aware of its message, etc.--that none of this is real or measured information, it's all created artificially? Sorry, I have trouble buying that.

  2. John Grono from GAP Research, August 26, 2025 at 8:04 p.m.

    +1 Ed.

  3. Joe Mandese from MediaPost Inc., August 27, 2025 at 7:58 a.m.

    @Ed Papazian:

    Wow!

    It's ironic that you lead with Nielsen's audience estimates, and there's a reason we call them "estimates." They are absolutely modeled under their newest -- and current standard -- model: Big Data + panel.

    In fact, when you consider Nielsen's older methods -- pre-people-meter set-meters + NAD demographic data, and/or all the weighting schemas Nielsen historically used to adjust for underrepresented portions of the population not statistically balanced in any of its samples -- never mind all the arcane edit rules and adjustments in the fine print of its methodologies -- they were all just various forms of modeled data.

    https://en.wikipedia.org/wiki/Statistical_model

    But let's focus on the modern day standard of Big Data + panel, which is what Nielsen, and most alternative currencies are using, and what the MRC recommends, as well.

    It is absolutely modeled data, because it uses Big Data sources to infer the demographic characteristics of the audience estimates generated by its panel respondents. (Kind of like what Nielsen used to do with the NAD, but with much BIGGER scale.)

    It's also modeled in many other ways, including its use of a proprietary identity panel utilizing "virtual IDs," and let's not forget about VPVH -- or co-viewing adjustment factors.

    I'm honestly not expert enough to walk you through how all of the other audience "estimates" methodologies are generated today, but I regularly talk to enough of the experts to know that most, if not all, utilize at least some form of modeling in their outputs.

    I mean, the MRC began accounting for this as early as its 2001 Minimum Standards requirements, citing the need to disclose "statistical models" to its auditors.

    The MRC's more recent 2019 "Cross-Media Measurement Standards," which really is the current industry standard for television audience measurement, references "models" or "modeling" 65 times: 

    https://www.mediaratingcouncil.org/sites/default/files/Standards/MRC%20Cross-Media%20Audience%20Measurement%20Standards%20%28Phase%20I%20Video%29%20Final.pdf

    The MRC's most recent 2022 "Outcomes and Data Quality Standards," which is the likely current standard for the not-too-distant future of most media measurement, references "models" or "modeling" 92 times.

    https://www.mediaratingcouncil.org/sites/default/files/Standards/MRC%20Outcomes%20and%20Data%20Quality%20Standards%20%28Final%29.pdf

    The reality is that most of the biggest and most promising audience estimating methodologies -- whether it is Nielsen's Big Data + panel, or the ANA's soon-to-be-deployed Project Aquila -- rely on modeling audience estimates based on calibration panels.

    Media audience estimates have always been somewhat modeled, but now they are virtually completely modeled.

  4. Joshua Chasin from KnotSimpler replied, August 27, 2025 at 11:24 a.m.

    I don't think Podunk is a PPM market.

  5. Ed Papazian from Media Dynamics Inc, August 27, 2025 at 11:52 a.m.

    Joe, based on your interpretation of "modelled" data, every survey that we have ever had is "modelled," so this great change is nothing new at all. Going back to the old Nielsen service of the 1970s, they used a national meter panel to estimate how many homes were tuned in--but these findings were weighted in various ways to account for homes that dropped out of the panel due to faulty meters or TV sets out of order. Then they applied viewer-per-home factors from a household diary panel--also "modelled" for sample balancing purposes--to the meter home findings to get "viewers" by sex, age, etc. So the whole operation was "modelled". The same point applied to the Arbitron local market TV and radio ratings, to the Simmons and MRI magazine readership studies, etc. I could go on and on. Not one survey I have ever seen--and there have been many--ever used only what its sample reported without statistical adjustments--aka "modelling".

    As for media mix modelling, there are pros and cons. Many MMM exercises fail to use all of the required data--especially about ad impact and the effects of competitive brand activity, either because they don't have it or they don't know what to do with it. So this is hardly a foolproof method of evaluating how a brand allocates its ad dollars by media. What's more, if you look at what many brands actually do with their media dollars, they are running counter to what the MMM folks would be telling them. 

  6. Joe Mandese from MediaPost Inc., August 27, 2025 at 11:59 a.m.

    @Ed Papazian: Setting the record straight on your most recent comment, because it is NOT my interpretation that every survey that we've ever had is modeled. Just the ones that are modeled. That includes the ones that utilize mathematical equations and statistical assumptions to make real-world conclusions.

    What I'm saying is that some experts believe most of the numbers we use in the media industry are now derived from at least some form of modeling, and I believe that to be the case. It's okay if you don't agree with that. I'm curious what others think about that thesis, because I'm working on other stories and commentaries that relate to it.

  7. Ed Papazian from Media Dynamics Inc, August 27, 2025 at 12:49 p.m.

    Joe, again setting the record straight: using the definition you employed, I agree that all of the data we use now is manipulated statistically in some manner. What I'm saying is that this has always been the case, but I'd like to hear one of our "experts" cite a media study which took exactly what its sample reported as is and published the findings. I can't think of one--but I may be wrong.

  8. Joe Mandese from MediaPost Inc., August 27, 2025 at 12:54 p.m.

    @ Ed Papazian: Great, so you agree then.

  9. Ed Papazian from Media Dynamics Inc, August 27, 2025 at 1:11 p.m.

    Yep, based on the definition that you cited, Joe. I'm also stating that this is nothing new.

  10. Joe Mandese from MediaPost Inc., August 27, 2025 at 1:16 p.m.

    @Ed Papazian: Me too. I only pointed out that I was struck by the way Kelly Abcarian described that during her recent talk at the CIMM East summit. It got my wheels turning.

  11. Jack Wakshlag from Media Strategy, Research & Analytics replied, August 27, 2025 at 1:56 p.m.

    Those who understand the methodology know that statistical models and adjustments have always been part of creating estimates. We seem to have gone a level higher in acceptability when the business started accepting fusion -- and though not perfect, it was good enough. But saying all estimates are modeled seems to open the door for any and all "models." That is where we need to be careful. Black box models, lack of transparency, shiny objects, etc. need careful scrutiny by accepted players like the MRC and customers/users.

  12. Howard Shimmel from Janus Strategy & Insights, LLC, August 27, 2025 at 3:48 p.m.

    Ed, I think there's way more modeling done than the market acknowledges. With Nielsen's People Meter service, since they began leveraging the Set Meter panel, they had to model age/sex demos of those Set Meter households. With Big Data offerings from Nielsen/VideoAmp/Comscore, there's a lot more demographic modeling -- modeling that also needs to correct for known issues like set-top box on/off, and ACR data typically only measuring one set per home.

  13. Ed Papazian from Media Dynamics Inc, August 27, 2025 at 6:55 p.m.

    Agree, Howard and Jack. My only point was that there's a lot of talk about simulated or synthetic data going around, implying that any measurement is no longer needed. That's nonsense. You need some form of measurement to start with, otherwise you are playing with fake data.
