Commentary

Don't Leave Any Demo Behind In Alt Measurement Movement

  • by Dave Morgan, Featured Contributor, January 27, 2022
The tectonic plates of media measurement are moving, as digital media behaviors upend decades-old approaches to measuring the world of TV audiences and advertising, giving impetus to the rise of the alt measurement movement, one of the biggest stories in our industry today.

I, for one, can’t wait to see sex- and age-delineated gross rating points take a back seat to more granular audience metrics for TV ad targeting and measurement.

It’s great to see Nielsen move forward with its next generation Nielsen One product. And it’s refreshing to see other alternative measurement suppliers doing deals with media owners and ad buyers to complement or supplant TV advertising’s historic primary and secondary currencies, as audiences continue to fragment across broadcast, cable, satellite and, most importantly, fast-growing streaming channels.

Connected, smart TVs are helping power many of these new measurements, and the availability of real-time viewing data from many tens of millions of viewers is a huge boost to the industry. However, it is important that we don't let the scale of this data blind us to some of its critical biases, particularly when we are talking about how it can help us better value and allocate the $80 billion or so of advertising spend in the U.S. this year on television and streaming properties.


The use of these smart TV datasets in TV advertising currencies is game-changing, but will need to be balanced and hybridized with significant amounts of viewing data from households that don’t have fixed broadband Internet at home -- the “return path” for the collection of smart TV data.

Pew research informs us that Black, Hispanic and lower-income households are significantly less likely to have fixed broadband in their homes. Many of them don’t have cable or satellite TV either, relying on broadcast TV signals, which have actually become much more robust over the past 10 years with the introduction of digital antennas prebuilt into new TVs, and many dozens of new multicast networks that offer programming free to air.

Thus, not only do many non-Internet households get their local news and sports for free over antennas, they also get alternative-language programming. As our industry finally pays attention to the problems of decades of under-investing in programming and advertising for Black, Hispanic and lower-income populations, it is critical that we don't forget them in our embrace of alternative measurements.

This isn’t hard to solve. Nielsen has long balanced its panel for appropriate representation of Black, Hispanic and lower-income households. So can those who want to supplement or unseat its measurements.
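The balancing idea above can be sketched in a few lines. This is a minimal, illustrative post-stratification example with made-up numbers (both the panel counts and the population targets are hypothetical, not Nielsen's or Pew's figures): each segment gets a weight equal to its population share divided by its panel share, so under-represented households count proportionally more.

```python
# Minimal post-stratification sketch (all numbers hypothetical).
# weight(segment) = population share / panel share

panel_counts = {"broadband": 800, "no_broadband": 200}        # hypothetical panel
population_shares = {"broadband": 0.77, "no_broadband": 0.23}  # hypothetical targets

total_panel = sum(panel_counts.values())
weights = {
    seg: population_shares[seg] / (panel_counts[seg] / total_panel)
    for seg in panel_counts
}

# No-broadband homes are under-represented in this toy panel,
# so they get a weight above 1.
print(weights)  # {'broadband': 0.9625, 'no_broadband': 1.15}
```

As the commenters below note, weighting like this fixes representation on paper only: if the no-broadband homes you did recruit don't view like the ones you missed, upweighting them scales up the wrong behavior.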

What do you think?

12 comments about "Don't Leave Any Demo Behind In Alt Measurement Movement".
  1. Jack Wakshlag from Media Strategy, Research & Analytics, January 27, 2022 at 2:24 p.m.

    Not a particularly difficult problem to solve but it must be done thoughtfully and carefully. It requires more than weighting that creates the appearance of representation but doesn't actually do so. It is costly but must be done to fairly represent otherwise excluded and protected groups. Large samples reduce error, but not bias. 

  2. Howard Shimmel from datafuelX, Inc., January 27, 2022 at 2:53 p.m.

    Dave, you should take a look at the Best Practices in Integrating ACR and Set Top Box Data that Gerard Broussard and I did for CIMM. We reported the distributions of all relevant segments by key demos, including race and ethnicity. Your point is accurate -- all of these big data sources have specific unique skews.

  3. Dave Morgan from Simulmedia replied, January 27, 2022 at 2:58 p.m.

    So well stated Jack ... big samples can fix error, but not bias!

  4. Ed Papazian from Media Dynamics Inc, January 27, 2022 at 4:01 p.m.

    Dave, sample balancing is an old and somewhat useful method of reweighting a panel or sample that has many segments that seem over- or under-represented. However, it does not guarantee that a proper fix has been made, as the basic assumption is that whoever you got -- by sex, age, income, race, etc., and combinations such as young but also upper-income -- is representative of all such persons; you just happened to get too few or too many of them. This is not necessarily true. So the real trick is to ensure, by whatever means are required, that you are really sampling representative cross-sections of each group and, better yet, each cell. This is a very difficult thing to accomplish -- especially for panels where people are constantly dropping out -- or are dropped -- and replaced with "similar" substitutes.

  5. Dave Morgan from Simulmedia replied, January 27, 2022 at 4:27 p.m.

    Thanks Howard, I will check it out!

  6. Dave Morgan from Simulmedia replied, January 27, 2022 at 4:28 p.m.

    Great point Ed. Yes. It will take more than just balancing to best represent these populations, since households without broadband will also have very different viewing behaviors.

  7. Jack Wakshlag from Media Strategy, Research & Analytics replied, January 27, 2022 at 4:30 p.m.

    Agree with Ed here, but it is necessary to do things to bring in disenfranchised and disadvantaged groups that fall short. Differential sample treatments are required. That takes effort and money. I don't see that anywhere else but at Nielsen.  Certainly not at the companies that grade their own homework and share what they find. 

  8. John Grono from GAP Research, January 28, 2022 at 5:13 p.m.

    I concur with all the above posts. 'Determinants of Usage' is also a very valuable tool in increasing precision in such forms of research. The issue then becomes which 'usage' characteristics you include.

    But taking another view, advertisers are increasingly moving to 'pin-point' marketing. The smaller the pin-head, the more stress is put on the measurement. Costs soar -- which is the enemy of advertising budgets.

    But if you read (and agree with) Byron Sharp's work in 'How Brands Grow', there is a difference between marketing and sales. The narrower your laser-like advertising, the fewer message contacts are made. In essence you get more cost-effective sales, but fewer sales overall. Many 'pin-point' campaigns focus on existing users, who then tend to shop around for the lowest price. To grow your brand you need to communicate with the 'new user' you never knew would be interested in your brand.

  9. Ed Papazian from Media Dynamics Inc, January 28, 2022 at 6:52 p.m.

    John, your point is well taken. When advertisers zoom in on certain mindsets within their assumed customer base, they wind up paying huge CPM premiums to media sellers who are willing to cooperate. The problem is simple: if you pay 35% more per potential viewer and improve your targeting by only 20%, you are actually getting fewer targeted eyeballs per dollar than before, when you weren't being so sophisticated. Worse, having made the change in direction in Year One, when the next year arrives you will probably be buying the same more targeted media -- hence no improvement in targeting -- but the media sellers will keep asking for higher CPMs, which means that you are in even worse shape. By Year Two you may be paying 45% more per set of eyeballs than before but still holding even at a 20% improvement in targeting. You might counter by saying, "What if I gain 100% in targeting ability at a CPM cost hike of only 35% in Year One and another 10% in Year Two?" That's possible -- but very unlikely. And what's to stop the media sellers from doubling their CPM demands?
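    Ed's arithmetic can be sketched directly, using his own hypothetical figures (a 20% targeting gain against a 35% and then 45% CPM premium): relative targeted reach per dollar is the targeting gain divided by the cost increase.

```python
# Worked version of the comment's arithmetic (hypothetical figures from the comment).
# Targeted eyeballs per dollar, relative to the untargeted baseline (1.0).

def targeted_eyeballs_per_dollar(targeting_gain, cpm_premium):
    """(1 + targeting gain) / (1 + CPM premium), vs. a baseline of 1.0."""
    return (1 + targeting_gain) / (1 + cpm_premium)

year_one = targeted_eyeballs_per_dollar(0.20, 0.35)  # +20% targeting, +35% CPM
year_two = targeted_eyeballs_per_dollar(0.20, 0.45)  # same targeting, +45% CPM

print(round(year_one, 3))  # 0.889 -- about 11% fewer targeted eyeballs per dollar
print(round(year_two, 3))  # 0.828 -- the gap widens in Year Two
```

    Any ratio below 1.0 means the "sophisticated" buy delivers fewer targeted impressions per dollar than the unsophisticated one, which is exactly the trap the comment describes.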

  10. Scott McKinley from Truthset, January 31, 2022 at 4:48 p.m.

    The dirty secret is that targeting data - after it's gone through all the hops, skips, and jumps of the ad tech ecosystem - isn't accurate enough to justify its cost, much less to conduct credible measurement worthy of displacing legacy solutions. If better measurement is a team sport, so is accuracy. Until the ecosystem figures out how to validate the accuracy of demo data, the legacy methods of extrapolating from panel (and its failure to keep up with consumer behavior and media fragmentation) will remain in place. 

    Anybody who actually stops to look at how data degrades across ID syncs, device linkages, and various expansion techniques will quickly come to the same conclusion. 

  11. Dave Morgan from Simulmedia replied, January 31, 2022 at 4:54 p.m.

    Totally agree Scott. The industry is being killed by inaccurate targeting data ... folks need to better understand what you're doing at Truthset to fix this. So, so important.

  12. John Grono from GAP Research, January 31, 2022 at 5:03 p.m.

    Plus one, Scott & Dave.

    In marketing terms ... too many brands are paying more to get less.   They are getting less lightning in each bottle.
