The Next Gen Of Audience Measurement Will Require Big Data

It has become a cliché, but Big Data is "the oil of the advertising industry" -- and the phrase has never been more apt.

Nowadays, virtually every component of the ad industry has been impacted by Big Data, with the exception of national television audience measurement. That will soon change.

For example, to remain competitive and grow revenue, ad agencies rely on Big Data to better analyze their clients' first-party data, enhance their digital media capabilities and uncover insights that lead to better media selections.

Over the past five years, every large holding company has invested billions to acquire a data analytics company or has launched its own Big Data analytics department.

At the same time, traditional media companies have also been investing in Big Data, to keep the flow of ad dollars from being reallocated to digital behemoths.

These investments include deeper analytics to promote their lower-funnel capabilities. Both agencies and programmers have had advanced-TV departments for years.

Big Data now impacts virtually every traditional industry, from banking/finance and real estate to health care and agriculture.

In next year’s midterm elections, candidates will use Big Data to geotarget persuadable voters with a particular campaign issue.

In baseball, the "Moneyball" approach has permeated every team; clubs now use Big Data for scouting reports and to adjust their defensive alignment for each hitter. And so on.

When Nielsen came under criticism from the VAB (Video Advertising Bureau) last April, its national People Meter panel of 40,000 households came into question.

During the COVID-19 lockdowns, Nielsen reported lower weekly reach and lower TV usage, defying the conventional wisdom that home-bound audiences would watch more. The reported decline in viewing resulted in a loss of ad revenue.

At times, programmers unhappy with their ratings have been critical of Nielsen, but typically these complaints soon fade away. This time, however, the conversation quickly escalated into a debate over whether a nationwide panel is suitable in a fragmented, on-demand video landscape and whether Big Data sources should be used instead.

For example, Comscore -- which passively aggregates return-path data from millions of households -- reported no declines in TV usage during the lockdown.

In addition, in recent years a number of ad-tech companies reliant on large data sets have emerged, providing their clients with programming and consumer behavior data.

Big Data has been used for local TV ratings in smaller markets for several years. Nielsen's TV diaries became increasingly unreliable as cooperation rates fell, resulting in "zero cell" ratings and/or unstable (and unusable) ratings.

Advertisers even began using other ad-supported media as a replacement for local television. Some stations and advertisers in smaller markets began using Comscore, with its larger sample, as their negotiating currency.

In response, Nielsen announced the long-awaited replacement of TV diaries with a new measurement tool called Viewer Assignment.

With Viewer Assignment, Nielsen uses return-path data (RPD) -- tuning data from in-market MVPD households. For demographic ratings, Nielsen uses algorithms to identify "look-alike" households in its People Meter sample within the region. The People Meter viewing is then integrated with the tuning data from each RPD home. Nielsen's move to a larger sample was applauded by TV stations, ad agencies and advertisers.
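Nielsen has not published the internals of its assignment algorithm, but the "look-alike" idea can be sketched as a simple nearest-neighbor match between households. Everything below -- the household features, demo labels and matching rule -- is an illustrative assumption, not Nielsen's actual method:

```python
# Hypothetical sketch of "look-alike" viewer assignment.
# An RPD home supplies tuning data but no person-level demographics;
# we assign it the viewing of the most similar People Meter panel home.
import math

# Panel homes: known household traits plus metered person-level viewing
# for a given program (demo labels are illustrative).
panel_homes = [
    {"id": "P1", "hh_size": 2, "age": 61, "viewers": ["M55-64", "F55-64"]},
    {"id": "P2", "hh_size": 4, "age": 38, "viewers": ["M35-44", "C6-11"]},
    {"id": "P3", "hh_size": 1, "age": 27, "viewers": ["F25-34"]},
]

def distance(rpd_home, panel_home):
    """Euclidean distance over the shared household features."""
    return math.hypot(
        rpd_home["hh_size"] - panel_home["hh_size"],
        (rpd_home["age"] - panel_home["age"]) / 10.0,  # crude age scaling
    )

def assign_viewers(rpd_home):
    """Attach the nearest panel home's viewers to an RPD tuning record."""
    best = min(panel_homes, key=lambda p: distance(rpd_home, p))
    return best["id"], best["viewers"]

# An RPD home known (from tuning data) to have watched the program,
# with household traits taken from the subscriber file.
rpd_home = {"id": "R1", "hh_size": 2, "age": 58}
match_id, demo_viewers = assign_viewers(rpd_home)
print(match_id, demo_viewers)  # the nearest match supplies the demographics
```

In practice a system like this would match on many more variables (geography, income, device ownership) and would weight matches probabilistically rather than picking a single nearest home, but the principle -- borrowing panel demographics for panel-less tuning data -- is the same.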

While RPD offers a far more robust database than a panel, there are some factors to consider. The number of MVPD subscribers has been declining each year, and the decline shows no sign of slowing. Younger adults are the demographic least likely to have an MVPD subscription. In addition, ethnic groups have historically been underrepresented among pay-TV subscribers.

Another Big Data source that can be used for national TV measurement is connected TV.

Leichtman Research reports that 80% of all U.S. households have at least one connected TV device. A new study from Magnite says 40% of U.S. TV households watch CTV exclusively. In just two years, Vizio, an OEM for "smart TVs," has noted a sizable shift in viewers from linear TV to streaming video.

More importantly, CTV users offer a better representation of the video audience across younger age groups and ethnicities.

There has been broad industry consensus that using only panels is no longer acceptable.

Although the pandemic brought panels' limitations to light, panel measurement had, in reality, been inadequate for years.

Many thought leaders agree that in today's video landscape -- with streaming video, mobile, on-demand, addressable and other ways to view video content -- Big Data, with its robust sample, is needed.

In addition, Big Data can provide better targeting and the ability to measure the impact of ads on sales. Big Data sets and the granularity they provide have become a necessity when negotiating ad rates.

Accomplishing these goals will require accessing Big Data from a number of sources: RPD from MVPDs, automated content recognition (ACR) data from smart TVs, CTV devices, programmers and -- by breaking through the "walled gardens" -- digital media (including social), among other providers.

As the industry moves toward audience-based buying and away from demographics, the larger samples from Big Data will allow advertisers to integrate their own first-party data to build a better-targeted media schedule.

Panels should still play a role in audience measurement, but the sample needs to be far more robust than it is now.

Going forward, the next generation of audience measurement will also need collaboration and cooperation among key stakeholders beyond the Big Data owners, including programmers, measurement companies and advertisers.

Audience measurement will need to be scalable, privacy-compliant, interoperable and transparent, with a goal of Media Rating Council accreditation. It must be consistent across nonlinear and linear viewing, and it must be standardized yet flexible, capable of measuring, on a timely basis, any new media opportunity -- such as ATSC 3.0 (a.k.a. "Next Gen TV"), the rollout of 5G, or the next social media phenomenon that affects viewing behavior.

Like any audience measurement approach, Big Data is not perfect -- but it will address the needs of the advertising community better than what is being used now. There are other issues to iron out: a more robust Big Data sample still needs to be representative of the U.S.

It will also require licensing data from Big Data providers, which could be costly.

A complete overhaul of 70 years of practice, touching every facet of the ad industry, will be an adjustment -- including the possibility of several competing audience measurement providers. It could take several years to accomplish, but when completed, it will be well worth the wait.

2 comments about "The Next Gen Of Audience Measurement Will Require Big Data".
  1. John Grono from GAP Research, October 28, 2021 at 8:21 p.m.

    I agree, Brad. In fact big data is being used, but not universally.

    However, big data also has its shortfalls. Its strength is that it provides deeper and more accurate audience data (as long as the measurement is at the user-side and not the delivery-side) for the broadcaster/streamer/publisher etc.

    So that is all well and good on the seller's side.

    But on the buyer (i.e. advertiser) side what you get is a series of isolated vertical data sets. As an advertiser or buyer what is needed is horizontal cross-media measurement (i.e. de-duplicated measurement).

    Where I see the opportunity for robust media panels is that they can, and do, provide duplication data. In essence, you meld the 'traffic' side (big data) with the 'usage' side (panel), to scale the data down to provide usable/believable estimates of the de-duplicated audience who saw a campaign.

  2. Ed Papazian from Media Dynamics Inc, October 29, 2021 at 8:04 a.m.

    Brad, while big data in many cases represents real data---auto sales, for example, as well as many forms of shopping behavior, banking, etc.---when it comes to measuring audiences---especially for ads---there are many issues. Chief among these are determining who was "watching" and whether the person or persons assumed to be viewing program content actually saw or were attentive to the commercials. What we are now seeing are attempts by Nielsen and Comscore to statistically estimate who is watching the programs in smaller markets---which is an interesting approach---if it passes the sniff test. But when we get to the question of who watched the commercials, we are, again, in difficulty.

    It's hard for me to see how attempts to attribute "outcomes"---often meaning sales results, not just clickthroughs to an advertiser's website or other "soft" manifestations of response---will succeed when the underlying "commercial audience" data has error margins of 50-100% between what was assumed to happen and what actually took place. That is why I, Tony and a few others are engaged in this, ultimately losing, battle against the acceptance of device usage as a surrogate for viewing. The evidence is overwhelming. It's a very sloppy and poor predictor, with all sorts of variations based on commercial clutter, program type, viewer demos, time of day and other factors---all pulling in many directions. Because they do most of the funding, the time sellers will, no doubt, win this battle and get their big numbers, and we will have lost a great opportunity to really move forward by incorporating attention metrics into the equation.
