The broadcast networks are the driving force in maintaining the upfront, and they are its main beneficiaries. The upfront system creates an artificial supply-and-demand imbalance, which enables broadcast networks to keep raising prices despite declining ratings.
There are so many cable networks that advertisers have a virtually unlimited supply of national commercial rating points. But the tradition of holding separate broadcast and cable upfronts (with broadcast generally going first) creates the perception that only a limited amount of inventory is available.
Cable networks benefit because their rate increases are generally tied to what the broadcast networks get. And the system enables many cable networks to package less desirable product with the programs that advertisers actually want. Most cable networks have only a small amount of desirable programming, and advertisers are generally forced to buy the lower-rated shows in order to get it (for example, you can’t go to a network like ION and buy just “Criminal Minds”).
There are few independent cable networks left. Most are owned by the Big Five media companies — Comcast, Disney, Viacom, 21st Century Fox, and Time Warner — which also own the broadcast networks. They can make deals, bundling their many properties together, and ensuring that their lowest-rated products get sold. Without an upfront, this would be much more difficult to accomplish.
And we should not forget that Wall Street likes to have an idea of how much revenue a media company will take in during the next broadcast year — although the upfront used to be a much better indicator of this 20 or 30 years ago than it is today.
What about advertisers? There is only one reason advertisers have ever benefited from an upfront: audience guarantees. By committing to buy so much inventory upfront, advertisers are guaranteed certain CPMs and rating points. If the networks fall short, advertisers receive additional inventory to “make good” on the guarantee.
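The make-good mechanism described above is simple arithmetic: the advertiser’s committed spend at a guaranteed CPM implies a number of guaranteed impressions, and any delivery shortfall is owed back as additional inventory. A minimal sketch, with all dollar figures and audience numbers invented for illustration:

```python
def makegood_impressions(spend: float, guaranteed_cpm: float,
                         delivered_impressions: float) -> float:
    """Return the impressions a network still owes an advertiser.

    Spend at the guaranteed CPM (cost per thousand impressions) implies
    a number of guaranteed impressions; any delivery shortfall must be
    "made good" with additional inventory at no extra charge.
    """
    guaranteed_impressions = spend / guaranteed_cpm * 1000
    shortfall = guaranteed_impressions - delivered_impressions
    return max(shortfall, 0.0)  # no make-good owed if delivery met the guarantee

# Hypothetical example: $5M committed at a $25 CPM guarantees 200M impressions.
# If the network delivers only 180M, it owes 20M make-good impressions.
owed = makegood_impressions(5_000_000, 25.0, 180_000_000)
# owed == 20_000_000.0
```

In practice the accounting runs across a whole schedule and dayparts rather than a single buy, but the principle is the same: the guarantee shifts audience-delivery risk from the advertiser to the network.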
Guarantees were once more important, because the gap between what the networks estimated for their future performance and what advertisers and their media agencies estimated was often significant. As increased competition and a fragmented media environment have forced networks to be more realistic, this is no longer the case. Guarantees have become less and less relevant over the years; now they merely provide an illusion that advertisers are getting something in return for committing so much money upfront.
There was a time when advertisers needed to buy NBC’s Thursday night “Must See TV” lineup, and other select programming, whose high ratings and critical acclaim made them extremely desirable. To many advertisers, it was also a status thing to have a lot of commercials in those programs. Those days are long gone. Except for AMC’s “The Walking Dead,” there is virtually nothing on the air that is so high-rated or critically acclaimed that it is really a “must buy.”
We are now hurtling headlong into a big data world. While age/sex data is reasonably projectable and trendable over time, anyone who’s been in research for more than a year or two knows that product purchase and usage data is much less so. And linking any of that data to television viewing habits is even less projectable. Time-shifted, binge and cross-platform viewing are not particularly consistent from year to year. Anyone making buying decisions based on these types of data sets will need guarantees to avoid getting killed.
Whether or not all this means we still need an upfront season may not really matter much. It doesn’t seem likely to go away anytime soon.