The broadcast networks are the driving force in maintaining the upfront, and are its main beneficiaries. The upfront system creates an artificial sense of supply and demand, which enables broadcast networks to keep raising prices despite declining ratings.
There are so many cable networks that the supply of national commercial rating points available to advertisers is virtually unlimited. But the tradition of holding separate broadcast and cable upfronts (with broadcast generally going first) creates the perception that only a limited amount of inventory is available.
Cable networks benefit because their rate increases are generally tied to what the broadcast networks get. And the system enables many cable networks to package less desirable product with the programs that advertisers actually want. Most cable networks have a small portion of desirable programming, but advertisers are generally forced to buy the lower-rated stuff in order to get the smaller amount of the good stuff (for example, you can’t go to a network like ION and just buy “Criminal Minds”).
There are few independent cable networks left. Most are owned by the Big Five media companies — Comcast, Disney, Viacom, 21st Century Fox, and Time Warner — which also own the broadcast networks. They can make deals, bundling their many properties together, and ensuring that their lowest-rated products get sold. Without an upfront, this would be much more difficult to accomplish.
And we should not forget that Wall Street likes to have an idea of how much revenue a media company will take in during the next broadcast year — although the upfront used to be a much better indicator of this 20 or 30 years ago than it is today.
What about advertisers? There is only one reason advertisers have ever benefited from an upfront: audience guarantees. By committing to buy so much inventory upfront, advertisers are guaranteed certain CPMs and rating points. If the networks fall short, advertisers receive additional inventory to “make good” on the guarantee.
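The guarantee mechanics described above come down to simple CPM arithmetic. As a rough sketch of how a make-good shortfall might be computed (the dollar figures and CPM are hypothetical, since the article quotes none):

```python
# Illustrative sketch of an upfront audience guarantee and "make-good"
# calculation. All figures below are hypothetical examples.

def make_good_impressions(spend, guaranteed_cpm, delivered_impressions):
    """Return the impression shortfall the network owes the advertiser.

    CPM is the cost per thousand impressions, so a commitment of
    `spend` dollars at `guaranteed_cpm` implies
    spend / guaranteed_cpm * 1000 guaranteed impressions.
    """
    guaranteed = spend / guaranteed_cpm * 1000
    return max(0.0, guaranteed - delivered_impressions)

# Suppose an advertiser commits $2,000,000 at a guaranteed $25 CPM:
# 2,000,000 / 25 * 1000 = 80,000,000 guaranteed impressions.
# If the network delivers only 70,000,000, it owes the rest as make-goods.
owed = make_good_impressions(2_000_000, 25.0, 70_000_000)
print(f"{owed:,.0f} impressions owed as make-goods")  # 10,000,000
```

In practice the shortfall is settled by scheduling additional ("make-good") commercial units rather than by refunding cash, which is why falling ratings cost networks inventory rather than revenue.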
Guarantees were once more important, because the gap between what the networks estimated for their future performance and what advertisers and their media agencies estimated was often significant. As increased competition and a fragmented media environment have forced networks to be more realistic, this is no longer the case. Guarantees have become less and less relevant over the years, and now merely provide an illusion that advertisers are getting something in return for committing so much money upfront.
There was a time when advertisers needed to buy NBC’s Thursday night “Must See TV” lineup, and other select programming, whose high ratings and critical acclaim made them extremely desirable. To many advertisers, it was also a status thing to have a lot of commercials in those programs. Those days are long gone. Except for AMC’s “The Walking Dead,” there is virtually nothing on the air that is so high-rated or critically acclaimed that it is really a “must buy.”
We are now hurtling headlong into a big data world. While age/sex data is reasonably projectable and trendable over time, anyone who’s been in research for more than a year or two knows that product purchase and usage data is much less so. And linking any of that data to television viewing habits is even less projectable. Time-shifted, binge and cross-platform viewing are not particularly consistent from year-to-year. Anyone making buying decisions based on these types of data sets will need guarantees to avoid getting killed.
Whether or not all this means we still need an upfront season may not really matter much. It doesn’t seem likely to go away anytime soon.
Steve, yes, it's true that the upfront allows the broadcast TV networks to control, in large part, the pricing of national TV ad time. And it's also true that many cable channels can load up on commercial clutter anytime they want to create additional GRPs, since nobody seems to care about this. However, there's more to the upfront than that.

While you are probably correct that the supply of "linear TV" GRPs is not as finite as is assumed, I would counter that the supply of "quality" content on linear TV is, in fact, limited. It's the "junk" tonnage that is readily available. While some advertisers, mainly those selling low-interest commodities like toilet paper, ketchup, toothpaste, etc., don't care all that much about the kinds of shows their ads appear in, this is not the case with many other categories: financials, tech products, corporate image campaigns, travel, fashion, beer and quite a lot of others. Such advertisers are willing to pay much higher CPMs to showcase their ads in what they perceive to be "quality" content, and they can't be assured of having access to such content if they bought only in the scatter market.

Also, many advertisers believe that endless reruns, low-budget talk shows, talking-head interviews, "unscripted" reality shows, etc. are not only to be avoided, but that such heavily ad-cluttered environments have a negative impact on their ad exposure, hence on ad impact. Such advertisers would rather get one spot in a broadcast network drama than five or ten spots in "The Jerry Springer Show," no matter what the relative CPMs are. So I both agree with you and disagree, as always.