Many years ago, when most usable television research data was in hard-copy reports released by Nielsen, there was a monthly book called the Program Cumulative Audience Report (PCA). Its various
nuggets of information included data on how many episodes per month the average viewer watched of every network TV show on the air.
Back then, even with far fewer choices (no original scripted cable series, no Netflix, Hulu, or Amazon Prime, and no regular series on premium cable), people watched only slightly more than two out of four episodes per month of the most popular series. There were no DVRs on which to store programs and no on-demand services for catching up on missed episodes. If you missed an episode of your favorite show, you had to wait for the summer repeats.
When DVRs started to become a thing, I wrote an article speculating that they would cause television usage to go up as people started to watch that third or fourth episode per
month of their favorite shows.
That has, to a large degree, been the case. The advent of on-demand programming has also helped increase viewership, albeit to a smaller degree.
Once Nielsen started including DVR homes in its sample, roughly 95% of all DVR playback turned out to occur within seven days of the original broadcast. This is why one of the main debates was between C3 and C7 (commercial ratings that include playback within three or seven days of the original telecast, respectively). With quick analysis of program performance and television buys being desirable, the remaining 5% could be ignored.
Today, there are a lot more people who binge-watch
several episodes of a weekly series. This means, of course, that there is probably a tremendous amount of television viewing that is not being counted in Nielsen’s currency measurement
(even if C7 is used instead of C3).
I conducted a survey of my Facebook friends (a surprisingly diverse group), just as a fun exercise. I asked four simple questions about their DVR, on-demand,
and OTT usage. Twenty-five percent of respondents did not have a DVR (the national average is about 50%). Here are the results.
On average, respondents currently have eight programs queued on their DVR (ranging from three to 20), and four shows with two or more episodes stored (ranging from one to 11). Two-thirds watched at least one program on demand in the past week. Two-thirds also watched Netflix, Hulu, or Amazon Prime in the past week, spending an average of 5½ hours doing so (ranging from one to 10 hours).
This seems like a lot of viewing that is not being captured by currency measurement. If you have eight programs on your DVR, you are likely not watching all of them within three days of their initial broadcast. If you have four weekly programs with two or more episodes stored, by definition you are watching many of them more than a week after their initial broadcast.
Of course, this is far from a scientific survey. I've always believed, however, that what people do, what they watch, how much they watch, and which devices they own are correlated not only with age and sex, but also with things like geography, income, education, and presence of kids. But once people have a device, and once they are watching a program or video on it, how they watch TV is mostly a function of age and sex.
In other words, there is no reason to think a 25-year-old in Wisconsin who is watching “The Walking Dead” on her DVR is fast-forwarding through commercials at a different pace than her counterpart in New York (if anyone has research that shows otherwise, I'd love to see it). So a sample that represents the country at large is much more important for measuring some things than for measuring others.
If nothing else, this should be a conversation starter.