Study Finds Online Video Usage Dramatically Overstated

The amount of time Americans spend watching online video is vastly overstated, according to the findings of some highly regarded research made public Tuesday. The disclosure, which is likely one of the more controversial findings being mined from an ambitious piece of academic research that actually observed how people spend their time consuming media, was made during one of a series of so-called "collaborative alliance" meetings hosted by Havas media shop MPG for the advertising and media industry in New York.

"This may be the first study to document the dramatic overstatement of online video and mobile video," said Jim Spaeth, one of the founders of Sequent Partners, which collaborated with Ball State University's Center for Media Design on the Video Consumer Mapping Study on behalf of the Nielsen-funded Council for Research Excellence. The project, which cost $3.5 million to field, directly observed how people spent their day using media, found that while growing rapidly, online video and mobile video still account for a small fraction of the amount of time Americans spend watching all forms of video content, including live TV programming, time-shifted television, DVDs, video games, etc.

The researchers previously disclosed findings showing that traditional "live" television still accounts for more than two-thirds of the time Americans spend watching video content each day, and that online video represents less than 1%. The new findings unveiled Tuesday indicate that even the relatively small amount of time Americans spend watching online video has been, on average, grossly overstated by conventional forms of media research and audience measurement.

Conversely, Sequent's Spaeth said traditional TV viewing has been "pretty drastically under-reported" by research that asks people how they consume video. The reason, he said, is that research based on how people perceive they consume media isn't nearly as accurate as research that actually observes how they use it.

The ad industry has historically known about such "halo effects" and the fact that it is considered socially undesirable for people to report watching as much TV as they actually do. On the other hand, Spaeth said, people tend to over-report their online and mobile video consumption because "it is new and cool."

Spaeth and Mike Bloxham, director of insight at Ball State's Center for Media Design, are scheduled to reveal more previously unreleased details about online video from the study during a presentation and panel discussion at the upcoming OMMA Video conference June 16 in New York.

During MPG's meeting on Tuesday, Spaeth revealed other new insights from the study that he claimed actually "measure the future" of how people will consume video content. That aspect of the study relied on a method called "media acceleration," in which consumers were given substantial discounts - upwards of 50% - off the price of purchasing new consumer electronics equipment for their homes, and their media consumption patterns were observed before and after the new technologies were in place.

Spaeth said the No. 1 finding from that part of the study was that almost everyone who participated purchased a high-definition TV set - either their first, or a second one for their home - and that the adoption of HDTV generally led to greater usage of television initially, but that over time, that increased usage began to subside.

"There is an early indication that this may be a temporary effect," Spaeth said of HDTV's stimulus effect. But at least in the short run, he said, "Live TV viewing accelerated by more than twice as much among those people who acquired and HDTV."

12 comments about "Study Finds Online Video Usage Dramatically Overstated".

  1. David Sutula from 'peeps creative, June 3, 2009 at 9:25 a.m.

    This article cites the halo effect, that is - people perceive that their behaviors are different from what they actually are. So the subjects of this study 'think' that they were consuming more video online than they actually were. I would suggest that there is a heavy dose of 'Hawthorne effect' here as well. That is - how people report their own behavior is sometimes dramatically different than their actual behavior. People probably perceive online video consumption as younger, more hip and 'connected' than they actually are, so they make the decision, sometimes unconsciously, to report that they consume far more online media than they do. I think that this very effect is an actionable behavior by marketers. After all, don't people buy for the people that they think they are more readily than they buy for the people that they actually are? Even a guy like me who looks for survey errors like the Hawthorne effect is still buying jeans with a 36 waist :)

  2. Rob Frydlewicz from DentsuAegis, June 3, 2009 at 9:53 a.m.

    Similar to what this study found about HDTV usage, a number of years ago Nielsen reported the same about DVR users, i.e. new owners used it a lot more than those HHs who had owned one for a number of years.

  3. Douglas Ferguson from College of Charleston, June 3, 2009 at 9:53 a.m.

    If we are to believe the Hawthorne effect, then this widely touted study has limitations. Who's to say that it's accurate and the other studies are wrong? As with all research, one should look for consistent trends instead of a single end-all study, which is why all journalists should be required to take more math and learn to demonstrate proficiency in quantitative research methods.

  4. Douglas Ferguson from College of Charleston, June 3, 2009 at 9:56 a.m.

    Sorry for the double comment, but online video usage probably should not be studied using self-report or observational research: The actual data are logged by the online video sites themselves. Is not a timer and user log more useful than recollection or observational interpretation?

  5. Joshua Chasin from VideoAmp, June 3, 2009 at 10:11 a.m.

    Joe, I think it is important to note that the headline here might lead the reader to believe that, methodology-agnostically, online video usage is overstated. But at comScore, our Video Metrix service is based on hybrid measurement, including passive, empirical "metered" observation from a panel, merged with site-centric empirical counts of actual streams served, via beaconing. Thus, in Video Metrix, for example, a tool buyers and sellers use to understand the online video audience, which has absolutely no recall-based component at all, online video viewership is not, in fact, overstated.

  6. Monica Bower from TERiX Computer Service, June 3, 2009 at 10:27 a.m.

    "Study finds online usage dramatically overstated." For I don't know how long I have been responding that I spend anywhere from 30 - 50 hours per week online in Harris Poll and other surveys I might take. Though technically true, this is because I have a persistent connection (as does everyone who isn't on dial up) and the reality is I'm checking email or playing online games or surfing a limited number of sites - maybe six? that I hit up daily, and an even smaller percentage of time is spent on one-and-done sites I go to for one specific thing and never visit again. I contend that the problem is the wording of the question in both cases.

  7. Joe Mandese from MediaPost Inc., June 3, 2009 at 4:23 p.m.

    Lots of good comments here. I'd like to respond/add to a few.

    Re. Josh Chasin's point: Well made. I think Jim Spaeth was mainly talking about self-reported methods, but I think the other point is that there is value in benchmarking actual behavior via the observational method. It doesn't replace what you or others are doing with things like Video Metrix, which of course is sound, representative research of its kind.

    Re. Doug Ferguson, I am just a journalist and not a researcher, but my job is not to use mathematical skills to vet the quality of media research (that's up to organizations like the MRC, IAB, etc.). My job is to cover what they've found and report on that news, and let people like you weigh in. Sorry if I misled you into thinking otherwise.

    Re. Rob Frydlewicz's point, that is interesting, and ironically, one of the other findings that Jim Spaeth shared (but I did not report on) was that their data suggests the opposite: That, "Early adopters of DVRs spent much more time with DVR playback than more recent adopters." Interesting.

  8. Terry Heaton from Reinvent21, June 3, 2009 at 5:29 p.m.

    You know, Joe, I'm always intrigued by studies that seem to have as a mission the unringing of the Internet bell. Nielsen has a very big dog in this fight, and that has to be acknowledged before exploring any data. I'd need to know specifically whose research "overstates" online video usage and let them have the opportunity to respond before swallowing anything from a group with something to gain from such a broad statement, namely a company whose business depends on TV viewing via a box in the living room. As we say in "the biz," stay tuned.

  9. John Grono from GAP Research, June 3, 2009 at 6:47 p.m.

    I read Terry's comments with interest and felt compelled to provide comment.

    The Council for Research Excellence (CRE), which commissioned this research, consists of around 35 research professionals. Sure, Nielsen (who pay for it all) are there, as are the broadcasters. But so are the media agencies and the advertisers, such as Unilever, P&G, Kraft and Kimberly-Clark. The CRE then went outside of that group and commissioned Jim Spaeth (Sequent) and Mike Bloxham (Ball State University) to perform a rigorous ethnographic study.

    And what is Terry's take on this august group and the reporting of their findings? That they are trying to "unring the Internet bell". Just once I would like to see the same rigour applied to internet audience measurement. Yes, the same internet that doesn't even have an agreed set of definitions and procedures on how to measure itself, and has no agreed currency - may he who can report the biggest numbers win seems to be the war-cry. The internet is the new home of measuring traffic and passing it off as audience, having wrested that crown from Out-Of-Home who are working hard on audience models due soon. In my experience, most online sales and marketing people don't even realise that there is a difference between traffic and audience. They point to their server-log data and how huge the numbers are, then turn and criticise the robust research conducted that tries to provide audience data (yes, you Josh at ComScore and your compatriots over at Nielsen Online as well) and say "well they can't be right". This is "Research 101" stuff, and the gross majority of the internet business don't get that traffic does NOT equal audience. At the risk of repeating an example I have used before on here - when you aggregate the de-duplicated traffic data here in Australia you get a "uniques" figure of 45 million. Problem is that there are only 21.7 million of us. Josh may have similar data for the US that he could share, but I am sure it would be of the same magnitude.
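
    To put the duplication point in concrete terms, here is a minimal Python sketch with made-up visitor sets (hypothetical site names and IDs, not the Australian figures). Each site's own "unique visitor" count is correct for that site, but adding those counts together counts multi-site visitors once per site they touch, which is how summed "uniques" can exceed the actual population.

        # Toy illustration: summed per-site "uniques" (traffic) vs. deduplicated audience.
        # Hypothetical visitor IDs per site -- not real measurement data.
        site_visitors = {
            "site_a": {1, 2, 3, 4, 5},
            "site_b": {3, 4, 5, 6},
            "site_c": {1, 5, 6, 7},
        }

        # Adding up each site's own "unique visitor" count:
        summed_uniques = sum(len(v) for v in site_visitors.values())   # 5 + 4 + 4 = 13

        # The actual number of distinct people across all sites:
        audience = len(set.union(*site_visitors.values()))              # 7

        print(summed_uniques, audience)  # 13 vs. 7 -- the gap is multi-site visitors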

    I look forward to the day when the rigour with which the CRE has approached this project is applied to all Internet measurement. Until that day, I'm afraid that the majority of online data and reports must fall into the overflowing hyperbole bucket.

    Fortunately, the IAB in Australia is in lengthy and deep discussions with the agencies (full disclosure - I represent the media agencies), advertisers and the 'long tail' publishers to develop an agreed audience measurement system, having just this month launched an agreed-on audited traffic measurement system.

    I'll also give you the tip - if television converted its audience back into traffic data, then they would swamp the internet! Fortunately, they realise it is a useless statistic.

  10. Joe Mandese from MediaPost Inc., June 4, 2009 at 7:29 a.m.

    Re. Terry Heaton's comments, John Grono is correct: Nielsen did not conduct the CRE research, and I'm told does not influence it directly; it merely underwrites it. The research is designed, overseen and commissioned by an independent committee of Nielsen clients from agencies, marketers, and media companies. The goal is to conduct primary research to help understand how people are using media. The observational method used by Ball State's Center for Media Design is highly regarded in many research circles.

    One other point of clarification: this project actually layered a couple of methods on top of the observational method to better understand how people are using media. I referenced one, the "media acceleration" project, in the article we published. I did not explain how the CRE project gleaned the differences between "observed" and "self-reported" estimates for TV and online video viewing. To do that, the researchers fielded two separate waves -- one in the spring and the other in the fall of 2008 -- and also recontacted households with a follow-up telephone interview asking them to self-report how much time they spent watching TV, online video and other media. It was that differential that Jim Spaeth was referring to.
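
    As a rough illustration of what such a differential looks like, here is a minimal Python sketch with made-up per-household minutes (not figures from the CRE study): the "overstatement" is simply the gap, or ratio, between what people said in the recontact interview and what the observers recorded.

        # Toy illustration of the "observed vs. self-reported" differential --
        # hypothetical minutes of online video per day, not data from the study.
        households = [
            # (observed minutes, self-reported minutes)
            (2, 10),
            (1, 5),
            (4, 8),
            (0, 3),
        ]

        observed = sum(o for o, _ in households) / len(households)   # 1.75
        reported = sum(r for _, r in households) / len(households)   # 6.5

        print(f"observed avg: {observed:.2f} min, self-reported avg: {reported:.2f} min")
        print(f"overstatement factor: {reported / observed:.1f}x")   # ~3.7x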

    We linked to the actual study in the original article, but I am posting a link to it here as well:

    http://www.researchexcellence.com/vcmstudy.php

  11. Peter Contardo from Endavo Media, June 4, 2009 at 2:20 p.m.

    Overstated or not (what's a few decimal points among friends), online video usage continues to change how people access information, use the web and consume content. YouTube is a mega information, marketing and entertainment engine. Cable companies are struggling to figure out new business models because their customers have become so used to getting the content they want, when they want it. Newspapers are becoming more open to citizen journalism by allowing user-generated uploads and comments. Small and emerging brands are competing with the ‘big boys’ by aligning with niche online communities and content providers. We can argue about the accuracy of the numbers all we want, but you can’t ignore the shift that is happening in media, business and society in general that has been caused by online video. http://endavomediablog.typepad.com/

  12. John Grono from GAP Research, June 4, 2009 at 6:21 p.m.

    Peter, there definitely IS a shift occurring, which NO-ONE is disputing. What the CRE project is trying to do is gauge the SIZE of the shift. There are several components to this.

    * The first is "how many people have (ever) watched online video?" ... the answer is lots and lots of them.

    * The second is, "of those who have ever watched online video how many are repeat consumers?" ... the answer is far less clear but in all probability it is 90%+.

    * The third is, "of the repeat consumers how much time are they spending regularly consuming online video?" ... the answer is even less clear but until 100Mbps broadband is ubiquitous it appears as though it is the exception and not the rule.

    What CRE are trying to do is put some meat on the bones and provide two things. First, they are trying to provide a robust estimate of each component to provide a picture of what is happening right now. Second, this starting point will provide a benchmark against which we can trend growth - growth we ALL know will happen.

    The question I MOST want answered is, "do people who start consuming online video completely forsake traditional video (i.e. television) forever, and if not, what 'share of eyeballs' does online tend to get?" The hyperbole of the online world says they do. The CRE research seems to be indicating that lots of us have trialled online video and become consumers of both - with the choice of distribution channel at any given instance being a matter of content, convenience and cost (the three Cs of choice). It is also clear that we are not talking about a few decimal points, but that there is a massive gap between the barrow being pushed by those selling online video solutions and what is actually happening (or is likely to happen in the next couple of years, in my opinion).

    If robust research and benchmarking is seen as being negative towards online, then so be it. Our job as researchers is to establish the facts to the best of our collective ability. If the facts don't accord with the hyperbole and the vision, please don't shoot the messenger!
