What do marketers care most about in evaluating content Web sites as advertising vehicles? According to Advertiser Perceptions,
marketers and ad agencies polled in the research firm's semiannual Advertising Intelligence Research study consistently select three priorities in the same order: reach, composition, and
engagement. And while the Web lends itself to a wealth of analysis in each of those three areas, there is widespread uneasiness about the state of online measurement. Questions
about the accuracy of measurement methods continue to frustrate advertisers and publishers. Just how to measure reach, composition, and engagement, then, is a fundamental issue. And when it comes to
evaluating online engagement, there is no consensus on what engagement really signifies. A big part of the problem is that engagement is a complex affair, and with the proliferation of social and
professional networks and collaboration, it is becoming more complex all the time.
In this series, I will examine the current approaches to online measurement and offer solutions
that will enable us to define, measure, and benefit from engagement intelligence. In this first of three articles, I will look at the battle over measurement methods and why we need a better concept of
online engagement. The second article will focus on breakthrough concepts of what engagement should entail and how it can be measured. And finally, I will explore how advertisers and publishers can
collaborate to use the insights from engagement intelligence to produce effective advertising strategies, strategies that succeed because they are well connected to specific audience experiences, interests,
and needs.
Media industry dissatisfaction with methodology has its roots in the argument that traditional measurement doesn't work for online. Web publishers have long
complained of the wide discrepancy that often exists between direct publisher data and that of Nielsen and comScore. Both firms use a "panel," or representative sample of users, as the
source of their research.
This model falters, though, with smaller and midsized sites, which do not show up on the Nielsen or comScore radar screens. And even among the largest sites,
big-name publishers have increasingly complained (for example, see "How Many
Hits? Depending on Who's Counting," in The New York Times) that Nielsen and comScore frequently account for only half or even a third of all their site visitors. This hue and cry
has prompted the Interactive Advertising Bureau to force the hand of Nielsen and comScore, and each has agreed to submit to outside audits.
It is hard to see, though, how even
independent audits will placate Web publishers. There will still almost certainly be significant discrepancies between large-site data and panel-based measurements. And there will still remain
millions of sites not large enough to be reliably measured. The questions then become: Have panels outlived their usefulness, and is there a way to validate or reconcile data captured directly by the
Web publishers?
Enter into this debate startup measurement firm Quantcast, which brandishes a method that embraces
direct publisher data as an indispensable piece of the puzzle. "The reason why the old TV way of measuring audiences doesn't work," says Quantcast CEO Konrad Feldman, "is
because the fragmentation of the Web kills the utility of the panel." The Internet, he says, "is the opposite of broadcast. It delivers on an individual basis. A panel-only approach cannot
deal with such massive individual delivery of content and interaction." Quantcast uses the panel approach as a starting point, claiming a user base of 2 million, the same size as comScore's and
Nielsen's. But it also sources data from Web publisher sites and then reconciles the different data sets through an algorithm called "Mass Inference."
If publisher data can be
verified and normalized through the Quantcast method, then it will be a happy day for publishers of all sizes. Small and medium sites -- the "long tail" of the Web -- would then stand a
better chance of gaining the confidence of advertisers and agencies. And it stands to reason that larger publishers who believe they have been significantly undercounted have the opportunity to
capture greater shares of media plans. Indeed, Feldman says, one of the developments that has surprised him most is the growing list of top-tier Web sites that "have come to us to be
'Quantified.'"
The idea of reconciling publisher data with independent sources appears to be an eminently logical end strategy for the metrics battle. The obvious question
would then be the reliability of the reconciliation. If Quantcast is blazing the best path, the company will surely have to prove the accuracy of its "Mass Inference" algorithm. I would
expect to see it -- or any other firm that travels down the reconciliation path -- ultimately need to gain IAB or Advertising Research Foundation validation.
If the industry can then
establish a standard method to verify measurement results, we will be in a better position to meet two of the big three advertiser priorities for site evaluation: reach and composition. That leaves
engagement, which will be the toughest of all to measure. Unlike reach and composition, engagement has no established metric. Nielsen made big news this year by shifting from page views to time spent on the site as the key engagement metric. The problem here is that time on a site tells an
incomplete story. If, for example, users on AA.com are booking flights faster and more efficiently than they are at Delta, what does duration as a standalone metric mean?
Instead of
time, Quantcast looks at how often a user returns to a site as the critical engagement metric. An audience is then categorized as "Passers-by," "Regulars," or "Addicts."
Addicts visit a site 30 or more times per month, Regulars more than once but fewer than 30 times a month, and Passers-by once a month or less. To get an idea of how measuring engagement in this way can enhance advertising
conversations, take a look at Yahoo and MSN's numbers today on Quantcast. Sixteen percent of Yahoo's visitors would be classified as Addicts and 63% as Regulars. Those numbers drop to 4% and 52%,
respectively, for MSN. That means MSN has a much higher slice of Passers-by: 44% to Yahoo's 21%. These numbers would seem to illustrate something -- though we don't have much here
to explain what that is -- about the relative engagement of visitors to these sites. The strength of this single engagement metric is the adoption of an index that allows us to gauge
site loyalty. The weakness is that it is limited to one aspect of engagement.
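Quantcast's visit-frequency buckets are simple enough to sketch in code. The function below is a hypothetical illustration of the thresholds described above, not Quantcast's actual implementation; the Yahoo and MSN figures simply restate the article's percentages, with Passers-by derived as the remainder.

```python
def classify_visitor(visits_per_month: int) -> str:
    """Bucket a visitor by monthly visit frequency, per the
    Quantcast-style thresholds above (illustrative only)."""
    if visits_per_month >= 30:
        return "Addict"
    if visits_per_month > 1:
        return "Regular"
    return "Passer-by"

# Audience mix from the article: Yahoo 16% Addicts / 63% Regulars,
# MSN 4% Addicts / 52% Regulars; the rest are Passers-by.
yahoo_passers_by = 100 - 16 - 63   # 21, matching Yahoo's 21%
msn_passers_by = 100 - 4 - 52      # 44, matching MSN's 44%
```

Because the three buckets are mutually exclusive and exhaustive, the Passers-by share always falls out as the remainder, which is why the two reported percentages are enough to compare the sites.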
comScore has a more layered approach, which gets us closer to the complexity involved in measuring
engagement. The company has moved to a "visits" metric with multiple views of engagement: how often a user returns
to a site, plus the amount of time and the number of page views per visit. Is that enough to describe engagement? Hardly. We are still looking at standalone metrics, albeit an array of them, without
reference to the way they interrelate. In other words, we are still seeking a definition of engagement.
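The layered view can be pictured as a handful of per-visit metrics computed from a session log. The sketch below is a hypothetical illustration of the idea, not comScore's method: it derives visits, time per visit, and page views per visit for one user, but, as noted above, it still leaves the metrics standalone rather than saying how they interrelate.

```python
from dataclasses import dataclass

@dataclass
class Visit:
    duration_min: float   # time spent in this visit
    page_views: int       # pages viewed in this visit

def layered_engagement(visits: list[Visit]) -> dict:
    """Compute standalone engagement metrics for one user's monthly
    visits to a site (illustrative, not comScore's actual method)."""
    n = len(visits)
    return {
        "visits": n,
        "avg_minutes_per_visit": sum(v.duration_min for v in visits) / n,
        "avg_pages_per_visit": sum(v.page_views for v in visits) / n,
    }

m = layered_engagement([Visit(5.0, 4), Visit(3.0, 2)])
# m == {"visits": 2, "avg_minutes_per_visit": 4.0, "avg_pages_per_visit": 3.0}
```

Each number in the result answers a different question (loyalty, attention, depth), but the dictionary itself says nothing about how to weigh or combine them, which is exactly the gap the next article takes up.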