In the digital space we have something that no other medium has ever had before: the ability to collect empirical census data on the behavior of machines. Boy, is this an incredibly helpful
thing to have! Since consumers access the Internet by using one of these machines (like TV or radio, but unlike print or place-based media), the ability to empirically track machine
behavior has been one of the lynchpins of the success and growth of the online medium.
TV companies are only now coming to grips with the fact that they, too, have the ability to track the behavior of some machines (those with digital set-top boxes). That development promises to throw TV audience measurement into its greatest period of flux and uncertainty ever (well, that and C3). Fortunately, in the digital space we've grown up with it.
But I have always believed - from back when I was an independent consultant going to ARF Internet Measurement Council
meetings in the early 2000s - that this capability was both a blessing and, in some ways, a curse. Because the relative (and I stress the term relative) certainty we have about machine
behavior has probably tainted our collective perspective on both the importance and difficulty of measuring the behavior of persons.
Most (but still not all) of us understand the very simple
truth that cookies aren't persons. But as we've gotten deeper and deeper into tagging, I've come to understand the issues with page view data as well. While unified (panel + census) page
view projections are typically quite close to a publisher's internal page view counts, there are many reasons for the two sources to diverge -- including multiple tags placed on a single page, pages that never finish loading, tags placed differently across pages or sites, or, in a surprising (to me) number of cases, pages with missing or incorrect tags.
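To make the tag-placement point a bit more concrete, here's a minimal sketch of the kind of audit that surfaces these gaps -- crawl a sample of pages and count how many measurement tags each one actually carries. The tag signature and URLs below are placeholders of my own, not any vendor's real QA tooling.

```python
# Hypothetical tag audit: fetch a sample of pages and count how many measurement
# tags each carries. Zero tags -> census undercounts; multiple tags -> overcounts.
from urllib.request import urlopen
from urllib.error import URLError

TAG_SNIPPET = "tag.example-measurement.com/beacon.js"  # placeholder tag signature, not a real vendor URL
SAMPLE_PAGES = [                                        # placeholder URLs to audit
    "https://www.example.com/",
    "https://www.example.com/sports/",
]

for url in SAMPLE_PAGES:
    try:
        html = urlopen(url, timeout=10).read().decode("utf-8", errors="replace")
    except URLError as exc:
        print(f"could not fetch {url}: {exc}")
        continue
    tags = html.count(TAG_SNIPPET)
    if tags == 0:
        print(f"MISSING tag on {url}")
    elif tags > 1:
        print(f"{tags} tags on {url} (one page view, counted {tags} times)")
    else:
        print(f"OK: {url}")
```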
What's more, some new things we've learned about the prevalence of non-human traffic (and its identification and prevention) strongly suggest that even with IAB- and MRC-compliant practices, including active identification and filtration of that traffic, there can still be issues. Long story short, while site-centric census data can tell you all about what your server has done, there are numerous reasons why that might differ from what the people on the other end of the content request actually experienced.
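For illustration only, here's the basic shape of that filtration step -- a drastically simplified stand-in for real IAB/MRC-compliant processing -- which drops hits whose user-agent matches a known bot or spider pattern. The pattern list and sample hits are assumptions of mine.

```python
# Simplified non-human-traffic filter: discard hits whose user-agent matches a
# known bot/spider pattern. Compliant filtration is far more involved (maintained
# industry lists, activity-based rules, etc.); this only shows the basic shape.

KNOWN_BOT_PATTERNS = ["googlebot", "bingbot", "crawler", "spider", "python-requests"]  # assumed list

def is_human(user_agent: str) -> bool:
    ua = user_agent.lower()
    return not any(pattern in ua for pattern in KNOWN_BOT_PATTERNS)

# Hypothetical hits: (page URL, user-agent string)
hits = [
    ("/home", "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36"),
    ("/home", "Googlebot/2.1 (+http://www.google.com/bot.html)"),
    ("/sports", "Mozilla/5.0 (iPad; CPU OS 4_3 like Mac OS X)"),
]

human_hits = [h for h in hits if is_human(h[1])]
print(f"{len(hits)} raw hits, {len(human_hits)} after bot filtration")
```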
I mention all this just to show that the certainty we think we have
about census data is really, as one digs into the details, somewhat more tenuous than we'd like to think.
This is not to suggest that census data (cookies and tags) lacks value. Quite the contrary; census data is one of the core technologies that drive digital monetization - both through web analytics, in which KPIs that enable site optimization are developed and tracked, and through cookie targeting, which, although I occasionally bash it, I will grudgingly concede does have a place in the inventory monetization schema.
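As a toy illustration of the web-analytics side (the numbers and KPI definitions here are mine, not any product's), census tag data is what lets you compute and track metrics like pages per visit and bounce rate:

```python
# Toy KPI calculation from census (tag-level) data: each visit is reduced to a
# count of page views. Definitions are illustrative, not any vendor's official ones.

# Hypothetical data: page views per visit for one day
page_views_per_visit = [1, 5, 2, 1, 8, 3, 1, 4]

visits = len(page_views_per_visit)
page_views = sum(page_views_per_visit)
bounces = sum(1 for pv in page_views_per_visit if pv == 1)  # single-page visits

print(f"Pages per visit: {page_views / visits:.2f}")
print(f"Bounce rate:     {bounces / visits:.1%}")
```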
Now ponder the challenges in measuring
people, as opposed to machines. Sure, you might be delivering ads to cookies, but it is still a consumer - a person - using the device who sees that ad (or doesn't) and who has some sort of
attitudinal or behavioral response to the ad. It is still the consumer, and not the cookie, who actually goes out and buys your stuff.
People. Dicey and
important.
People have become even more important over the past year or so, as the advent of smartphones and tablets has upended some core assumptions we made about Internet access (specifically, that the machines we use to access the Internet are in fact computers). As I speak with advertisers, agencies and publishers, perhaps the most consistent theme that emerges -- as both a present and a future need -- is understanding digital consumption in a holistic context, as opposed to (or in addition to) a siloed one.
Visitors access my site from computers, phones and
tablets, and I need to understand total consumption: how does reach build across these platforms? How do the platforms work together? It's hard to dope that out with cookies and tags
alone.
Oh yeah - and how do these platforms work with other platforms, like print and TV?
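To show why cookies and tags alone struggle with that question, here's a miniature sketch of unduplicated reach. It assumes a person-level identifier -- the kind a panel provides and a cookie can't -- and the panelist IDs are, of course, made up:

```python
# Hypothetical panelist IDs seen on each platform during a reporting period.
computer = {101, 102, 103, 104, 105}
phone    = {103, 104, 106}
tablet   = {104, 105, 107}

per_platform_sum = len(computer) + len(phone) + len(tablet)   # double-counts multi-platform people
unduplicated     = len(computer | phone | tablet)             # set union = total cross-platform reach

print(f"Sum of per-platform UVs: {per_platform_sum}")   # 11
print(f"Unduplicated reach:      {unduplicated}")       # 7
print(f"Duplication factor:      {per_platform_sum / unduplicated:.2f}")
```

The gap between the simple sum and the union is exactly the cross-platform duplication that a person-level measurement has to resolve.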
Now keep in mind, before you bust the vendors' collective chops for being behind the eight
ball on tablet measurement, 13 months ago there weren't any iPads. So forgive us all, I ask, tongue somewhat in cheek, if it takes more than a year between the time the first unit sells to a consumer and the time we've sussed out how to measure the platform.
Solving the measurement challenges that a changing digital landscape presents is what keeps us online metrics insiders
up nights. We're about to come out with a Total Universe report, combining and de-duplicating reach (unique visitors) and page views across computer, mobile, and tablet platforms. Total Universe is
the first salvo in attacking the problem (along with a multi-screen initiative with AT&T); there will be more.
All of us who make a living counting eyeballs will keep doing the best we can, keep innovating, and keep improving, because you push us, dear reader, and because we push ourselves. No, this stuff ain't easy. But if it were, what fun would that be?