The online and offline dialogue the series has generated provides good food for thought. It seems to me that the feedback is nearly unanimous: the Peterson model opens a new and valuable dimension -- acknowledging the complexity of measuring and talking about engagement, and showing us a way to make the analysis and storytelling well worth our while. It also strikes me that while we now have the conceptual framework, we need
publishers and measurement companies to step forward and lead the way. To summarize the online and email perspectives, I would bucket them into three key areas:
1. We need to refer site engagement indexes to industry means. What are the individual engagement indexes? Click depth, loyalty, interactivity, recency, duration, and subscription. Each index measures a type of online engagement on a site. In the way that Eric Peterson has developed the concept, the individual site is the reference point for each index. If you were looking at CNN.com, for example, you would identify a range of averages within the site and chart each index against that site's average.
Chris Murdoch of Boston.com, though, made the point that we need to refer the index to something larger than the site. "Industry means should be used as reference points," he said, "for individual indexes that make up the composite index (instead of averages or means)." Ah, but we would need to know what those industry means are, and here we can see a role that comScore, Nielsen and Quantcast can play. My view had been that the panel-only approach of comScore and Nielsen was a liability when it came to capturing that kind of deep site-level information, but based on the reactions from various quarters, I am no longer so sure I was right about that.
Eric Peterson offered his thinking on Murdoch's observation: "One of the distinct advantages my calculation applied to a panel-derived dataset is cross-site visibility, something that EVERYONE seems to be out there looking for. Imagine a series of vertically focused engagement calculations -- one for media, one for retail, etc. -- in which thresholds were set based on average/median/observed distribution of participant data across the core index calculation. Media buyers would then be able to buy at ‘high engagement media properties where engagement is primarily driven by loyalty and likelihood to subscribe,' etc."
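To make Murdoch's suggestion concrete, here is a minimal sketch of what benchmarking against industry means might look like. This is not Peterson's published formula; the component names come from the list above, and every number (the industry means for a hypothetical "media" vertical and the sample site's metrics) is invented purely for illustration.

```python
# Hypothetical industry means for a "media" vertical -- invented numbers,
# standing in for the benchmarks a comScore or Nielsen could supply.
INDUSTRY_MEANS = {
    "click_depth": 4.2,      # avg. pages per visit
    "loyalty": 1.8,          # avg. visits per visitor per month
    "interactivity": 0.35,   # share of visits with an interactive event
    "recency": 0.5,          # share of visitors returning within a week
    "duration": 310.0,       # avg. visit length in seconds
    "subscription": 0.12,    # share of visitors who subscribe
}

def benchmarked_indexes(site_metrics, reference=INDUSTRY_MEANS):
    """Express each component index as a ratio to the reference mean.

    1.0 means the site sits at the industry mean for that component;
    1.5 means 50% above it. Referencing an external mean (rather than
    the site's own averages) is the change Murdoch proposes.
    """
    return {name: site_metrics[name] / reference[name] for name in reference}

# Invented metrics for a sample site.
site = {
    "click_depth": 5.1, "loyalty": 2.0, "interactivity": 0.28,
    "recency": 0.6, "duration": 275.0, "subscription": 0.15,
}

ratios = benchmarked_indexes(site)
# Equal-weight composite across the six components.
composite = sum(ratios.values()) / len(ratios)
```

With real vertical benchmarks in place of the invented means, a buyer could read the composite (or any single component) as "above or below the industry norm," which is exactly the cross-site visibility Peterson describes.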
2. The Engagement Index is too linear -- it needs to be shaped or weighted. John Grono of GAP Research of Australia found the Peterson engagement index "interesting," but saw an "underlying assumption of linearity" in the model because each index is given an equal weight.
This seems to be a valid point. Is duration equal to click depth? Are they both on the same plane as site loyalty or recency? In fact, Peterson developed the engagement index to enable individual Web publishers to understand their own audience interaction on their sites.
But can the index then be useful in discussions between advertisers and publishers? I would say, absolutely. As I see it, the Peterson model is an ideal means for a publisher to understand audience activity, behavior, and preferences on its site. As a Web analytics tool par excellence, the engagement index can be the starting point for conversations about the relative significance of duration, interactivity, and click depth. The more publishers understand about their audience engagement, and the more that knowledge can be quantified, the better positioned the publisher is to address the ultimate advertising question: "What kind of advertising approach will work best with your audience for this particular campaign?"
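Grono's linearity point can be illustrated in a few lines. The sketch below is not his proposal or Peterson's formula -- it simply shows how a weighted mean relaxes the equal-weight assumption. The component values and weights are invented; in practice a publisher and advertiser would negotiate the weights for a given campaign.

```python
def composite_index(components, weights=None):
    """Weighted mean of pre-normalized component engagement indexes.

    With weights=None, every component counts equally -- the "underlying
    assumption of linearity" Grono flags. Supplying weights lets loyalty
    or subscription count for more than raw click depth. Weights need
    not sum to 1; they are normalized internally.
    """
    if weights is None:
        weights = {name: 1.0 for name in components}
    total_weight = sum(weights[name] for name in components)
    weighted_sum = sum(components[name] * weights[name] for name in components)
    return weighted_sum / total_weight

# Invented component indexes, each already normalized to a reference mean.
components = {
    "click_depth": 1.21, "loyalty": 1.11, "interactivity": 0.80,
    "recency": 1.20, "duration": 0.89, "subscription": 1.25,
}

equal = composite_index(components)  # Peterson-style equal weighting
# A hypothetical campaign that prizes loyalty and likelihood to subscribe:
weighted = composite_index(components, {
    "click_depth": 0.5, "loyalty": 2.0, "interactivity": 1.0,
    "recency": 1.0, "duration": 1.0, "subscription": 2.0,
})
```

Under these invented weights the composite shifts upward, because this particular site happens to over-index on loyalty and subscription -- which is precisely the kind of "engagement primarily driven by loyalty" story Peterson imagines media buyers shopping for.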
3. Are there irreconcilable differences between panel-based and publisher-specific data? In his comScore blog, Marv Pollack, senior vice president of marketing communications at comScore, argues against the idea that newcomer Quantcast has a valid approach in its use of publisher data combined with a panel method. Pollack calls into question whether the engagement model can work: "Intriguing though the concept may appear, the claim that server data can be integrated with panel data in order to obtain accurate online audience data is fundamentally flawed. No amount of 'black box' manipulation can correct for the fact that server data are incapable of knowing who is on the computer visiting the site. You simply can't make a silk purse out of a sow's ear."
Is it true that publisher data cannot be reconciled against the panel? I leave that question for comScore to debate with Quantcast and others who are emerging with panel + publisher analysis engines. But if Pollack is right, then my question would be, "Can you build on the Peterson model and show the media industry a way to look more deeply into engagement dynamics?" There is nothing at all stopping publishers from using Web analytics tools from Omniture/Visual Sciences, Unica, or WebTrends to activate the Peterson model today. Wouldn't it be an ideal world, though, if we could create the industry engagement means that we described above?