Commentary

Making Sense Of Results From Online Campaigns

When I talk to marketers these days, one of the most frequently cited needs I hear about is the ability to measure (and make sense of) results from online campaigns.

As McKinsey reported last year, the number-one gap in most organizations is the ability to measure results. Beyond high-level reporting, most have trouble tracing actions (e.g. engagement, leads or sales) back to the individual ads, email blasts or search phrases they invested in on the Web.

They have data and reports, but those often do not make sense. While useful tools such as Omniture, WebTrends and Google Analytics can now attribute actions to the ad that immediately preceded the conversion, the results often contradict what the media plan forecasted.

If you are in this category, don't despair -- you are in good company. True understanding of online campaign results is not an easy task. While there are numerous factors that impact the equation, I believe the two most pressing issues are: 1) shortcomings of cookie-based tracking, and 2) the fallacy of last-click analysis and what I call the "Wingman effect."

Crumbling Cookies

As consumers become smarter and savvier online, they are also more deliberate and less impulsive in their decision-making process. With so many more options at their fingertips, they can easily do research and comparison shopping before taking action. For big purchases, they often read reviews and confer with others. And we are increasingly seeing consumers "surf at work, buy at home", doing research on one computer and taking action on another. This is why conversion rates after 6 p.m. are often higher than they are during work hours. Since we rely on cookies to track actions for each individual, these issues limit our ability to measure results. Here's an example:

Let's say you are at work, and thinking about how much you need a vacation. You then see a VacationsToGo ad for a Caribbean cruise on your MyYahoo home page. You click through and like what you see, but you don't have time and cubicles don't offer enough privacy for vacation shopping. Later that day you tell your spouse about the trip and direct them to VacationsToGo.com to learn more. Your spouse finds it through Google and does some research. Later that evening, in the safety and comfort of your home, you use your personal computer and navigate directly to VacationsToGo.com to book the trip. Five minutes later, you are thinking about what to wear in Cozumel.

From the Web site's perspective, you are a mystery. Since your home computer does not have a cookie from a prior visit, the site assumes you are a first-time visitor. With no way of knowing that you previously arrived through display and search ads, it will count those earlier visits as wasted ad spend. The poor analyst has no clue the display and search ads contributed to the process. He will only see that at 7 p.m. a first-time visitor booked a cruise. The CMO may think "if our brand is this good, we don't need to advertise as much!" You can see where this may lead...

The example above illustrates how multiple visits and machines reduce the effectiveness of cookie-based tracking.

The increasing use of cookie-cleaning tools exacerbates the problem. This is especially pertinent if your visitors are browsing at work. If, as widely reported, 40% of third-party cookies are either not accepted or deleted within 30 days, we are blind to a significant portion of what drives our results. Consequently, cookie-based analytics platforms will not provide an accurate measure of what is driving results.
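To put a rough number on that blind spot, here is a back-of-the-envelope sketch. The 40% figure is the widely reported deletion rate cited above; the campaign size is purely hypothetical:

```python
# Back-of-the-envelope sketch of how cookie loss skews attribution.
# The 40% loss rate is the widely reported figure cited above; the
# conversion count is a hypothetical campaign, not real data.

true_ad_driven_conversions = 1000   # conversions actually initiated by ads
cookie_loss_rate = 0.40             # share of tracking cookies blocked or deleted

# Conversions the analytics platform can still tie back to an ad:
measured = true_ad_driven_conversions * (1 - cookie_loss_rate)

# The rest surface as "direct" or first-time traffic instead:
credited_elsewhere = true_ad_driven_conversions - measured

print(f"Measured as ad-driven: {measured:.0f}")           # 600
print(f"Credited elsewhere:    {credited_elsewhere:.0f}")  # 400
```

In this hypothetical, two out of every five ad-driven conversions look like they came from nowhere — exactly the "wasted ad spend" illusion described above.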

So the lesson here is: use the data from your reports, but don't put too much stock in the stats. You are only seeing part of the picture.

The Wingman Effect

The second impediment to measurement is that most Web analytics platforms (Omniture, WebTrends, Google Analytics) were designed to attribute credit for a conversion to the last medium clicked (e.g. display ad, email message or search engine listing). As noted above, user engagement typically entails multiple touch-points, both online and offline, before a transaction occurs. This is especially true with considered purchases, where customers often make multiple visits, via many paths, before taking action. Even if all of the visits happen on the same computer (so cookies could track them), assigning credit for the conversion to the last click reveals only part of the true picture. And given that consumers increasingly rely on search engines to find a site they've already visited, it often rewards the search engine at the expense of display ads, email, social and other media.

To illustrate this concept, I offer the analogy of the proverbial Wingman. In a social setting, the Wingman is the guy who helps his buddy get dates. He is often the one who initiates conversation and breaks the ice so his friend can join in, take over the conversation, and hopefully get a phone number. If the friend is successful, he knows he has the Wingman to thank for initiating the conversation.

In the online marketing arena, display ads, aka banners, serve the role of the Wingman. They often start the conversation, engage the prospect and funnel them back to the search engine. But unlike the social scene, search engines get all the credit and the Wingman's contributions go unnoticed.

When media planners attribute credit only to the last click, they often inadvertently cut display ads because they can't see the supporting role they play in the engagement cycle. The same can be said for social media, email, and other online media. If you cut everything that doesn't convert directly, you are likely to kill the goose that lays the golden eggs.

The takeaway for marketers is that you need to track interaction through the engagement cycle. Using an approach recently coined "Engagement Mapping," savvy marketers now track the first, middle and last clicks to determine how each media unit impacts results.
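A minimal sketch of that idea, contrasting last-click credit with a simple position-based split across first, middle and last touches. The touch-point sequence and the 40/20/40 weighting are illustrative assumptions on my part, not a standard prescribed by any particular analytics platform:

```python
# Sketch of "engagement mapping": splitting conversion credit across
# the first, middle and last clicks instead of handing it all to the
# last one. The 40/20/40 weighting below is an illustrative assumption.
from collections import defaultdict

def last_click(path):
    """Classic model: the final touch-point gets 100% of the credit."""
    return {path[-1]: 1.0}

def position_based(path, first=0.4, last=0.4):
    """First and last touches get fixed shares; the middle splits the rest."""
    if len(path) == 1:
        return {path[0]: 1.0}
    credit = defaultdict(float)
    credit[path[0]] += first
    credit[path[-1]] += last
    middle = path[1:-1]
    for touch in middle:
        credit[touch] += (1.0 - first - last) / len(middle)
    return dict(credit)

# The cruise example above: display ad first, search next, direct visit last.
path = ["display", "search", "direct"]
print(last_click(path))      # {'direct': 1.0} -- the Wingman gets nothing
print(position_based(path))  # display and direct ~0.4 each, search ~0.2
```

Under last-click, the display ad that started the conversation earns zero; under the position-based split, its opening role finally shows up in the report.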

We've seen firsthand on numerous campaigns that for every conversion we can directly attribute to an ad, there are 0.5 to 2.0 indirect actions that are not traceable for the reasons cited above.
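The arithmetic behind that range is simple but worth spelling out. The direct-conversion count below is hypothetical; the 0.5x–2.0x multipliers are the observed range cited above:

```python
# Hypothetical illustration of the indirect-action range cited above:
# for every directly attributed conversion, 0.5 to 2.0 additional
# conversions occur that cookie loss and last-click bias hide.
direct = 200  # conversions your platform attributes to ads (hypothetical)

for ratio in (0.5, 2.0):
    total = direct * (1 + ratio)
    print(f"{ratio}x indirect -> {total:.0f} total ad-driven conversions")
# 0.5x -> 300 total; 2.0x -> 600 total
```

In other words, the true yield of a campaign may be anywhere from 1.5 to 3 times what the last-click report shows.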

While these are formidable challenges, do not despair -- there are affordable, proven methods for overcoming both of them. Since this is how we make a living, I can't share all the secrets with you. But I can provide some general recommendations.

First, take a strategic approach to engagement mapping that will shed light on the various contributions (lead, supporting, etc.) each online media unit plays in the engagement cycle. Once you understand which units create awareness, and which ones close the deal, you can produce more strategic media plans.

Second, you should treat every campaign as a learning experience and make systematic media testing an ongoing program. By varying flight dates, you can gain better insights into the performance of each component of your online media mix.

Lastly, you should look at the overall lift in site traffic and activity, not just the visits that are directly attributable to clicks. Don't underestimate the tendency for people to take action on their second, third or fourth visits. Take a holistic view and you'll see a much clearer picture.
