Commentary

Direct Marketing on Steroids

For most marketers, there is only one metric that counts: conversions. This usually means sales — how many sales did that banner ad or email blast bring in? How many newsletter sign-ups or Web video pass-alongs were created in a MySpace campaign?

Out of ignorance or laziness, that's usually about as far as most executives get when it comes to exploring better ways to improve their online operations.

If you're using paid media, your site reports and the reports generated by your third-party ad server may not be enough to adequately measure the return on investments made in online display ads, search and email campaigns. A great many marketers who are bullish on paid search point to successful conversion rates, but those numbers can also be misleading. That is especially damaging for organizations that, in a recessionary economy, have to dial down the dollars they spend on online marketing. Sticking to what seems to have worked in the past may be a losing strategy.

Those marketers tempted to go the less expensive route and just purchase search rather than bother with display advertising may be sacrificing more than lift — they may lose the whole ball game. That's what Brian McAndrews is telling us.

And Brian McAndrews isn't just anybody. He's held top-level positions at aQuantive and Microsoft. So we should pay attention. McAndrews says search is given too much credit because of the way the effectiveness of ads is measured. When an online transaction takes place, the sale is attributed to the last ad viewed, which is most often a search ad.

"Since search logically is often the last thing people do, it's arguably getting more credit than it deserves," he says. "It's probably overvalued now." For example, if a customer sees a banner promoting a product on MSN, watches a related video on Time Warner Inc.'s AOL and then searches for the brand on Google before making a purchase, only Google gets credit for the sale.

There has been much discussion of the "last click" issue, with conjecture that each industry might have its own formula for weighting "assists" that lead to a conversion. Most agreed that the "last click" has been over-attributed, in general to the benefit of Google.

Until very recently, neither Atlas (part of aQuantive, now Microsoft) nor DoubleClick (now a part of Google) could see beyond the last click to show the role of display ads and other search terms in contributing to the final click. Now, Atlas, DoubleClick, Eyeblaster and others offer tags such as Atlas' Universal Action Tag that reveal the path the user took to get to a conversion. It's this newly available data, McAndrews argues, that will diminish the relative importance of search and boost other kinds of online advertising.
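
To make the last-click problem concrete, here is a minimal sketch (the channel names are hypothetical and this is not any vendor's actual model) contrasting last-click credit with a simple even split across the kind of conversion path those tags now expose:

    # Minimal sketch: last-click vs. even-split attribution over a conversion path.
    # Channel names are hypothetical; real ad servers apply their own weighting rules.

    def last_click(path):
        """Give 100 percent of the conversion credit to the final touchpoint."""
        return {path[-1]: 1.0}

    def even_split(path):
        """Spread the credit equally across every touchpoint on the path."""
        share = 1.0 / len(path)
        credit = {}
        for touch in path:
            credit[touch] = credit.get(touch, 0.0) + share
        return credit

    # The journey McAndrews describes: display ad, online video, then a brand search.
    path = ["display:MSN", "video:AOL", "search:Google"]
    print(last_click(path))   # {'search:Google': 1.0}
    print(even_split(path))   # each touchpoint gets roughly a third of the credit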

This does not diminish the importance of search marketing, but it does make it more likely that we can really see the interrelationships of media exposure and plan accordingly. Our old colleague Rick Bruner (formerly with DoubleClick, now with Google) and John Chandler of Atlas recently teamed up for a presentation on "attribution management" that showed how adding display lifted conversions by 22 percent, even though about 70 percent of the "final clicks" came directly from search links.

Why Even Good Numbers Lie
A very interesting research project done in summer 2007 by Jim Sterne of Stone Temple Consulting compared different Web analytics tools on the same set of Web sites. The test looked at four different Web sites with analytics tools supplied by ClickTracks, Google Analytics, IndexTools, Unica's Affinium NetInsight, Visual Sciences' HBX Analytics, Omniture's SiteCatalyst and WebTrends. With the exception of the last two, engineering support was provided by the tools' developers to assist in the test. The results: different tools installed on the same Web site produce different numbers. In some cases the variance was staggering, as high as a 150 percent difference, according to Sterne.

A few factors cropped up to explain the variances. One involved the placement of the JavaScript tag on the page: if it sits low on the page (below the fold, you might say), a slow load might mean some visitors leave before the JavaScript can execute. The data for that visit would then be lost, and data regarding the original landing page, as well as keyword data (if the visitor came in from a search engine), would not be recorded or counted.

Time lag in counting also made a difference. In the test, Google sometimes counted more, but it generally counted faster: the Google tool executed its count in an average of 0.7 seconds, while the next closest, IndexTools, took 4 seconds. A differential of 3.3 seconds can matter, considering that a Web user can click onto a page, look around and depart in even less time than that.
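
As a purely illustrative sketch (the visit durations and tag timings below are made up, not figures from the study), here is how a slower-firing tag undercounts short visits:

    # Illustration only: hypothetical visit durations (in seconds) and two tag
    # execution times. A visit is counted only if the visitor stays on the page
    # at least as long as the tag takes to fire.

    visit_durations = [1.5, 2.0, 3.0, 5.0, 8.0, 12.0, 30.0]

    def counted(visits, tag_seconds):
        return sum(1 for duration in visits if duration >= tag_seconds)

    for tag_seconds in (0.7, 4.0):
        n = counted(visit_durations, tag_seconds)
        print(f"tag fires in {tag_seconds}s -> {n} of {len(visit_durations)} visits counted")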

The study was useful in that it uncovered typical reasons why a metric might not be counted properly. First among them: tagging errors. Making mistakes in placing JavaScript on a Web page turns out to be a common problem when every page has to be tagged, and whenever pages are updated, added or removed.
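
One way to catch such errors is a crude audit that scans every page for the tracking snippet. The sketch below assumes the tag loads a script called urchin.js; the marker string would be whatever your own tool actually inserts:

    # Sketch of a crude tagging audit: flag any HTML page that is missing the
    # analytics snippet. "urchin.js" is only a stand-in marker string here.
    import os

    SNIPPET_MARKER = "urchin.js"

    def untagged_pages(root):
        missing = []
        for dirpath, _dirs, files in os.walk(root):
            for name in files:
                if name.endswith(".html"):
                    path = os.path.join(dirpath, name)
                    with open(path, encoding="utf-8", errors="ignore") as f:
                        if SNIPPET_MARKER not in f.read():
                            missing.append(path)
        return missing

    print(untagged_pages("./site"))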

Metrics for Web 2.0
Keeping track of viral programming is easier if materials are tagged before being sent off into the World Wide Web. Tagging helps with tracking, but analyzing the resulting data is still an inexact exercise.

Jodi McDermott is one of several Web gurus proposing that the effectiveness of a widget can be quantified by combining three types of data: the number of times a widget is imported into a new URL environment ("placements") plus the number of requests to view a widget ("views"), divided by the number of unique visitors to the Web site ("unique visitors").
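
Taken at face value, the formula as stated above works out like this (the input numbers are invented purely for illustration):

    # Sketch of the widget metric described above:
    # (placements + views) / unique visitors. Input numbers are made up.

    placements = 1_000        # times the widget was embedded in a new URL environment
    views = 85_000            # requests to view the widget
    unique_visitors = 40_000  # unique visitors to the host Web site

    widget_score = (placements + views) / unique_visitors
    print(widget_score)  # 2.15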

Viral videos can also be tracked by tagging their URLs with a bit of extra code, a method favored by viral consultants like Dan Greenberg.
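
A minimal version of that kind of URL tagging might look like the following sketch (the parameter names are illustrative, not any particular standard):

    # Sketch: appending campaign parameters to a video URL so downstream views
    # can be attributed. Parameter names here are illustrative only.
    from urllib.parse import urlencode, urlparse, urlunparse

    def tag_url(url, campaign, medium):
        parts = urlparse(url)
        extra = urlencode({"campaign": campaign, "medium": medium})
        query = f"{parts.query}&{extra}" if parts.query else extra
        return urlunparse(parts._replace(query=query))

    print(tag_url("http://example.com/video/launch", "spring-viral", "email"))
    # http://example.com/video/launch?campaign=spring-viral&medium=email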

A loyalty index may provide some measurement of how engaged unique visitors are with a branded Web site. One clue to this may be return rate. David Smith of Mediasmith recently compared two high-volume Web sites, the networking site Facebook, and a pure news-and-information site, about.com. In the time period measured, Smith noted that more than 60 percent of Facebook users were returning to the site more than 30 times in a month, or at least once a day, while just 2 percent of about.com's visitors came back on a daily basis.
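
A simple return-rate calculation of that sort might be sketched like this (the visit counts below are invented for illustration, not Mediasmith's data):

    # Sketch: a crude "daily return rate" over a 30-day window, computed from
    # the number of distinct days each visitor showed up. The data is made up.

    visits_per_visitor = {"v1": 30, "v2": 28, "v3": 3, "v4": 1, "v5": 1}

    daily_returners = sum(1 for days in visits_per_visitor.values() if days >= 30)
    return_rate = daily_returners / len(visits_per_visitor)
    print(f"{return_rate:.0%} of visitors returned daily")  # 20% of visitors returned daily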

Again, this doesn't tell the whole story. The other 98 percent of about.com visitors presumably found the answer to their specific question and didn't need to come back to the site. And the frequent visitors to Facebook may simply have been bored post-teens briefly checking their status on a cell phone rather than active participants in the online social whirl. Duration data might help here, but it still doesn't quantify how satisfied visitors to either site might feel about the experience.

Eric Peterson is one of several people contemplating how best to measure engagement through purely statistical analysis. His formula includes some new and interesting metrics that resemble those used to quantify search engine rankings, among them "click depth," a measure of how many visitors move beyond, say, the first two or three pages of a Web site, and "recency," a measure of repeat visits within a given time frame.
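
As an illustration of those two component metrics only (this is not Peterson's published formula), a sketch might look like:

    # Illustration of "click depth" and "recency" as described above; this is
    # not Eric Peterson's published engagement formula, only a sketch.
    from datetime import date

    DEPTH_THRESHOLD = 3       # visits that go beyond this many pages count as "deep"
    RECENCY_WINDOW_DAYS = 7   # a return within this many days counts as "recent"

    def click_depth_index(pages_per_visit):
        """Share of a visitor's sessions that went deeper than the threshold."""
        deep = sum(1 for pages in pages_per_visit if pages > DEPTH_THRESHOLD)
        return deep / len(pages_per_visit)

    def recency_index(visit_dates, today):
        """1.0 if the most recent visit falls inside the window, else 0.0."""
        days_since_last = (today - max(visit_dates)).days
        return 1.0 if days_since_last <= RECENCY_WINDOW_DAYS else 0.0

    pages_per_visit = [2, 5, 1, 7]                       # made-up session depths
    visit_dates = [date(2009, 5, 20), date(2009, 5, 28)]
    print(click_depth_index(pages_per_visit))            # 0.5
    print(recency_index(visit_dates, date(2009, 6, 1)))  # 1.0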

Adapted from Digital Engagement: Internet Marketing that Captures Customers and Builds Brand Loyalty by Leland Harden and Bob Heyman. Copyright © 2009 Leland Harden and Bob Heyman. Published by AMACOM Books, a division of American Management Association, New York, NY. Used with permission. All rights reserved. http://www.amacombooks.org.
 

3 comments about "Direct Marketing on Steroids".
  1. Gary Thoulouis from Orange Peel S.L., June 1, 2009 at 3:10 p.m.

    Bob,

    Thank you. In fact, the power of online marketing in its entirety, just as with offline marketing, is never really appreciated by senior execs, as they concentrate only on measurable outcomes, despite the less accepted fact, as you have pointed out, that the final click is often the result of brand recognition brought on by many other elements of marketing.

    May I be so bold as to indulge your experience with two simple questions?

    Question number one: If I can easily measure the number of unique visitors, return visitors, page views (including which pages and for how long), origination and conversions produced by a website, could I not easily determine how best to configure my online presence to increase my return based on this information?

    Question number two: How would you list the following online business tools in order of importance - website content, website plan, SEO, online advertising, affiliate network, PPC advertising, CPM advertising, social media (blogs, micro blogs, forums, social networks)?

    Thank you
    Gary Thoulouis

  2. Jodi McDermott from comScore, June 2, 2009 at 9:12 a.m.

    Bob/Leland,

    I am writing to clarify that I don't believe I have ever proposed a metric where installs (also referred to as placements) plus views are divided by unique visitors as stated above. Effectiveness of a widget must be measured against the stated intentions of a particular campaign and its business objectives.

    The metrics that we at Clearspring use to evaluate effectiveness of a customer's campaign include:

    * Installs (the number of unique instances of a widget evaluated over the domains that the widget was installed on).
    * Grabs (the number of unique instances of a widget evaluated over the domains from which the widget was spread).
    * Views
    * Unique Visitors
    * Interaction metrics such as: clicks, clickthroughs, time spent and custom events that the widget developer may have instrumented their widget with.

    Depending on what the widget's purpose was (promotion, utility, brand awareness, direct response), each one of these metrics is used as a tool to evaluate actual campaign behavior against intended behavior.

    I do not propose one "engagement" metric that is a catch-all for evaluating the effectiveness of a campaign. I firmly believe that an analyst/marketer must evaluate multiple metrics together to paint the picture of whether or not their campaign is effective against the business goals that they are trying to achieve.

    Regards,

    Jodi McDermott
    Director, Data Strategy
    Clearspring Technologies

  3. Brian Clifton, June 2, 2009 at 11:11 a.m.

    Good article Bob - A couple of points for consideration:

    1. Whilst it is true the attribution model of all web analytics vendors is not perfect when it comes to identifying all possible touch points a visitor may have before becoming a customer, this issue in itself is not as significant as you might think. Let me explain...

    The example given by Brian McAndrews is an over simplification:
    "if a customer sees a banner promoting a product on MSN and watches a related video on Time Warner Inc.'s AOL and then searches for the brand on Google before making a purchase, only Google gets credit for the sale."

    If you think about it, that path (MSN -> AOL -> Google) is not a likely visitor journey. A searcher that is an MSN user is much more likely to return to MSN and conduct their brand search, not another search engine. Search engine loyalty is in fact very strong on the web - users search on Google/MSN/Yahoo/Baidu/Naver or whatever they prefer, and rarely (if ever) mix and match their search engines.

    There are in fact much larger issues to consider when it comes to web analytics accuracy (see the whitepaper link in point 2).

    Firstly, cookies are used by all the major vendors as the method for tracking visit activity. These can get lost, blocked or deleted by users. Research has shown that after a period of four weeks, nearly one third of tracking cookies are missing, which means the visitor referral history is lost.

    The second is a much larger problem and is due to how many devices people use to access the web. For example, consider the following scenario:

    You and your spouse are considering your next holiday. Your spouse first checks out possible locations on your joint PC at home and saves a list of website links.

    The next evening you use the same PC to review these links. Unable to decide that night, you email the list to your office and the next day you continue your holiday checks during your lunch hour at work and also review these again on your mobile while commuting home on the train.

    Day three of your search resumes at your friend’s house where you seek a second opinion. Finally you go home and book online using your shared PC.

    The above scenario is actually very common – particularly if the value of the purchase is significant, which implies a longer consideration period and the seeking of a second opinion (spouse, friends, work colleagues).

    Simply put, there is not a web analytics solution in the world that can accurately track this scenario, nor is there likely to be in the near future.

    When you consider this, the finer detail of an exact attribution model is less important. While of course this is not perfect, it is still light years ahead of measurement in the off-line world. So it's not such a big deal for marketers to get their heads round...

    2. For an in depth whitepaper on all web analytics accuracy issues, please take a look at: www.advanced-web-metrics.com/accuracy-whitepaper/ (it also references the Stone Temple report)

    BTW, Jim Sterne does not work for Stone Temple unless there is another one (http://www.linkedin.com/in/jimsterne). Surely there can only be one Jim Sterne....!

    Best regards, Brian Clifton
    Author: Advanced Web Metrics with Google Analytics
