Indeed, in many organizations you need a PhD in statistics to understand analytics; yet the people making the business decisions -- the CMOs, CFOs and media planners -- lack the benefit of years of training in advanced analytics. Consequently, a major gap exists between the researchers and the decision makers. To bridge that divide, we tell our clients to establish processes and guidelines for translating analytics into easy-to-use business insights. One key component is having that rare breed of person who can see things from both sides of the fence -- the analytics side and the business side -- to bring all the insights together. Unless this is done, the people who make decisions won't understand the insights gleaned from the analytics, or even the analytics themselves. The net result is that they are wary of the analytics and certainly not willing to "bet the farm" on them.
Another challenge is misinterpretation. Unless information is presented in an appropriate context, it can too easily be misread. In fact, we've seen companies make the wrong decisions because they misconstrued the analytics. This can damage the credibility of an analytics program and, worse still, hurt the business itself.
Beyond this, an analytics program becomes more complicated when you add numerous projects into the mix, which in some cases provide conflicting insights. We've seen companies invest large sums of money in multiple analytics programs only to throw up their hands and use none of them because they didn't understand the results.
Leading companies, however, are putting in place the necessary processes and systems to make all this work. Here are some lessons learned that will help improve your analytics program.
Lesson 1: Gain agreement on the objectives and scope of the analytics
All too often companies fall into the trap of doing analytics for the sake of doing analytics. Before the analytics team rolls up their sleeves, they need a clear definition of what decisions they will support, what business questions will be answered, and with what level of specificity. That will help ensure the analytics are used.
Lesson 2: Set and manage stakeholder expectations
When launching an analytics program, a company needs to be sure all stakeholders know upfront what can and cannot be done, and the level of precision to expect. As a program gains support, it is easy for its focus to blur and its scope to expand so much that it never gets completed. To prevent "analytics bloat," establish a program's parameters at the get-go, and stick to them.
Lesson 3: Clearly define roles, responsibilities and communication
We've seen too many situations where a study is completed only to have one senior-level person stand up, point a finger, and say, "The data you used is wrong," or "Why didn't we answer this question?" To prevent these midstream naysayers, it is imperative that all the right people be involved at the right points in the process, providing appropriate input, review and approval.
Lesson 4: Keep the math and analytics simple and transparent
Most decision makers aren't well-trained in statistical analysis, so you need to find creative ways to help them understand how the analytics work and why they should trust them. Organizations that do this well provide full transparency into the analytic process, including the strengths and weaknesses of the results.
Lesson 5: Establish guidelines for interpreting and translating results into actions
Today's analytics are advanced yet complicated, with many moving parts and no user interface. Think about how important ease of use has been in driving the adoption of products such as the iPod and iPhone. Imagine how difficult they would be to use without an interface that lets you work with them quickly and easily. The same is true for an analytics program. The analytics team needs to establish an "analytics user interface" that delivers the right information to the decision makers at the right time, with appropriate rules and guidelines for translation, interpretation and usage. The keys are ease of use and context.
Establish a decision process
To formalize the process, establish a system to determine when different stakeholders should meet to review results, what information is needed at the meeting, and how to integrate multiple analytics during the planning process and throughout the year.
Doug Brooks, VP, MMA, will be delivering a workshop on this topic at the ANA's Marketing Accountability Conference on July 14-15 in Laguna, CA. http://www.ana.net/events/conferencemtg/MAC-JUL08