Challenging Web Analytics As 'The Little Engine That Could'
A few weeks ago I had the pleasure of giving one of the keynotes at the Web Analytics Congress in Amsterdam. Sandwiched between my regular work and biking around the city, the conference proved to me once again that creating a data-driven culture from Web data and other data sources is a global concern. The challenges in collecting, measuring, reporting, and analyzing Web behavioral data transcend cultures and geographies: the lack of standardization, complaints about ineffective tools, and concerns about how to analyze data and create meaningful insights. We're all in this space together, sharing the same challenges and, hopefully, overcoming them.
The main theme of my presentation was that it takes effort to get Web analytics "right" and that the discipline is, metaphorically speaking, the "little engine that could" amidst a garage full of other, more traditional and established, bigger engines that can't. You know those engines: the huge transactional business intelligence systems with their seemingly unintelligible schemas, failed ETL jobs, and high-latency cubes and, consequently, reports. Or the enterprise data warehouse that the tech team built without agreeing on standard data definitions with the business, so business units have carved it into functional data marts stovepiped across your organization. And then there is your Web analytics tool, the best of which can integrate with many other systems both inside and outside your enterprise, help drive your digital channel, and, at the pinnacle, enable automated interactive marketing.
The buck does not stop with Web analytics by any means. Web analytics tools can fail, and Web analytics systems fall victim to the same organizational, governance, political, definitional, and technical challenges that have encumbered other business and technical initiatives for deriving insight from corporate data.
The high-level challenges are numerous and include:
· Lack of high-quality data. By that I mean standard data definitions agreed upon by stakeholders, practiced with diligence, and enforced by your organization (call it data governance on the business side and master data management on the technical side). In Web analytics, many of the standard "Big 4" metrics (visitors, visits, page views, time spent) are already defined, but there are all sorts of nuances to these data. Sure, folks like ABCe can help ensure you are counting them, um, "correctly." But don't look to auditing firms to help you count "custom metrics" -- the business-specific numerators and denominators in your corporate KPIs. These are much harder to define, collect, report, and keep high-quality across your business.
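To make the custom-metric point concrete, here is a minimal sketch (in Python, with hypothetical numbers and field names of my own invention) of why a custom KPI is only as good as the agreed-upon definitions behind its numerator and denominator:

```python
# Hypothetical illustration: the "Big 4" metrics have standard definitions,
# but a custom KPI such as "newsletter signups per visit" depends entirely
# on how the business defines both the numerator and the denominator.

def custom_kpi(numerator: float, denominator: float) -> float:
    """Compute a business-specific KPI as a simple ratio."""
    if denominator == 0:
        return 0.0  # avoid dividing by zero when no activity was recorded
    return numerator / denominator

# Assumed sample data; real values would come from your analytics tool.
big4 = {"visitors": 12000, "visits": 18000,
        "page_views": 64000, "time_spent_min": 52000}
newsletter_signups = 540  # a custom metric: its definition must be agreed upon

signup_rate = custom_kpi(newsletter_signups, big4["visits"])
print(f"Newsletter signups per visit: {signup_rate:.1%}")
```

The arithmetic is trivial; the governance is not. If one business unit counts signups per visit and another per visitor, the "same" KPI yields different numbers, which is exactly the definitional problem described above.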
· Company resistance. Ask a company's principals whether they want to be data-driven and they'll say yes. But is that truly in the corporate culture and DNA? Senior leaders may want to be data-driven, but do the line-of-business people? Does the company tend to be "data fearful," "data reactive, not proactive," or "data self-promoting"? People like the data until they don't; then it's wrong or inaccurate. And at this point in the Internet's evolution, Web data has plenty of issues, from the eternal "cookies vs. people" argument to doubts about proprietary statistical models based on population sampling or "hybrid" data combination (whether accredited or not). When that data disbelief inflates into an enterprise-wide concern, the friction makes creating or optimizing a data-driven culture that much harder.
· Lack of leadership around data and analytics. Ownership of the data and analytics shouldn't be a technical function. Data and analytics leadership, governance, and stewardship are business functions. Technology is an important component for managing the systems that collect, create, and present data, but it can succeed only under business guidance (not the other way around). Companies succeeding with analytics establish business leadership around data and analytics. Leaders advocate for data and analytics usage, form cross-functional business relationships that enable shared responsibility for analytics, and create the data and analytics "ecosystem" necessary to deliver insight. They are tasked with focusing on data and evaluated on delivering results from it.
· Failure to define goals. One of the hardest activities in analytics is getting people to agree on goals. Goals create transparency and accountability, and businesses and the people within them have different levels of comfort with both. Failure to define goals means less (and in the worst case, no) context for analysis. Smart companies define and manage toward goals. Analytics goals should not generate fear; they are something to strive toward, and a source of corporate motivation to lead with analytics.
· The resource issue. No corporate function is ever adequately resourced, but when I speak to analytics managers, the shortage in analytics functions seems especially severe. There are 1,500 people worldwide in the Web Analytics Association. There are hundreds of millions of Web sites. Do the math! The shortage is severe, and in Web analytics there is a long tail of beginners. Database marketers and other research professionals also say they "can't find people." If you are at a company that wants Web data and analytics, you need to staff effectively and cherish those workers, because as the economy thaws, companies that haven't nurtured and cared for those resources will inevitably lose them, exacerbating an already-existing problem.
· Never enough time. Like resources, there's never enough time, and the issue is acute in Web analytics. The best Web development lifecycle processes allow enough time for Web analytics; the worst ones don't. Think of the Web analytics value chain: from business questions and site goals, to data definitions, to tagging and site instrumentation, to tag testing, to report creation, to report testing, to reporting, to analysis, to insights and recommendations, to site optimization. That's a long, complex value chain, and many external factors can prevent success at each step. But the main offender may be disconnected corporate processes that don't appreciate the importance of allowing enough time to derive analytics.
Solutions exist to all these challenges. I'll discuss those I consider necessary to move the "little engine that could" over that huge mountain in a future column. Until then, what challenges and solutions have you encountered during your career when doing analytics and trying to create a data-driven culture?