Commentary

Why Web Analytics Tools Fail

In 2009, Web analytics managers have a multitude of tools to choose from when deciding what to deploy at their corporations.  Offerings from industry leaders such as Omniture, WebTrends, Unica, CoreMetrics, Google, and Yahoo are among the most popular, while options from smaller players like ClickTracks and Woopra exist as well.  In theory, you deploy a tool, customize it to fit your needs, and start analyzing the reports -- and it all goes swimmingly, right?

Then why have many corporations already chewed through two, maybe even three tools over the last several years, or deployed multiple tools in parallel, in an attempt to arrive at where they need to be -- delivering comprehensive and systematic analysis to their business community, helping to drive action from insight, and taking the mantras of "competing on analytics" and a "data-driven culture" to the next level?  Several factors create a disconnect between the promise of a tool and its successful use, and any of them can cause a tool to fail:

Inability to customize to business needs.  As sites adopt more and more AJAX, Flash, and Web 2.0 technologies like video, social media, and RSS, many Web analytics tools lack the features necessary to track these new media.  The catalyst for change comes when the business wants to track events on the site and can't with the current tool, so the company begins to search for a new tool that can.
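As a rough illustration (the trackEvent helper and the /collect endpoint below are hypothetical placeholders, not any specific vendor's API), tracking rich-media interactions means wiring explicit event calls into the page, because a page-view-based tool never sees a video play or an AJAX click on its own:

    // Hedged sketch: a generic event beacon for AJAX/Flash/video interactions.
    // "trackEvent" and the "/collect" endpoint are hypothetical placeholders.
    function trackEvent(category: string, action: string, label = ""): void {
      const params = new URLSearchParams({ category, action, label });
      new Image().src = `/collect?${params.toString()}`; // fire-and-forget beacon
    }

    // The business wants video plays measured, so the page must report them explicitly:
    document.querySelector("video#promo")?.addEventListener("play", () =>
      trackEvent("Video", "Play", "Spring promo")
    );

If the deployed tool offers no supported way to capture and report events like these, that gap alone is often enough to start a tool search.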

Training.  A corporation must hire or train people who understand how to use a tool.  It doesn't always follow that because someone knows how to use Tool X, they can easily move over to Tool Y.  If a corporation doesn't budget both the time and money to extend its team's ability to use a tool, the tool will not be wielded effectively and will wither on the vine.  It's important to allocate resources to keep your staff's training current; otherwise, the tools you have become effectively useless because no one can employ them well, which leads to the exploration of alternatives and the purchase of yet another tool.

Lack of analytical resources.  Not a tool problem per se, but this issue shows up as an inability to respond quickly and nimbly to business requests to extend the tool, provide data, or, worse yet, analyze the data.  If a company can't dedicate sufficient resources to using and extending a tool and analyzing the data it collects, the business can quickly conclude -- rightly or wrongly -- that the tool has little to no value, and seek alternatives.

Too much aggregated data.  Most Web analytics tools provide cumulative sums of data at the visit level.  They will tell you "you have had X instances of Y."  What most tools won't tell you is how a particular visitor, or a segmented group of visitors, behaves on the site.  For companies that want to run targeted email campaigns based on visitor-level data, or evaluate ad campaign performance on a per-creative basis using visitor behavior, many Web analytics tools simply can't meet that business requirement.  Or the tool may require additional applications that weren't purchased to fulfill the vision.
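To make the distinction concrete, here is a toy sketch with fabricated sample data showing the same hits summarized the way most tools report them (an aggregate count) versus the way visitor-level targeting needs them (behavior keyed per visitor):

    // Hedged sketch with fabricated sample data, for illustration only.
    interface Hit { visitorId: string; campaign: string; converted: boolean; }

    const hits: Hit[] = [
      { visitorId: "a1", campaign: "spring-email", converted: false },
      { visitorId: "a1", campaign: "spring-email", converted: true },
      { visitorId: "b2", campaign: "spring-email", converted: false },
    ];

    // What most tools give you: "you have had X instances of Y."
    const totalConversions = hits.filter(h => h.converted).length; // 1

    // What visitor-level email targeting or per-creative evaluation needs:
    // behavior rolled up per visitor, so segments can be built and acted on.
    const convertedByVisitor = new Map<string, boolean>();
    for (const h of hits) {
      const prior = convertedByVisitor.get(h.visitorId) ?? false;
      convertedByVisitor.set(h.visitorId, prior || h.converted);
    }
    // convertedByVisitor -> Map { "a1" => true, "b2" => false }

A tool that only ever surfaces the first number can't support the second kind of question, no matter how many reports it ships with.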

Inordinate complexity.  The idea that analytics tool deployment and extension is "easy" is something of a joke in analytics circles.  The difficulty and complexity in taking full advantage of a Web analytics tool lies in how you extend it to meet your business needs.  Many tools make it less than intuitive, or in the worst case far too hard, to extend across an enterprise -- from challenges with page tagging, to orchestrating changes to page tags, to QA'ing tags and reports, to building out a custom schema, to configuring and integrating additional applications to deliver against requirements.  When these things go wrong, companies get frustrated and seek alternative solutions, often abandoning the tool in the process.
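As one small example of where the complexity bites, even a toy tag-QA pass -- the URLs and "tag.js" signature below are hypothetical -- has to crawl every template and confirm the tag actually made it onto the page:

    // Hedged sketch: a toy audit that checks whether pages carry the tag snippet.
    // The URLs and "tag.js" signature are hypothetical; real enterprise audits
    // also have to verify variables, custom schema values, and per-template rules.
    const pagesToAudit = ["https://www.example.com/", "https://www.example.com/products"];
    const tagSignature = "tag.js";

    async function auditTags(): Promise<void> {
      for (const url of pagesToAudit) {
        const html = await (await fetch(url)).text();
        const status = html.includes(tagSignature) ? "tag present" : "TAG MISSING";
        console.log(`${url}: ${status}`);
      }
    }

    auditTags();

Multiply that check across hundreds of templates, release cycles, and teams, and it's easy to see how the "easy" deployment becomes anything but.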

Many other reasons exist, of course, for why tools fail (cost, infrastructure, data availability, and so on), but I think the five issues above are some of the primary drivers of tool change in the corporate world.  But now I ask you, the experts: why has your Web analytics tool failed you and caused you to switch to another solution?

2 comments about "Why Web Analytics Tools Fail".
  1. Matt Biskup from AdEngine, LLC, April 9, 2009 at 3:20 p.m.

    Sometimes you need multiple tools. Analytics doesn't do full path analysis. For that you need Urchin. Or Omniture SiteCatalyst, but Omniture's tracking parameters are different from Analytics, so if you're using GA tracking parameters, good luck using them in Omniture. And of course there's always the original home-grown system, which everybody has. All of these tools are subject to a valid IT install and setup; that's yet another point of failure that doesn't really have anything to do with the statistics provider itself.

  2. John Lane, April 21, 2009 at 11:45 a.m.

    As many times as not, I feel the failure happens before the tool is decided on. No matter which tool you use, web analytics reports will fail if you haven't defined a metric for success (or failure) to measure against.
