This week marked the launch of the first eMetrics Summit in Toronto, Canada, hosted by the Web Analytics Association. It kicked off with a panel discussion, led by Chris Williams of Media Contacts Canada, on the various reporting and audience measurement tools available to the Web analyst and other data consumers within the organization. We have all discussed this topic before, but it was interesting to have two Web analysts (myself and Judah Phillips) side by side with the president of comScore Canada, Brent Bernie, and the president of IAB Canada, Paula Gignac. The panel quickly dove into the struggles Web analysts face when reporting throughout their organizations.
So how should Web analysts use audience measurement reports, ad-server data and Web analytics tools to do their jobs? The conflicts that arise from combining these tools to answer business questions are sometimes harder to explain than the numbers themselves. Knowing that the three data sources will never match each other, when and why do you use each tool? And how do you explain to your internal constituents (usually your manager, other data consumers or the CMO) variances that are all but irresolvable?
Here is what we learned:
· Web analytics tools give you granular data about what is going on within your own site. If you know that your site is tagged correctly and you understand the filters applied to your data, there is a ton of valuable information to be gleaned from your Web analytics tool. This is your tactical weapon for understanding how visitors traverse your site and the only way to compare your SEM, SEO, display advertising and affiliate programs side by side.
· Ad server data, provided by your agency or through a direct relationship with a publisher, dictates how advertising transactions are billed between an advertiser and a publisher. Impression and click data are going to vary from what your Web analytics tool is telling you (whether your tag sits side by side with the ad tag or you are measuring click-throughs from a campaign code back to your own Web site). This is due to three primary issues:
o Latency -- The ad tag will fire long before your Web analytics or widget analytics impressions or views are captured, due to the simple fact that there is an order of operations for how these requests are made. Milliseconds can make a huge difference when we know that many visitors leave pages before they ever fully load. There is nothing we can do about this except educate others and understand that latency exists.
o Filtering -- Variances in how tools filter are never well documented or understood by the marketers and management who rely on their output (hmm...topic for a future post?). At a high level, you can filter by robot and spider lists, IPs (internal, external or extraterrestrial), HTTP status codes, odd behavioral rules, page names and content types. No two vendors do it exactly the same -- and good luck asking them to clearly define what those rules are.
o Environmental hazards -- Welcome to the distributed Web! There is a laundry list of what I like to call "environmental hazards" that contribute to the variances between data sources. Some tags are blocked by personal/corporate firewalls and anti-spyware software these days (and some of these specifically target ad servers as well as Web analytics tools). Keeping up with the existing and emerging hazards is like keeping up with the Kardashians: one would like to be fully in the know -- but it just is not going to happen.
· Audience measurement tools like comScore allow you to evaluate your site against your competitors. Because they are based on panel data, they won't give you the in-depth analysis that your Web analytics tool does, but they WILL let you know how you stack up against the competition. Audience measurement tools also provide deep insights into demographics like age, gender, household income and geography.
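To make the filtering point above concrete, here is a minimal Python sketch of how two tools with different filter rules can report different totals from the exact same raw hits. Every hit, bot string, IP range and rule here is invented for illustration -- real vendors' rules are far more elaborate and rarely disclosed:

```python
# Hypothetical sketch: the same raw hits, run through two tools'
# assumed filter rules, yield different "traffic" totals.

RAW_HITS = [
    {"ip": "66.249.66.1", "user_agent": "Googlebot/2.1", "status": 200},
    {"ip": "10.0.0.15",   "user_agent": "Mozilla/5.0",   "status": 200},  # internal IP
    {"ip": "203.0.113.7", "user_agent": "Mozilla/5.0",   "status": 200},
    {"ip": "203.0.113.9", "user_agent": "Mozilla/5.0",   "status": 404},  # broken page
]

BOTS = ("Googlebot", "bingbot", "Slurp")  # toy robot/spider list

def tool_a(hits):
    """Tool A filters only known robots by user agent."""
    return [h for h in hits
            if not any(b in h["user_agent"] for b in BOTS)]

def tool_b(hits):
    """Tool B also filters internal IP ranges and non-200 responses."""
    return [h for h in hits
            if not any(b in h["user_agent"] for b in BOTS)
            and not h["ip"].startswith("10.")
            and h["status"] == 200]

print(len(tool_a(RAW_HITS)))  # 3
print(len(tool_b(RAW_HITS)))  # 1
```

Same four hits in, two different answers out -- which is exactly the variance the panel was describing, before latency or blocked tags even enter the picture.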
So where does that leave the analyst? Bottom line: understand what each data source offers. These are all tools in your tool chest, each serving a specific purpose and answering discrete business questions. One thing all the panelists agreed on is that you have to keep it simple. You can confuse your executive management and the managers you support very quickly if you take them into the weeds -- so don't. If they question the numbers, and they will, educate them at a high level on the merits of each source. Keep it at the bullet level -- and make sure you know your stuff and can deliver the message with confidence. Because if you don't understand it, they won't understand it, either.
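As a footnote to the side-by-side channel comparison mentioned earlier, here is a hedged Python sketch of how campaign-coded landing URLs might be tallied per channel inside an analytics tool. The `cid` parameter name, the channel labels and the sample URLs are all assumptions for illustration, not any vendor's actual convention:

```python
from collections import Counter
from urllib.parse import parse_qs, urlparse

# Hypothetical landing-page URLs, each tagged with a campaign code.
LANDING_URLS = [
    "https://example.com/?cid=sem-brand",
    "https://example.com/?cid=sem-brand",
    "https://example.com/?cid=display-q4",
    "https://example.com/?cid=affiliate-01",
    "https://example.com/",  # untagged: SEO/organic or direct traffic
]

def channel_counts(urls):
    """Tally visits per campaign code so channels compare side by side."""
    counts = Counter()
    for url in urls:
        qs = parse_qs(urlparse(url).query)
        counts[qs.get("cid", ["(untagged)"])[0]] += 1
    return counts

print(channel_counts(LANDING_URLS))
```

Consistent campaign coding like this is what lets the Web analytics tool line up SEM, display and affiliate traffic in one report -- something neither the ad server's billing data nor a panel-based tool can do for you.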