To start, analytics are not an audit. An audit involves a series of checks and balances, beyond simply quality control, that tests data for accuracy. Further, audits provide standardized metrics
and methodology, consistency of process, and transparency of results.
Traffic analytic tools certainly have their place as an important business resource, but they do not produce audited
data. From their inception, reputable companies providing analytic tools have never claimed to be auditors, but they have consistently positioned themselves as "third party."
In the
world of media, "third party" equates to "auditor." This has caused misunderstanding in the marketplace, whereby analytic data has been accepted as audited data.
This is not to disparage
analytic tools. They do a very good job providing actionable information to help one manage a site and improve performance. But analytic tools were never intended to produce the audited traffic data
upon which an ad buy/sell decision is made. And buyers of online media need to be aware that different tools produce different results.
Standardized Metrics and Methodology
Two years ago, a white paper was issued by Stone Temple Consulting entitled "Web Analytics Shootout." It is a comprehensive look at many of the popular analytic tools and explains why
they produce different results.
An excerpt:
Web analytics packages, installed on the same web site, configured the same way, produce different numbers. Why?
1. By far the biggest source of error in analytics is implementation error. There are dozens (possibly more) implementation decisions made in putting together an analytics package that affect
the method of counting used by each package.
2. Placement of JavaScript on the site.
3. Differences in the definition of what each package is counting.
The way that analytics packages count visitors and unique visitors is based on the concept of sessions. There are many design decisions made within an analytics package that will cause it to count
sessions differently, and this has a profound impact on the reported numbers.
Makes sense, right? But it's also scary, because it means data from a single site can be
inaccurate for several reasons. It also means data from different sites are not comparable, since there is no way to tell whether these factors are the same across all the sites measured.
Let's look at a specific example of Point #3 above using sessions and duration as the metrics.
Using analytics Package A, a session begins when a visitor arrives at a site and ends when the visitor
leaves. The session also ends after 30 minutes of inactivity from that visitor. So if a visitor arrives at a site, stays 5 minutes, leaves, and returns 10 minutes later for another 5 minutes,
the package will report two sessions with a duration of 5 minutes each.
Using analytics Package B, a visitor exhibiting the same activity pattern (5 minutes on the site, 10 minutes
away, 5 minutes back) will be reported as one session for 20 minutes. That's because this package allows any visitor returning within 30 minutes to count as part of the original visit.
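To make the gap concrete, here is a minimal sketch in Python (not any vendor's actual code) of the two counting rules applied to that same visitor. The 30-minute timeout, the visit intervals, and the rule names are illustrative assumptions.

# Visits are modelled as (start, end) intervals, in minutes of on-site activity.
# The visitor above: 5 minutes on site, 10 minutes away, 5 minutes back.
visits = [(0, 5), (15, 20)]

TIMEOUT = 30  # minutes of inactivity that always closes a session

def sessions_rule_a(visits):
    """Rule A: leaving the site ends the session, so each visit is its own session."""
    return [(start, end - start) for start, end in visits]

def sessions_rule_b(visits, timeout=TIMEOUT):
    """Rule B: a return within `timeout` minutes is folded back into the original session."""
    sessions = []
    cur_start, cur_end = visits[0]
    for start, end in visits[1:]:
        if start - cur_end < timeout:   # gap is short enough to keep the session open
            cur_end = end
        else:                           # gap too long: close the session, start a new one
            sessions.append((cur_start, cur_end - cur_start))
            cur_start, cur_end = start, end
    sessions.append((cur_start, cur_end - cur_start))
    return sessions

print(sessions_rule_a(visits))  # [(0, 5), (15, 5)]  ... two sessions of 5 minutes each
print(sessions_rule_b(visits))  # [(0, 20)]          ... one session of 20 minutes

Same visitor, same behavior, two very different reports. That is exactly the discrepancy a buyer has to untangle.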
For buyers of online media, this poses a significant problem: how does one reconcile the difference between Package A and Package B in order to evaluate activity and make the best possible buy?
Consistency of Process
A further variable in analytic packages is that the user of the package controls many of the processing functions. For example, the
user can control the degree to which filters are set to exclude mechanical traffic from spiders/robots. The user also controls whether to set any filters at all.
A generally accepted
best practice is to filter out the spiders/robots listed by the Interactive Advertising Bureau (IAB). Analytic tools certainly have the capability to filter
according to the IAB list -- but to what extent? With the user of a tool controlling the filter settings, traffic results can be manipulated. Unless all sites follow a standardized process of
applying filters, as occurs in an audit, results can be questionable and are clearly not comparable.
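The effect of those settings is easy to demonstrate. Below is a minimal sketch, assuming an invented three-entry bot list rather than the actual IAB Spiders & Bots file (which is licensed and far more extensive), showing how the same raw log yields different traffic totals depending on how the site operator configures the filter.

KNOWN_BOTS = {"googlebot", "bingbot", "ahrefsbot"}  # illustrative entries only

def filter_hits(hits, apply_filter=True, bot_list=KNOWN_BOTS):
    """Drop hits whose user agent matches the bot list, but only if the
    site operator has chosen to turn the filter on."""
    if not apply_filter:  # the operator decides whether any filtering happens at all
        return hits
    return [h for h in hits
            if not any(bot in h["user_agent"].lower() for bot in bot_list)]

raw_hits = [
    {"user_agent": "Mozilla/5.0 (Windows NT 10.0)"},            # a person
    {"user_agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},  # a crawler
    {"user_agent": "Mozilla/5.0 (compatible; AhrefsBot/7.0)"},  # a crawler
]

print(len(filter_hits(raw_hits, apply_filter=True)))   # 1 hit: bots removed
print(len(filter_hits(raw_hits, apply_filter=False)))  # 3 hits: same log, bigger "traffic"

One site applying the filter and another skipping it will report very different numbers from comparable audiences, which is why an audit's standardized process matters.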
Transparency of Results
One function of an
independent media auditing firm is to make results publicly available. This is typically done through the auditing firm's Web site, which buyers of online media can access to identify those
audited sites in a specific vertical market. The availability of audited traffic data in a single location is a benefit to buyers, as it provides an easy-to-use resource that makes the search process
quicker and more efficient.
In conclusion, here's another excerpt from the Web Analytics Shootout:
Don't get hung up on the basic traffic numbers. The true
power of web analytics comes into play when you begin doing A/B testing, multivariate testing, visitor segmentation, search engine marketing performance tracking and tuning, search engine
optimization, etc.
Not a single mention in the paper of using the data for ad selling/buying. And that's because analytic tools were never intended for this purpose. They are
intended to help one better manage a site and they do a good job of that. But for an actual audit of traffic data, only a truly independent media auditor provides standardized, reliable data upon
which a media evaluation can be made.