It is a shame that, after so many years of advancement in digital media planning and buying, it is still such a pain to pull USEFUL reach- and frequency-related stats from ad servers. Granted, major ad servers like DoubleClick and Atlas all offer reports that provide reach and frequency numbers in some form. Unfortunately, the useful insights you can glean from their standard/free offerings are quite limited. If the standard reports on impressions, clicks, and conversions are the benchmark of what a campaign report should look like, reach and frequency reports fall far short of that standard in two areas:
First of all, standard ad-server reach and frequency reporting does not always allow the kind of slicing and dicing that planners and analysts need to understand campaign performance. Instead, standard reporting in this area generally contains only campaign-level information at canned campaign-to-date or weekly/monthly intervals. The truth of the matter is that measuring campaign effectiveness for optimization purposes should go far beyond how the campaign has been performing at the overall level over a rigid, pre-defined time interval. Good optimization practice is usually preceded by a deeper understanding of how each site, placement, and creative has been faring, so that intelligent decisions can be made and appropriate fine-tuning can be carried out by pulling such media levers as capping frequency, shifting media dollars across sites and placements, or throwing out non-performing creative assets.
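To make that concrete, here is a minimal sketch, assuming a hypothetical log-level impression export with cookie_id, site, and placement columns (the file name and column names are illustrative, not any ad server's actual schema), of the placement-level reach and frequency breakdown planners actually need:

```python
import pandas as pd

# Hypothetical log-level export: one row per impression, with cookie_id,
# site, and placement columns (illustrative names, not a real schema).
logs = pd.read_csv("impression_log.csv")

# Unique cookies serve as a cookie-based proxy for reach; impressions
# divided by reach gives average frequency per site/placement.
rf = (
    logs.groupby(["site", "placement"])
        .agg(impressions=("cookie_id", "size"),
             reach=("cookie_id", "nunique"))
        .assign(avg_frequency=lambda d: d["impressions"] / d["reach"])
        .sort_values("avg_frequency", ascending=False)
)
print(rf.head(10))
```

Nothing in this sketch is exotic; the point is simply that this cut of the data is what the standard reports rarely expose.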
Secondly, one of the most important aspects of reach and frequency reporting (if not THE most important aspect) is knowing and understanding the frequency distribution, as well as the impact of that distribution on conversion. Most of the time, ad-server reports on frequency distribution suffer from much the same rigidity as the regular overall reach report: almost none of them go beyond the overall campaign level. Even though frequency distribution in theory provides one of the most actionable metrics for campaign optimization, for reasons beyond the scope of this small piece, frequency distribution at the overall campaign level is only something interesting to look at. To make it actionable, we would need at least site-level information.
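As an illustration of what a more useful frequency-distribution view might look like, here is a minimal sketch, again assuming hypothetical log-level impression and conversion exports (the file names, cookie_id and site columns are made up), that buckets exposures per cookie at the site level and looks at conversion rate within each bucket:

```python
import pandas as pd

# Hypothetical exports: one row per impression, one row per converting cookie.
logs = pd.read_csv("impression_log.csv")
convs = pd.read_csv("conversion_log.csv")

# Exposures per cookie per site.
freq = (logs.groupby(["site", "cookie_id"])
            .size()
            .rename("exposures")
            .reset_index())

# Naive attribution: a converting cookie counts for every site that touched it.
freq["converted"] = freq["cookie_id"].isin(convs["cookie_id"])

# Bucket exposures and compute conversion rate within each bucket, per site.
freq["bucket"] = pd.cut(freq["exposures"],
                        bins=[0, 1, 3, 6, 10, float("inf")],
                        labels=["1", "2-3", "4-6", "7-10", "11+"])
dist = (freq.groupby(["site", "bucket"], observed=True)
            .agg(cookies=("cookie_id", "nunique"),
                 conv_rate=("converted", "mean")))
print(dist)
```

The attribution here is deliberately naive; the point is only that the cut has to exist at the site level before it can drive any optimization decision.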
The lack of more flexible reach and frequency reporting is particularly troublesome because reach and frequency are supposed to be the kings of media planning; at least that has been the case in the traditional TV channel for a very long time. After all, the most powerful measurement in the TV market is the GRP, which is a composite of reach and frequency.
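For readers less familiar with the TV side, the arithmetic behind GRPs is simple: reach, expressed as a percentage of the target audience, multiplied by average frequency. The numbers below are made up purely for illustration:

```python
# GRPs = reach (% of target audience) x average frequency.
audience_size = 10_000_000        # people in the target audience (illustrative)
unique_reached = 4_500_000        # unique people exposed at least once
impressions = 13_500_000          # total exposures delivered

reach_pct = 100 * unique_reached / audience_size      # 45.0
avg_frequency = impressions / unique_reached          # 3.0
grps = reach_pct * avg_frequency                      # 135.0 GRPs
print(f"{reach_pct:.1f}% reach x {avg_frequency:.1f} frequency = {grps:.0f} GRPs")
```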
No one in the media community would dispute the fact that reach and frequency are important. Hence it is a little disquieting that in digital media, where everything on the measurement front is supposed to be better than in traditional media thanks to the availability of data, we cannot even optimize campaigns on the very metrics that have proven most effective. This has led, for instance, to the under-utilization of a powerful media lever, frequency capping, which is currently either implemented based mostly on some kind of gut feeling or not implemented at all.
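To show how frequency capping could be grounded in data rather than gut feeling, here is one illustrative heuristic (not a standard ad-server feature, just a sketch): cap at the exposure level where the marginal gain in conversion rate drops below a threshold. The conversion rates and the threshold are hypothetical:

```python
# Illustrative heuristic only: pick a frequency cap at the exposure level
# where the marginal lift in conversion rate falls below a chosen threshold.
# In practice these rates would come from a site-level frequency report.
conv_rate_by_freq = {1: 0.0010, 2: 0.0016, 3: 0.0021, 4: 0.0023,
                     5: 0.0024, 6: 0.0024, 7: 0.0024}

MIN_LIFT = 0.0002  # marginal conversion-rate gain worth paying for (assumed)

cap = max(conv_rate_by_freq)  # default: no effective cap
prev_rate = 0.0
for freq, rate in sorted(conv_rate_by_freq.items()):
    if freq > 1 and rate - prev_rate < MIN_LIFT:
        cap = freq - 1
        break
    prev_rate = rate

print(f"Suggested frequency cap: {cap}")  # prints 4 with the numbers above
```

A real decision would also weigh the cost of the incremental impressions, but even this crude rule depends on a frequency distribution that standard reports do not provide below the campaign level.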
I know that, as I make my argument here, there will be people in our community eager to discourage me from using ad-server reports because of some well-known issues with cookie tracking: cookie deletion and counting computers vs. counting people. I am well aware of the problems associated with cookie-based reporting. However, unless online panels are large enough to deliver statistical significance for every nuance of the buy, we will probably have to live with the lesser evil for in-flight campaign optimization. After all, online panels in their current shape, though they can be powerful in driving a better media plan, are still inadequate for churning out performance reporting for optimization.
Also, in pointing out that ad servers have so far done an inadequate job with reach and frequency numbers, I am NOT trying to trivialize the difficult process they have to go through to produce such stats. I fully recognize that reach and frequency numbers require far more computational resources than simple impression and click tallies, even with the current sampling approach. However, it is also a fact that computation has become cheaper and faster over the past 10 years, while we have seen only glacial improvement in reach and frequency reporting. In the history of business, there are plenty of instances in which business decisions have been held hostage to engineering priorities. Hopefully we, on the business side of media, will be seeing some daylight at the end of the tunnel very soon.