Commentary

The Spread: Return On Investment Is More Noise Than Substance

The media industry's claim that it can measure advertising and marketing ROI looks to be more noise than substance.

Despite voices to the contrary, an assessment of our actions, as evidenced in marketing and advertising campaigns, plans, empirical tests and validations, reveals that finance directors, CEOs and company boards are right to conclude that marketing and advertising effectiveness measures just don't stand up to scrutiny.

Support for this conclusion is my 15-year accumulation of evidence, from 1993 to 2007. In that time I have been intimately involved in reviewing the value for money, efficiency and effectiveness of media buying - initially confined to the UK, but ultimately across Europe, and more recently in the USA and Asia.

I have enjoyed access to the marketing strategies, business plans, and campaign and media plans of a large number and wide range of domestic and international advertisers, working with all the major media agencies.


As a judge at advertising effectiveness awards, I have reviewed the rigour brought to establishing, measuring and validating advertising and media effects.

This gave me a rich insight into the current convention and practice of measuring the effects of advertising media activity.

A range of experience, from scientific research to pragmatic marketing operations, highlights four key requirements that must be met if inputs and outputs are to be quantified and validated.

My findings are that these requirements are hardly ever applied to the measurement, evaluation and attribution of the relationship between advertising (the input) and business performance (the output), however specified.

First, a plan needs to embrace an understanding of the possible and likely outputs that may emerge from the inputs. Without these, no research design can possibly pick up and attribute all possible outcomes. Most plans don't have these. So an inconclusive result - the worst of all possible outcomes - is inevitable.

Second is the need to start and continue the marketing campaign until the planned effect starts to occur. I have found that advertising is regularly terminated before any effects can emerge. This is why confusion over longer-term and shorter-term effects arises.

Third, and among the most basic ingredients, is the necessity of a placebo input - a control cell in which the advertising variable is absent. This allows the plan to eliminate noise and other distracting random events and effects. We have hardly ever seen a plan in which such a non-advertising control was embraced.

Fourth is a design in which the direction of the relationship between variables can be established. The execution must go beyond correlation to reach cause-and-effect conclusions in a way that is as unambiguous as possible. Such plans rarely occur. Indeed, it's more likely to be the other way around, with sales in year one determining advertising budgets in year two!
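
To make these requirements concrete, below is a minimal sketch in Python of a holdout design. The market counts and sales figures are simulated purely for illustration; this is a sketch of the structure, not anyone's actual test.

```python
import random
import statistics

# Minimal sketch of a holdout design: some markets receive the campaign
# (the input is deliberately varied), others form the placebo/control cell,
# and the planned output (a sales lift) is specified before the test runs.
# All figures below are simulated for illustration only.
random.seed(1)

control_markets = [random.gauss(100, 10) for _ in range(20)]   # no advertising
test_markets    = [random.gauss(106, 10) for _ in range(20)]   # campaign runs

lift = statistics.mean(test_markets) - statistics.mean(control_markets)
lift_pct = 100 * lift / statistics.mean(control_markets)

print(f"Observed lift vs. control: {lift_pct:.1f}%")
# Because the only systematic difference between the cells is the advertising
# input, a sustained lift can be attributed to it rather than merely
# correlated with it. Without the control cell, that attribution is impossible.
```

The arithmetic is trivial; the structure is the point: a predefined expected output, a control cell, and an input that is varied on purpose.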

A common thread here is the necessity for media plans to include variation in their inputs. With the input constant and fixed, there is no chance whatsoever of attributing outputs to inputs.

The media plans we have reviewed share a common failing: they contain no such variation.

There is an absence of any constructs such as:

*Up-weight dollars vs. norm-weight dollars vs. down-weight dollars.
*Advertising here but not there.
*Media together with price promotions vs. media with norm pricing.
*Solus media vs. mixed media for constant dollars.
*Long ads vs. short ads.
*Big ads vs. small ads.
*Display plus internet vs. internet alone.
*A/B copy splits.
*And so on and so forth. (The list of non-activity is endless!)
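
By way of illustration, the simplest of these constructs, an A/B copy split, needs little more than a response count per cell. The cell sizes and response figures below are invented purely to show the arithmetic - a sketch, not anyone's actual method.

```python
from math import sqrt

# Minimal A/B copy split sketch: the same budget is split between two
# executions and the response rates are compared. All figures are invented.
responses_a, exposed_a = 540, 50_000   # copy A
responses_b, exposed_b = 610, 50_000   # copy B

rate_a = responses_a / exposed_a
rate_b = responses_b / exposed_b

# Two-proportion z-test: is the difference bigger than sampling noise?
pooled = (responses_a + responses_b) / (exposed_a + exposed_b)
se = sqrt(pooled * (1 - pooled) * (1 / exposed_a + 1 / exposed_b))
z = (rate_b - rate_a) / se

print(f"Copy A: {rate_a:.2%}  Copy B: {rate_b:.2%}  z = {z:.2f}")
# |z| above roughly 1.96 suggests a genuine difference at the 95% level,
# i.e. the change in copy (the input) can be credited with the change in
# response (the output).
```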

As if that were not bad enough, once plans are "signed off," nothing within the advertiser's domain is able to change them. No learning and no changes mean no improvement.

Advertising can have long-term and short-term effects. But we have never found a plan that delivered the required long-term effects while also having unsatisfactory short-term effects. Short-term measures are realistic but rarely adopted.

Plans to deliver unambiguous outcomes are hard to define and achieve. Yet it is essential that business undertakes experiments and delivers learnings.

Our evidence is that the experience of a planning phase is hardly ever used to fuel the next phase of the plan.

There is no systematic learning theory being applied in media planning practice. The concept of continuous improvement seems lost from the world of media planning.

And yet all this non-activity comes at a time when the mechanisms for measuring and relating inputs and outputs have never been better. We now enjoy audits of consumer activity. Interactive media deliver opportunities for planning subtlety at the one-to-one level.

Until and unless we build into the art of media planning some of the more rigorous scientific disciplines designed to relate cause and effect and attribute changes in business performance to marketing inputs, return on investment will forever go unmeasured.

The opportunity to deliver a return on marketing and advertising activity has never been greater. It's such a pity that we are losing this chance because of stereotyped, one-dimensional planning that's afraid to stand up and be counted.

Let's now spend more energy on winning the prize and less on the price of failure.
