Regular readers of this column will know that I have repeatedly complained about the annoyance of ads, which I have pointed out can be particularly bad for publishers. Based on some recent findings, I have come to realize that this annoyance is an especially acute problem for mobile advertising.
Reports show that somewhere close to half of all mobile ad taps are accidental. I suspect that if I polled all the readers of this blog, 100% would say that they find mobile ads to be particularly intrusive and annoying, more so than desktop ads. Why? I’d say, intuitively, it’s because mobile ads are much harder to ignore, and because they tend to interfere directly with whatever it is that we are trying to do.
If mobile ads are so annoying, why do mobile advertising budgets keep growing? The primary reason seems to be that mobile ads yield substantially higher CTRs than their desktop counterparts — often 2x-5x higher. But are higher CTRs an accurate indication that mobile ads are more effective?
And that is the elephant in the mobile-advertising room: there is increasing evidence that some mobile ads are so annoying that they go beyond mere ineffectiveness and actually have a negative impact on engagement.
Let’s look at some recent data that support my sensationalistic claim.
About one year ago, xAd released the results of a study done in collaboration with Nielsen and placed.com, covering nearly 80 mobile ad campaigns that totaled some 200 million impressions. The study measured click-through rates (CTRs) as well as secondary-action rates (SARs), such as calling or requesting more information, and store visitation lift rates (SVLs). SAR and SVL are more direct measures of consumer engagement or sentiment than CTR.
The xAd study optimized some of the campaigns for CTR, some for SAR and some for SVL. The results were seemingly paradoxical: the campaigns with the highest CTRs tended to show lower SARs and SVLs.
These results cannot simply be attributed to accidental taps (the “fat finger” effect). Even if we assume that the higher CTR of some ads was entirely due to accidental taps, the number of purposeful taps per impression would be unchanged, and thus the SARs and SVLs — measured against the same base of impressions — should also be unchanged. The fact that SARs and SVLs are actually lower for the higher-CTR ads strongly suggests that consumers are more annoyed by the more intrusive ads, to the point where the net impact is negative. In other words, some mobile ads are so annoying that even consumers who might otherwise have been interested in the offering chose not to engage.
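To make that argument concrete, here is a back-of-envelope sketch with purely hypothetical numbers (not figures from the xAd study): if the only difference between a less intrusive and a more intrusive ad were accidental taps, the more intrusive ad would show a higher CTR but the same number of secondary actions per impression.

```python
# Hypothetical back-of-envelope numbers (not from the xAd study) showing
# why accidental taps alone cannot explain *lower* secondary-action counts.

IMPRESSIONS = 1_000_000
PURPOSEFUL_TAP_RATE = 0.01    # assumed: same for both ad formats
SECONDARY_ACTION_RATE = 0.20  # assumed: share of purposeful taps that convert

def campaign(accidental_tap_rate: float) -> tuple[float, int]:
    """Return (CTR, secondary actions) for a given accidental-tap rate."""
    # CTR counts every tap, accidental or not.
    ctr = PURPOSEFUL_TAP_RATE + accidental_tap_rate
    # Only purposeful taps can lead to a secondary action.
    actions = int(IMPRESSIONS * PURPOSEFUL_TAP_RATE * SECONDARY_ACTION_RATE)
    return ctr, actions

# Less intrusive format: few accidental taps.
low_ctr, low_actions = campaign(accidental_tap_rate=0.002)
# More intrusive format: its higher CTR comes entirely from accidental taps.
high_ctr, high_actions = campaign(accidental_tap_rate=0.012)

print(f"Less intrusive: CTR {low_ctr:.1%}, secondary actions {low_actions}")
print(f"More intrusive: CTR {high_ctr:.1%}, secondary actions {high_actions}")
# Both formats yield 2,000 secondary actions per million impressions,
# so if the intrusive format's actions come in *lower* than that, something
# beyond fat fingers -- annoyance -- must be suppressing purposeful taps.
```

Under these assumptions both formats produce identical secondary actions per impression, so the observed drop in SAR and SVL for high-CTR campaigns points to suppressed purposeful engagement rather than measurement noise.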
Additional evidence of this phenomenon can be found in a recent experiment reported by Google. The mobile page for Google+ included a full-page interstitial ad encouraging visitors to download the company’s native app. While the interstitial had a high CTR of 9%, Google noticed that site abandonment rates were also high. Concerned about user experience, the company replaced the interstitial with a smaller banner ad. The results were surprising: app installs were largely unaffected, while the number of active users on the website increased by 17%.
Once again, we see that CTR is not a good measure of consumer engagement or satisfaction. In fact, both of these studies provide quantitative support for the intuition that annoying ads can actually backfire.
More to the point, publishers should pay particular heed to the Google+ study: The cost of the more annoying ads is borne by the site hosting the ad, much more so than by the advertiser.
How long will it take before the industry as a whole notices this elephant in the room?