Commentary

All Bots Aren't Created Equal: Battling The Good, The Bad And The Ugly

At the IAB Leadership Forum, I witnessed something pretty shocking — advertisers are mad. They are grab-the-microphone, fist-shaking mad. And they have their sights set on fixing one big problem: traffic fraud in digital media.

Traffic fraud is a complex topic, but in the IAB conversations I saw marketers focusing their anger on one specific subject: paying for “bots,” the automated impression generators that visit a Web site and artificially drive up traffic volume. Many in the room described how quickly this problem has grown, to the point that even their Fortune 500 CEOs are now involved, and said it is actively suppressing digital media investment.

Marketers vehemently demanded that something be done now, which prompted me to think about the work that has already been done to improve the authenticity of digital media, and to consider what more must be done to make sure traffic is high-quality, safe and real.

The idea of “good” and “bad” traffic is interwoven with how, and from whom, media is sourced. Because our transparency tools are still evolving, it’s difficult for marketers to see the progress that has been made and to trust their investments.

All Bots Are Not Created Equal

I spoke to a number of publishers during the event, but one conversation in particular was an “a-ha” moment for me. Not all bots are designed to inflate traffic. Bots can be used to harvest content, scrape data, capture analytics or perform countless other site-automation tasks. In fact, the more popular the site or the more premium the publisher, the more likely it is that bots are visiting for these “good” reasons. We live in an automated environment, and automation by definition means bots. If history has taught us anything, it’s that there will be more automation. But not all of these bots are intended to defraud marketers.
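As a rough illustration of that distinction, consider how a publisher or measurement vendor might separate self-declared “good” bots, the crawlers that identify themselves in the user-agent string, from everything else. This is a minimal sketch in Python; the pattern list is a hypothetical stand-in for the industry spiders-and-bots lists, and real classification relies on far richer signals.

```python
# Minimal sketch: separating self-identified "good" bots from unknown
# automated traffic by user-agent string. The GOOD_BOT_PATTERNS entries
# are illustrative examples, not an official spiders-and-bots list.
import re

GOOD_BOT_PATTERNS = [
    r"Googlebot",            # search-engine indexing
    r"bingbot",              # search-engine indexing
    r"facebookexternalhit",  # link-preview fetching
]

def classify_user_agent(user_agent: str) -> str:
    """Return 'good_bot' for declared crawlers, 'unknown' otherwise."""
    for pattern in GOOD_BOT_PATTERNS:
        if re.search(pattern, user_agent, re.IGNORECASE):
            return "good_bot"
    # "unknown" may be a human visitor or an undeclared (possibly bad) bot.
    return "unknown"

print(classify_user_agent("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # good_bot
print(classify_user_agent("Mozilla/5.0 (Windows NT 6.1; rv:35.0)"))    # unknown
```

Of course, fraudulent bots rarely declare themselves, which is why user-agent filtering only handles the honest half of the problem.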

Premium Publishers Are Penalized Too

It’s not just the advertisers who suffer because of how bot-detection tools work. Even premium publishers can get caught up in buyer “clean sweeps.” Once a publisher is blacklisted in a buying platform, it will never receive money from that platform again. In addition to being bad for the publisher, this is a double whammy for buyers, who first pay for bot traffic and then lose high-quality media experiences because of the blacklisting.

Impression-Level Analysis Is the Key

Because this problem affects both programmatic media buys AND direct, non-programmatic media buys, we need to get tactical about evaluating the quality of every impression at delivery time. Evaluating media quality at that scale requires technology; this kind of policing can’t be done by hand. Manual blacklisting can take entire good sites out of the media mix for bad reasons. But simply determining whether an impression is a bot is still not the right approach. There is a perspective that is even more fundamental.
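To make the contrast concrete, here is a minimal sketch of impression-level evaluation as opposed to domain-level blacklisting. Everything in it is hypothetical: the signals, weights and threshold are invented for illustration, not drawn from any real detection product.

```python
# Minimal sketch: score each impression on its own evidence instead of
# blacklisting whole domains. All signals, weights and the threshold
# are hypothetical, chosen only to illustrate the approach.
from dataclasses import dataclass

@dataclass
class Impression:
    domain: str
    user_agent: str
    pointer_activity: bool  # behavioral signal: any mouse/touch events?
    datacenter_ip: bool     # network signal: IP in a hosting provider's range?

def suspicion_score(imp: Impression) -> float:
    """Return a 0..1 score estimating how bot-like one impression is."""
    score = 0.0
    if imp.datacenter_ip:
        score += 0.5  # real consumers rarely browse from datacenters
    if not imp.pointer_activity:
        score += 0.3  # total absence of interaction is a weak bot signal
    if "headless" in imp.user_agent.lower():
        score += 0.4  # self-declared headless browsers
    return min(score, 1.0)

def should_serve(imp: Impression, threshold: float = 0.5) -> bool:
    # The decision is per impression: one bot visit on a premium site
    # does not take the whole domain out of the media mix.
    return suspicion_score(imp) < threshold
```

The point is the granularity, not the particular signals: a good site with occasional bot visits keeps earning on its human impressions, while the bot impressions themselves go unmonetized.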

Focus on the Human                 

Whether your buys are done programmatically or directly, as an industry we certainly need to eliminate bots as best we can. More importantly, though, we need to begin talking about how we serve ads to real people. Services and technologies need to help us authenticate that an impression equals a real-life person, so marketers can buy media without fear. There will always be more technologies and more bots, and trying to optimize them out is almost futile; you can’t just “whack-a-mole” your way to better-quality traffic. Yes, it can be complex and costly to focus on delivering to real humans, but what’s even more costly is the wasted media spend that never turns into sales. It’s time for media to focus not only on accountability but also on ensuring that ads on real Web sites are delivered to real consumers who drive real purchases.

2 comments about "All Bots Aren't Created Equal: Battling The Good, The Bad And The Ugly".
  1. Maren Woodlock from Noble Advertising, February 24, 2015 at 4:53 p.m.

    Bot traffic, sites that aren't real (just pages that someone created), clicks without impressions, multiple banners for a client on a single Web page: these are all things we are seeing in programmatic. I believe the DSP providers have the responsibility to ensure that the traffic they deliver through their platforms from the exchanges is quality, yet our clients have to pay extra for brand safety, and they count four banners for the same client on a single Web page as four impressions. It's truly infuriating how little responsibility the trade desks take for making sure they are delivering the quality inventory that clients expect.

  2. Keith Pieper from IMM, February 25, 2015 at 9:19 a.m.

    Blacklisting doesn't work in today's ecosystem, where technology allows impression-level control. Why use a chainsaw when you can simply use a hand pruner? This is especially true as so many URLs are masked or fraudulent: the same "content" appears on one domain after another as each is shut down.
