There is a school of thought, though, that says that non-human Web traffic actually isn’t that big of a deal. After all, a certain percentage of traffic on any site being robots is probably already baked into the price. But true or not, this isn’t a popular argument, so while it’s worth mentioning, it’s not worth pursuing.
The spotlight and blame for the phantom-traffic issue seem to be focused in a single direction these days: online publishers.
(Of course, it should be noted that “online publisher” is a pretty loose term, covering a broad spectrum that ranges from highly legitimate media outlets providing real value — outlets that may themselves be victimized by robot traffic — to a couple of guys in a basement in Ukraine running bogus .net sites, and every other site in between.)
But blaming publishers alone is a bit of a copout, and it won’t solve the problem. Like trying to stop the sale of drugs by only going after the dealers, the bottom line is that if there’s a demand, there’s always going to be a supply. To effectively stamp out non-human traffic, it can’t just be up to the publisher; marketers have to play their part, too.
The large majority of marketing directors demand that a site reach a certain (often unrealistic) threshold of visitors before they’ll consider advertising. In my own experience, I’ve had marketing directors and buyers from brands and agencies tell me to contact them again when a site has a million uniques, or three million, or five million. Why? They’ll buy ads on cable TV shows with far fewer viewers than that, so why the insistence on ridiculously large numbers when it comes to the Web?
And if we stripped away every last non-human visit, how many sites with more than a couple of million unique human visitors, with room for display ads, and with the ability to showcase videos to support pre-rolls would truly be left? Maybe a dozen or so? Certainly not enough to handle the demand. Unfortunately, because of this dynamic, smaller sites with a solid foundation of traffic and quality content become incentivized to fill the gap through lower-quality traffic sources, some of which may unwittingly end up sending non-human visitors.
I’m not blaming marketers for publishers behaving badly. I am saying that marketers need to change their mindset about what they value in online media partners if we want to fix this problem.
To move forward and create an online environment that’s profitable for both advertisers and publishers, marketers need to abandon their obsession with unproven success metrics like raw audience size and click-through rates. Whatever happened to demographics and branding? Since when did the only audience that mattered become a large one, and in whose mind does clicking the moment an ad appears qualify as the only measure of a publisher’s relevance?
There’s value in smaller audiences, especially when the source of that audience is highly controlled and transparent, and the content is targeted and meaningful to a given marketer.
A small community of, say, half a million human visitors on a highly targeted site, with more unique ad opportunities available (article sponsorship, branded content, product placement within videos, etc.), could be far more valuable than a demographically broad site that hosts millions of uniques — some large percentage of whom may not even be human — with ad opportunities built solely in the hopes of earning a 1% click-through rate.
Publishers, to be sure, need to be more careful about how they source their traffic and provide true value to marketers. But marketers, in turn, need to recognize that true value doesn’t come only in the form of a large audience.