As the world turns its eyes to Davos, Switzerland this week, a group of advertisers and agency executives is pushing to leverage its economic clout to create a “more sustainable
and responsible” digital media marketplace that “protects consumers, the media industry and society.”
The initiative is being organized by the Global Alliance for
Responsible Media (GARM), which was founded in June 2019 by the World Federation of Advertisers and represents $97 billion spent by six agency holding companies on behalf of 39 advertisers.
Interestingly, it also includes big media companies such as Facebook, Twitter and Google’s YouTube, which some critics blame for contributing to an irresponsible digital media
environment that enables questionable content and misinformation to proliferate.
However, the group also cited estimates that “620 million pieces of harmful content”
were removed by YouTube, Facebook and Instagram between July and September 2019.
“Because of the platforms’ investments in teams and tools, the majority of this content
was removed before consumers actually saw it,” GARM asserted, adding, “However, approximately 9.2 million pieces of harmful content still reached consumers during that three-month
period, equating to roughly one piece of harmful content viewed per second.”
The GARM announcement goes on to say it is taking a “collaborative approach,” working
with digital media platforms to improve their environments “with the goal of eliminating harmful online content and ensuring that bad actors have no access to advertiser
funding.”
The alliance unveiled a three-pronged plan:
Shared definitions: The Alliance has developed
and will adopt common definitions to ensure that the advertising industry is categorizing harmful content in the same way. The 11 key definitions covering areas such as explicit content, drugs, spam
and terrorism will give platforms, agencies and advertisers a shared understanding of what constitutes harmful content and how to protect vulnerable audiences, such as children. Establishing these
standards is the first step needed to stop harmful content from being monetized through advertising.
Common tools and systems:
The Alliance will develop and adopt common tools that will create better links across advertiser controls, media agencies’ tools, and the platforms’ efforts to categorize content. Creating these linkages
will improve transparency and accuracy in how media investments are steered towards safer consumer experiences – in images, videos and editorial comments.
Independent oversight: The Alliance will establish shared measurement standards so that the industry and platforms can fairly assess their ability to block,
demonetize and take down harmful content. Transparency via common measures and methodology for advertisers, agencies and platforms is key to guiding actions that enhance consumer safety. Adopting
key measures and agreeing to independent verification will be critical to driving improvement for all parties, which the Alliance will look to track annually. A special working group from GARM will begin activating
this strategy in April.
“It’s time to create a responsible media supply chain that is built for the year 2030, one that operates in a way
that is safe, efficient, transparent, accountable, and properly moderated for everyone involved, especially for the consumers we serve,” stated Procter & Gamble Chief Brand Officer and GARM
member Marc Pritchard, adding, “With all the great minds in our industry coming together in partnership with The Global Alliance for Responsible Media, we can and should avoid the pitfalls of
the past and chart a course for a responsible future.”