At a time when the advertising and media industry seems transfixed -- and maybe a little torn -- by the rapid acceleration of artificial intelligence, we’re also starting to see
some accountability put in place.
On Thursday, NewsGuard, the news and information reliability rating service, announced it has launched an auditing service for AI providers, including the
new generation of chatbots and the generative AI services being baked into news, information and marketing tools, spanning both search and publishing content.
The new
auditing service comes a little more than a week after NewsGuard released findings of a tracking study analyzing the propensity of OpenAI’s chatbots to output false news and information
narratives in response to queries.
The study found that its latest version (ChatGPT-4) advanced 100% of 100 false narratives benchmarked by NewsGuard -- up from 80% when it analyzed an earlier
version (ChatGPT-3.5) in January.
While machine learning, AI and the use of increasingly sophisticated algorithms to organize, parse and spread information are not new, their power and
sophistication are advancing at rates that may be greater than many humans can detect.
One of the big problems is not just that hostile actors might use the technology for
nefarious, disruptive purposes, but that the technology itself doesn’t necessarily have the same safeguards about truth and reality that people do -- including the people who are training
it.
Generative AI is great at iterating possible answers to queries and questions, but its output needs to be vetted -- at least for now -- by people for veracity, because as
well trained as the technology currently is, it’s simply synthesizing whatever information is available to it. And sometimes that information is erroneous, or synthesized in ways that aren’t
real.
“The good news about ChatGPT is that it will always give you an answer,” begins a joke AI experts like to tell about it. “The bad news is it will always give
you an answer -- even if one doesn’t exist.”
At a time when the advertising and media industry is tackling a wide array of ethical issues about the use of media --
everything from diversity, equity and inclusion to the role media plays in climate change -- it’s probably a good time for the industry to begin adopting standards, protocols and means of
evaluating the role AI plays in the information health of the ecosystem.
NewsGuard is a logical player to step forward, so good luck to them. Despite some disinformation --
ironically -- being circulated about them, they are good people and serious media-industry professionals with a strong track record for quality journalism.
Full disclosure, I worked
for one of its co-founders -- Steve Brill -- back in his Brill Media and Inside.com days, so I may be a little biased, but I admired his strong ethical pursuit of the truth long before and long
after he was my boss.
“For all the extraordinary promise of generative AI, it also presents a great threat to trust in information,” Brill said in a statement announcing the
new AI auditing service Thursday, adding: “The early launches of these services often respond to prompts about topics in the news with well-written, persuasive, and entirely false accounts of
the news. This could become a force multiplier for those wishing to spread harmful conspiracy theories, healthcare hoaxes, and Russian disinformation at unmatched scale.”
Other organizations -- most notably the Media Rating Council (MRC) -- have been developing media-industry standards for evaluating and measuring the validity of AI, though those will likely
relate specifically to how AI impacts audience measurement and ratings services, including whether those services accurately represent audiences, or whether they have machine biases baked in based on the
way their AI was trained.
Maybe it’s time for other industry authorities -- yes, I’m alluding to the Association of National Advertisers, the World Federation of
Advertisers, or maybe even the Institute for Advertising Ethics -- to step up and set some baseline standards as well.
All the pressing issues confronting the advertising and media
industry have one common ingredient necessary to advance their causes: the truth.
Without that, we’re all just crawling down rabbit holes.