As watchdogs, regulators and pundits debate the impact artificial intelligence will have on political campaigns -- and outcomes -- I would suggest they shift from the future tense to what's already going on. I mean, it's right there, albeit in fine print many human eyes might otherwise miss. Squint and look very closely and you'll see it.
"Built entirely by AI imagery," reads the minuscule disclosure on a "Beat Biden" YouTube video the Republican National Committee began circulating a couple of months ago. It may not be the first ad the RNC has fabricated using humans and/or machines, but it is the first to carry that explicit disclaimer. It likely will not be the last, as AI-generated fabrications, even outright deepfakes, become part of the new state of the political media spin art.
"The idea that technology can be used to spread lies and untruths is not new," Bill Gates writes in a "risks of AI" post on his GatesNotes blog this week, noting, "People have been doing it with books and leaflets for centuries."
He goes on to point out that the practice has grown easier with increasing advances in media technology -- from word processors and laser printers to email and social media.
While that's true, something has changed in recent years and will change even more as we approach a new epoch of AI-generated media: speed.
Yes, earlier microprocessing technology steadily shortened the time it takes to fabricate content, but until recently, that fabrication still happened at a relatively human speed. Now it is happening faster than humans can create content themselves, or more to the point, faster than they can vet its veracity.
That's more than just a paradox for democratic processes that were created hundreds of years ago, when print media still needed to be published and distributed manually in order to influence how others think, feel and vote.
And given how slowly our legislative processes work, I don't anticipate any material changes governing the use of AI-generated media anytime soon. On the bright side, Big Tech appears to be accelerating its own efforts to self-regulate. Well, at least Google is.
“That’s why we’re rolling out tools like ‘About this image’ in our search, which provides important information such as when and where similar images may have first appeared," Google Senior Vice President-Research, Technology & Society James Manyika explained during his recent keynote at the Cannes Lions festival.
I think it's unlikely that most people -- certainly most voters -- will notice, much less click on, Google's metadata, any more than the small percentage of people who will notice the fine print on the GOP's fake ad, but at least it's a start.
The next phase, Manyika said, will be to begin digitally watermarking all AI-generated content and embedding indelible metadata identifying its source. Even if people don't notice that metadata, new machine tools will be developed that can, enabling people and platforms to filter such content accordingly.
And in May, the American Association of Political Consultants issued a statement condemning the use of deceptive generative AI content in political campaigns, though presumably the ones who work for the GOP weren't among those endorsing it.
“AAPC is proud of and committed to upholding our code of ethics and believed we needed to address this burgeoning technology and make it clear to our members that its use is a blatant violation of our existing code of ethics," AAPC President R. Rebecca Donatelli said about the release of the statement.
While some legislative solutions have already gathered steam (the Senate's Deepfake Report Act of 2019), and President Biden has proposed an "AI Bill of Rights" to safeguard Americans against new and future abuses, we're still operating at human, not machine, speed.
"More than ever before, political actors can generate curated content to persuade voters," analysts at political ad tracking firm AdImpact write in a timely new report on the use of AI in political advertising. "However, these innovations harbor great risk to mislead and manipulate the American public."
On that note, I'd like to propose yet another new media industry acronym we should all be against: AIGD.
It stands for AI-generated democracy, and it's also an oxymoron.