Somewhere between the Arab Spring and the Jan. 6 insurrection at the U.S. Capitol, digital media technology morphed from an agent of democracy into one of autocracy. The truth is, it was probably always somewhere in the middle, depending on who was leveraging it and how skillfully they did so, but there's a growing belief that the next generation of AI-empowered digital media technology will fall squarely on the undemocratic side.
“Contrary to conventional wisdom, AI can seriously undermine autocratic regimes by reinforcing their own ideologies and fantasies at the expense of a finer understanding of the real world,” reads an important soon-to-be-published piece in Foreign Affairs magazine.
The article, “Spirals of Delusion: How AI Distorts Decision-Making and Makes Dictators More Dangerous,” was co-written by three renowned academics – Johns Hopkins University’s Henry Farrell, Georgetown University’s Abraham Newman, and Cornell University’s Jeremy Wallace – and argues that the real challenge for democratic nations isn’t winning the race for AI technological dominance, but countering authoritarian societies that will increasingly be “in the throes of an AI-fueled spiral of delusion.”
Personally, I think there are potentially even bigger unintended consequences of AI that we have yet to contend with, but the acceleration of autocratic ideologies seems like a pretty good place to start.
You may recall a couple of “Red, White & Blog” columns I’ve written correlating the rise of digital media access – principally the internet – with the decline of the share of the world’s population living in democracies. According to the authors of the Foreign Affairs piece, AI will put that trend on steroids.
Instead of fueling democracy, they argue, “machine learning” will be an answer to the prayers of autocrats, giving them superpowers to understand “whether their subjects like what they are doing without the hassle of surveys or the political risks of open debates and elections.”
The powerful machine learning algorithms of social media platforms, arguably, have already been contributing to the decay of democracies by sowing discord and polarizing the beliefs – even the senses of reality – of their citizens. And the next generation of AI can only exacerbate that.
“The challenges to democracies such as the United States are all too visible,” the authors write, noting, “Machine learning may increase polarization – reengineering the online world to promote political division. It will certainly increase disinformation in the future, generating convincing fake speech at scale. The challenges to autocracies are more subtle but possibly more corrosive. Just as machine learning reflects and reinforces the divisions of democracy, it may confound autocracies, creating a false appearance of consensus and concealing underlying societal fissures until it is too late.”
The authors go on to warn of “weaponized AI,” a phrase that has already been applied to platforms like the defunct Cambridge Analytica, as well as the far more successful Palantir Technologies.
What the authors don’t do, unfortunately, is offer an explicit solution to the problem. It’s more of a cautionary tale about an inevitable future.