He'll answer that question at the end of this column, but first we'd like to address the elephant in the room. No, not the mascot of the Republican Party, or even the conspiracy theories being propagated by its most extremist members -- or, worse, by anonymous sources who have been spreading some of the most potentially dangerous theories for political gain.
Turns out it might be all of us, according to some fascinating new research released this week by the renowned consumer researchers at Ipsos MORI.
The study, which may be the first to attempt to segment and analyze the demographic composition of conspiracy theorists (see the top line above), also reaches a startling conclusion: conspiracy theorizing isn't necessarily a binary, believe/disbelieve proposition. It falls on a spectrum, which means every one of us lands in one of those buckets -- especially if you ask people enough questions about enough theories over time.
“One of the most interesting points that I think we’ve found is that belief about conspiracy theories is perhaps more nuanced than perhaps it first looks," Ipsos MORI head of behavioral science Colin Strong said during a briefing on the study's findings. The firm took the research on not for any client, but as a public service to educate society about the growing phenomenon of conspiracy theories -- especially the ones that have been destabilizing confidence in key institutions and public authorities, and in some cases leading to dangerous behaviors that "can kill people."
Strong and a panel of the aforementioned authorities organized by Ipsos MORI concluded that conspiracy thinking is an ingrained part of human nature, and that it occasionally has rational, even healthy foundations -- such as a reasonable dose of skepticism. Still, such thinking has been growing, and it has recently produced some dangerous results.
They also provided some top-line advice for brand marketers and government officials on how to respond to various conspiracy theories -- or, in some cases, whether to respond at all, since some theories will simply run their course and fade into the ether from which they came.
But more often than not, brands, governments, institutions and other stakeholders affected by conspiracy theories should "participate," because the downside risk of creating "information vacuums" that allow theories to fester outweighs the risk of amplifying them by giving them more exposure.
One of the most ingenious, proactive strategies for dealing with conspiracy theories involves getting out in front of them.
"People have suggested that rather than debunking, we should engage in 'pre-bunking,'" said academic historian Peter Knight. That means priming "people in advance about some of the kinds of misinformation in conspiracy theories so that when they do come across them, they come across them in the framework that these might be problematic. Particularly, if you also show that the people pushing these conspiracy theories have their own dubious political agendas.”
Another suggestion: because conspiracy theories -- and conspiracy theorists -- aren't necessarily binary, it's important to research the "content," the people spreading it, and the context in which it spreads, in order to develop a proactive strategy for contending with it.
The Ipsos MORI study, which was based on a recent poll of more than 4,000 British adults, may have some findings that are somewhat unique to U.K. culture, media and politics, but the overall takeaway -- that conspiracy thinking is nuanced and falls on a spectrum, and that all of us fall into some bucket if you look at it that way -- was supported by all of the experts. Of course, those experts are authorities, and some conspiracy theorists might question their agendas too.
What I found most lacking in the findings, the discussion, and especially the experts' recommendations was that they avoided addressing how media is used to, in effect, "weaponize" conspiracy theories -- and the role that rapidly evolving, fast-moving digital media (apps, social networks, and various "dark" pools of communication) have been playing in accelerating discord and destabilizing institutions and society at large.
Readers of this blog probably already know (by the way, this is Joe, not John-John, writing here) that I have long held some conspiracy theories of my own: that sophisticated bad actors -- both foreign and domestic -- have been leveraging media to amplify exactly those outcomes. I refer to them as theories, but some of them have already been proven, such as the role Russia's military intelligence units and the Internet Research Agency (IRA) played in spreading hoaxes and conspiracies in the lead-up to the 2016 U.S. presidential election. (MediaPost even gave the IRA an award as "Disruptor of the Year" for that achievement.)
Or, even more recently, that bad actors perpetuated conspiracy theories that contributed to the Jan. 6 insurrection, and that they continue to seed distrust in the legitimacy of the election results.
But I take some heart in the fact that the Ipsos MORI team at least helps us understand that these positions are nuanced and not necessarily absolute. Otherwise, I'd have to shake my head at polls showing that the majority of Republicans still believe in the theories that fueled the events of Jan. 6. Based on the Ipsos MORI findings, I think the truth is that most of them merely want to believe those theories, or are somewhat skeptical about the official explanations of certain outcomes. At least that's what I want to believe.
Again, the most blatant omission from the Ipsos MORI panel was the way the experts downplayed the role of media in accelerating some of the most potentially dangerous theories (see related news story on the study's findings about media and institutions).
The starkest example was political commentator John Rentoul's wholesale dismissal of the idea that factors like automated bot accounts have played a material role in amplifying dangerous theories.
"I think bots are a conspiracy theory," Rentoul, a columnist for the U.K.'s The Independent, asserted during the discussion, adding: "I don’t think there is a significant contribution in social media from automated accounts. I just don’t see any evidence for that.”
I guess Rentoul is not familiar with the numerous scientific studies documenting the presence and role of bot accounts on social media platforms such as Twitter and Facebook -- including the Pew Research Center's analysis of 2017 Twitter data, which found that two-thirds of tweeted links to popular websites were posted by suspected bot accounts, not people (see the video by computational social scientist Stefan Wojcik explaining the methodology below).
While Ipsos MORI did not go so far as to suggest what the Sparks & Honey team did -- that following conspiracies like QAnon's was a form of "entertainment" for some people, and that conspiracy theory promoters often use sophisticated gamification techniques to spread them -- they did reveal some fascinating insights about the emotions respondents said conspiracy theories elicit for them.
For some of the wackiest theories -- that COVID-19 vaccines implant microchips, that 5G spectrum causes COVID-19, or that the U.S. presidential election results were falsified -- one of the strongest emotions respondents reported was "amusement."
And while Ipsos MORI did not survey people about one of QAnon's most popular conspiracy theories -- that John F. Kennedy Jr., who died in a plane crash in 1999, would reemerge as a former U.S. president's running mate when that president is "reinstated" -- I have to believe that many of those supporting the theory are also in it for the laughs.
In other words, the joint byline on this column is just a joke. Please don't spread any theories about it.