Commentary

Good And Bad News About Black Swans

The following was previously published in an earlier edition of Media Insider.

First, the good news. According to a new study, we may be able to predict extreme catastrophic events like earthquakes, tsunamis, massive wildfires and pandemics through machine learning and neural networks.

The problem with these “black swan” events (events that are very rare but have extreme consequences) is that there isn’t much data we can use to predict them. Technically, these are “stochastic” events: they are random and, by definition, very difficult to forecast.

Until now. According to the study’s lead author, George Karniadakis, researchers may have found a way to give us a heads-up by using machine learning to make the most of the meager data we do have.

“The thrust is… to proactively look for events that will signify the rare events,” Karniadakis says. “We may not have many examples of the real event, but we may have those precursors. Through mathematics, we identify them, which together with real events will help us to train this data-hungry operator.”

This means the science could potentially save thousands -- or millions -- of lives.

But -- and now comes the bad news -- we have to listen to, and take action on, those warnings. And we have a horrible track record of doing that. Let’s take just one black swan: COVID-19. Remember that?

Justsecurity.org is an “online forum for the rigorous analysis of security, democracy, foreign policy, and rights.” In other words, it aims to minimize the impact of black swans. The forum put together a timeline of the U.S. response to the COVID-19 pandemic.

It’s a terrifying and maddening read. It was months before the U.S. federal government took substantive action against the pandemic, despite repeated alerts from healthcare officials and scientists. All the bells, whistles and sirens were screaming at full volume, but no one wanted to listen.

Why? Because there has been a systemic breakdown of what we call epistemic trust: trust in new information coming to us from what should be a trustworthy and relevant source.

I’ll look at this breakdown on two fronts: trust in government and trust in science. These two things should work together, but all too often they don’t. That was especially true in the Trump administration’s handling of the pandemic.

Let’s start with trust in government. According to a recent OECD study across 22 countries, on average only about half of citizens trust their government.

The U.S. wasn’t included in that study. But the Pew Research Center has been tracking trust in government since 1958, so let’s look at its findings instead.

The erosion of trust in the U.S. federal government started under Lyndon Johnson, with trust plummeting further during the Watergate scandal. Interestingly, although separated by ideology, Republicans and Democrats track similarly in their eroding trust from Nixon through George W. Bush, with the exception of the Reagan years. That bipartisanship started to break down with Obama and polarized even further under Trump and Biden. There have been periodic upticks since then, but the overall trend has still been toward lower trust.

Now, let’s look at trust in science. While the decline is not as drastic as the decline of trust in government, Pew found that trust in science has also fallen, especially in the last few years. The percentage of Americans who have no trust in science has almost doubled, from 12% in April 2020 to 22% in December 2021.

It’s not that the science got worse in those 20 months. It’s that we didn’t want to hear what the science was telling us.

The thing about epistemic trust -- our willingness to believe trustworthy information -- is that it varies depending on what mood we’re in. The higher our stress level, the less likely we are to accept good information at face value, especially if what it’s trying to tell us will only increase our level of stress.

Inputting new information that disrupts our system of beliefs is hard work under any circumstances. It taxes the brain. And if our brain is already overtaxed, it protects itself by locking the doors and windows that new information may sneak through and doubling down on our existing beliefs.

This is what psychologists call confirmation bias.

The only thing that might cause us to question our beliefs is a niggling doubt planted by information that doesn’t fit them. But we will go out of our way to find information that does conform to our beliefs, so we can ignore the information that doesn’t, no matter how trustworthy its source.

The explosion of misinformation on the internet has made it easier than ever to stick with our beliefs and willfully ignore information that threatens them.

The other issue in this systemic breakdown of trust may not always be the message -- it might be the messenger. If science is trying to warn us about a threatening black swan, that warning is generally going to be delivered in one of two ways: through a government official or through the media.

And that’s probably where we have our biggest problem. Again, according to research from Pew, Americans distrusted journalists almost as much as government: 60% of American adults had little to no trust in journalists, and a whopping 76% had little to no trust in elected officials.

To go back to my opening line, the good news is that science can warn us about black swan events and save lives. The bad news is that we have to pay attention to those warnings.

Otherwise, it’s just a boy calling “wolf.”
