It’s interesting to see how we humans assign trust.
Consider the following scenario. At any time, in any city in the world, you will put your life in the hands of a complete stranger, in an environment you have no control over, without a second thought. We do it every time we hail a cab. We know nothing about the driver or his safety record. We don’t know if he’s a good person or a psychopath. We place trust without any empirical reason to do so.
Yet a number of recent surveys indicate the majority of us don’t trust self-driving cars. A recent survey by AAA found that 71% of us would be afraid to ride in a fully self-driving vehicle.
I’m one of them. I’m not sure I could slam the door on a self-driven Uber and relax in the back seat while AI takes the wheel. Yet I pride myself on being a fairly rational person -- and there are plenty of rational reasons why self-driving cars should be far safer than the human-powered equivalents. Even the most skeptical measured comparisons call it a toss-up.
And that brings us to a key point: We don’t assign trust rationally. We do it emotionally -- and we have a tortured emotional relationship with technology.
The problem here is twofold. First, our trust mechanisms are built to work best when we’re face-to-face with the potential recipient of trust. Trust evolved to be a human-dependent process.
And that brings us to the second problem: Over the last thousand years or so, we have learned how to trust in institutions. But that type of trust is dissolving rapidly.
Author and academic Rachel Botsman has spent over a decade looking at how technology is transforming trust. In an interview with Fast Company, she unpacks this notion of imploding institutional trust: “Whether it’s banks, the media, government, churches . . . this institutional trust that is really important to society is disintegrating at an alarming rate. And so how do we trust people enough to get in a car with a total stranger and yet we don’t trust a banking executive?”
I think this transformation of trust has something to do with the decoupling phenomenon I wrote about last week. When we relied on vertically integrated supply chains, we had no choice but to trust the institutions that were the caretakers of those chains.
But now that our markets have flipped from the vertical to the horizontal, we are redefining our notions of trust. We are digitally connecting with strangers through sharing economy platforms like AirBnB and Uber -- and, in the process, we are finding new signals to indicate when we should trust and when we shouldn’t.
There is another unique aspect to our decision to trust. We tend to trust when it’s expedient to do so. Like so many parts of human behavior, trust is just one factor wrapped up in our ongoing risk vs reward calculations. Our emotions will push us to trust when it’s required to get what we want. The fewer the alternatives available to us, the more we tend to trust.
Our lack of trust in self-driving vehicles is a more visceral example. I don’t think anyone believes the creators of self-driving technology are out to off our species in a self-driven version of a Mad Max conspiracy. We just aren’t wired to trust machines with our lives. There is an innate human hubris that believes that when it comes to self-preservation, our fates are best left in our own hands.
Self-driving proponents believe that with time and exposure, these trust issues will be resolved. The trick to us trusting machines with our lives is to lull us into not thinking about it too much.
Millions of us do it every day when we board an airplane. The degree to which our airborne lives are dependent on technology was tragically revealed with the recent Boeing Max incidents. The fact is, if we had any idea how much our living to see tomorrow is dependent on technology, we would dissolve into a shuddering, panic-stricken mess. In this case, ignorance is indeed bliss.
But there are few times when we have to make the same conscious decision to put our lives in the metaphorical hands of a computer to the extent we do in a self-driven car. If we look at how we decide to trust, this is an environment strewn with psychological landmines.
Remember, we tend to trust when we have no options. And in this case, our alternative couldn’t be clearer. The steering wheel is right there, begging us to take over. It freaks us out when the car pulls away from the curb and we see the wheel start turning by itself. It’s small wonder that 71% of us are having some control issues.