“What,” I said to myself, “sorry-assed state is my life in that I need to depend on a little black electronic hockey puck to affirm my self-worth as a human being?”
I realize that the tone of the Amazon email likely had tongue at least partway implanted in cheek, but still, seriously -- WTF, Alexa? (Which, incidentally, Alexa also has covered. Pose that question and Alexa responds, “I’m always interested in feedback.”)
My next thought was, maybe I think this is a joke, but there are probably people out there who need this. Maybe their lives are dangling by a thread, and it’s Alexa’s soothing voice digitally pumping their tires that keeps them hanging on until tomorrow. And, if that’s true, should I be the one to scoff at them?
I dug a little further into the question: “Can we depend on technology for friendship, for understanding -- even for love?”
The answer, it turns out, is probably yes.
A few studies have shown that we will share more with a virtual therapist than a human one in a face-to-face setting. We feel heard without feeling judged.
In another study, patients ended up forming a strong relationship with a virtual nurse -- one marked by trust, rapport and a genuine sense of connection.
Yet another study found that robots can even build a stronger relationship with us by giving us a pat on the hand or a touch on the shoulder.
We are social animals and don’t do well when we lose that sociability. If we go too long without being touched, we experience something called “skin hunger” and start feeling stressed and depressed. Touch-capable robots are now being tested in seniors’ care facilities to help combat extreme loneliness.
In reading through these studies, I was amazed at how quickly respondents seemed to bond with their digital allies. We have highly evolved mechanisms that determine when and with whom we place trust. In many cases, these judgments are based on nonverbal cues: body language, micro-expressions, even how people smell. It surprised me that when our digital friends presented none of these, the bonds still developed. In fact, they often seemed deeper and stronger than their human equivalents.
Perhaps it’s the very lack of humanness that's the explanation. As in the case of the success of a virtual therapist, maybe these relationships work because we can leave the baggage of being human behind. Virtual assistants are there to serve us, not judge or threaten us. We let our guards down and are more willing to open up.
Also, I suspect that the building blocks of these relationships are put in place not by the rational, thinking part of our brains, but the emotional, feeling part. It’s been shown that self-affirmation works by activating the reward centers of our brain, the ventral striatum and ventromedial prefrontal cortex. These are not pragmatic, cautious parts of our cognitive machinery. As I’ve said before, they’re all gas and no brakes. We don’t think a friendship with a robot is weird because we don’t think about it at all, we just feel better. And that’s enough.
AI companionship seems a benign -- even beneficial -- use of technology, but what might the unintended consequences be? Are we opening ourselves up to potential dangers by depending on AI for our social contact, especially when the line blurs between for-profit motives and the affirmation we come to depend on?
In the therapeutic use cases of virtual relationships outlined so far, there is no for-profit motive. But Amazon, Apple, Facebook, Google and the other providers of consumer-directed AI companionship are definitely in it for the money.
Even more troubling, two of those companies -- Facebook and Google -- depend on advertising for their revenue. Much as this gang would love us to believe that they only have our best interests in mind, over $1.2 trillion in combined revenue says otherwise. I suspect they have put a carefully calculated price on digital friendship.
Perhaps it’s remembering that fact that raised those red flags when I got that email from Amazon. It sounded like it was coming from a friend -- and that’s exactly what worries me.