Commentary

The Promise And Peril Of AI Friends

"Did you know that the first Matrix was designed to be a perfect human world? Where none suffered, where everyone would be happy. It was a disaster. No one would accept the program. Entire crops were lost. Some believed we lacked the programming language to describe your perfect world. But I believe that, as a species, human beings define their reality through suffering and misery. The perfect world was a dream that your primitive cerebrum kept trying to wake up from. Which is why the Matrix was redesigned to this: the peak of your civilization." -- Agent Smith, The Matrix

Artificial friends, including ones with benefits, have been around a long time.

Almost 30 years ago, we had Tamagotchi. These digital pets-on-a-keychain “hatched” and needed constant care -- so much so that the Gen 1 and 2 versions could “die” in less than half a day if they weren’t constantly monitored. For obsessed children, when the artificial pets died, the grief was real.

Our non-alive companions don’t have to be digital. The movie "Lars and the Real Girl" tells the story of a socially awkward young man who forms a deep, non-sexual attachment to an anatomically correct sex doll named Bianca. He brings her to family dinner, neglecting to mention to his brother and sister-in-law that the “visitor” he “met” online is not a human being. Over in the real world, 9.7% of American men and 6.1% of American women own a sex doll.

As soon as AI became a thing, so too did AI companions. Replika launched in 2017, well before ChatGPT.

By January of last year, 10 million people were building relationships with their chatbots. You can specify what type of relationship -- like “friends,” “siblings” or “mentors” -- but, not surprisingly, 60% of users said they had a romantic relationship with their chatbot.

Lest ye be judging, you should know that a Stanford study found Replika beneficial for people with depression, and 3% of users reported that Replika played a crucial role in preventing suicide. Loneliness is as dangerous as smoking 15 cigarettes a day.

Companionship is not a nice-to-have -- it's essential.

So, yeah. Not gonna bash the artificial companions. And yet…

One of the perceived benefits of artificial friends can also be one of their biggest drawbacks -- specifically, that they might like you too much.

“Unconditional positive regard” is the technical term: that you are seen in a positive light, no matter what you do or say. That whatever we do is perfect. That we never need to change.

Like a sugar hit, unconditional positive regard feels great in the short term, but can be damaging in the long term. Negotiation, friction, argument and recovery are part of building healthy relationships. When we avoid conflict, we also miss out on conflict resolution.

Interpersonal interactions are friction-filled by nature. Each of us has different needs, different values, different life experiences, and different perspectives that need to be navigated for us to come together. Part of the joy of human relationships is in overcoming those frictions.

Working through disagreements requires a set of important human skills. Empathy. Compassion. A willingness to be wrong. A willingness to learn when it is constructive to put someone else's needs ahead of our own -- and when it is not.

These skills get learned over time by being put into practice. And the way they get put into practice is by smashing your hopes and dreams against someone else's.

AI companions bring solace -- real solace, important solace. They have their place -- a real one, an important one. But they do not help us get better at navigating real-world relationships. They might just make us worse at it.
