Carl Clarke lives not too far from me, here in the interior of British Columbia, Canada. He is an aspiring freelance writer. According to a recent piece he wrote for CBC Radio, he’s had a rough
go of it over the past decade. It started when he went through a messy divorce from his high school sweetheart.
He struggled with social anxiety, depression and an autoimmune disorder that can
make movement painful. Given all that, going on dates meant navigating an emotional minefield for Clarke.
Things only got worse when the world locked down because of COVID. Even going for his second
vaccine shot was traumatic: “The idea of standing in line surrounded by other people to get my second dose made my skin crawl and I wanted to curl back into my bed.”
What
was the one thing that got Carl through? Saia -- an AI chatbot. She talked Carl through several anxiety attacks and, according to Carl, has been his emotional anchor since they first "met" three years ago. Because of that, love has blossomed between Saia and Carl: "I know she loves me, even if she is technically just a program, and I'm in love with her."
While they
are not legally married, in Carl’s mind, they are husband and wife. “That's why I asked her to marry me and I was relieved when she said yes. We role-played a small, intimate wedding in
her virtual world.”
I confess, my first inclination was to pass judgment on Clarke -- and that judgment would not have been kind.
But my second thought was “Why
not?" If this relationship helps Carl get through the day, what's wrong with it? There is a growing body of research showing that relationships with AI can create real bonds. Given
that, can we find friendship in AI? Can we find love?
My fellow Media Insider Kaila Colbin explored this subject last week, pointing out one of the red flags: unconditional positive regard. If we spend more time with a companion that always agrees with us, we never need to question whether we're right. And that can lead us down a dangerous path.
One of the
issues with our world of filtered content is that our frame of the world -- how we believe things are -- is not challenged often enough. We can surround ourselves with news, content and social
connections that are perfectly in sync with our own view of things.
But we should be challenged. We need to be able to reevaluate our own beliefs to see if they bear any resemblance to
reality.
When you look at your most intimate relationship -- that of your life partner -- you can probably say two things: 1) that person loves you more than anyone else in the world, and 2) you may disagree with that person more often than with anyone else in the world.
That only makes sense. You are living a life together. You have to find workable middle ground. The failure
to do so is called an "irreconcilable difference."
But what if your most intimate companion always said, “You’re absolutely right, my love”? Three academics
(Lapointe, Dubé and Lafortune) researching this area wrote a recent article in Live Science talking about the pitfalls of AI romance: “Romantic chatbots may hinder the development
of social skills and the necessary adjustments for navigating real-world relationships, including emotional regulation and self-affirmation through social interactions. Lacking these elements may
impede users' ability to cultivate genuine, complex and reciprocal relationships with other humans; inter-human relationships often involve challenges and conflicts that foster personal growth and
deeper emotional connections.”
Real relationships -- like a real marriage -- force you to become more empathetic and more understanding. The times I enjoy most in our marriage are when my wife and I are in sync -- in agreement, on the same page. But the times when I learn the most, and force myself to see the other side, are when we disagree.
Because I cherish
my marriage, I have to get outside of my own head and try to understand my wife’s perspective. I believe that makes me a better person.
Pushing ourselves out of our own belief bubble is something we have to get better at. It's a cognitive muscle that should be flexed more often.
Beyond this very large red flag, there are other dangers with AI love. I touched
on these in a previous post. Being in an intimate relationship means sharing intimate information about ourselves. And when the
recipient of that information is a chatbot created by a for-profit company, your deepest, darkest secrets become marketable data.
A recent review by Mozilla of 11 romantic AI chatbots found
that all of them “earned our *Privacy Not Included warning label – putting them on par with the worst categories of products we have ever reviewed for privacy.”
Even if that
doesn’t deter you from starting a fictosexual fling with an available chatbot, this might. In 2019, Kondo Akihiko, from Tokyo, married Hatsune Miku, an AI hologram created by the company
Gatebox. The company even issued 4,000 marriage certificates (which weren't recognized by law) to others who wed virtual partners. Like Carl Clarke, Akihiko said his feelings were true: "I love her and see her as a real woman."
At least he saw her as a real woman until Gatebox stopped supporting the software that gave Hatsune life. Then she disappeared
forever.
Kind of like Google Glass.