Shift to no screens – An increasing number of connected devices will change the concept of what it means to be online.
Content tunnel vision – As the content we see is increasingly filtered based on our preferences, what does that do to our perception of what is real?
But while we’re talking about our imminent surrender to the machines, I feel there are some other themes that also merit some discussion. Let’s limit it to two today.
A New Definition of Connection and Community
A few weeks ago, I read a fascinating article by neuroendocrinologist and author Robert Sapolsky. In it, he posits that understanding Capgras Syndrome is the key to understanding the Facebook society.
The syndrome, first identified by French psychiatrist Joseph Capgras, is a disorder in which we can recognize the face of a person but can't retrieve any feeling of familiarity. Those afflicted can identify the face of a loved one but swear that it's actually an identical impostor.
Recognition of a person and retrieval of emotions attached to that person are handled by two different parts of the brain. When the connection is broken, Capgras Syndrome is the result.
This bifurcation of how we identify people is interesting. There is the yin and yang of cognition and emotion. The fusiform gyrus cognitively “parses” the face, and then the brain retrieves the emotions and memories that are associated with it. To a normally functioning brain, this process is seamless and connected, but because two different regions (or, in the case of emotion, a network of regions) are involved, they can neurologically evolve independently of each other. And in the age of Facebook, that could mean a significant shift in the way we recognize connections and create “cognitive communities.”
Sapolsky elaborates: "Through history, Capgras syndrome has been a cultural mirror of a dissociative mind, where thoughts of recognition and feelings of intimacy have been sundered. It is still that mirror. Today we think that what is false and artificial in the world around us is substantive and meaningful. It’s not that loved ones and friends are mistaken for simulations, but that simulations are mistaken for them."
As I said in a column a few months back, we are substituting surface cues for familiarity. We are rushing into intimacy without all the messy, time-consuming process of understanding and shared experience that generally accompanies it.
Brains do love to take shortcuts. They're not big on heavy lifting. Here's another example of that:
Free Will Is Replaced With an Algorithm
In an article in The Atlantic, Derek Thompson explored "The Post Human World" in a conversation with historian Yuval Harari, author of the best-seller "Sapiens." One of the topics the two discussed was the end of individualism.
Humans (or, at least, most humans) have believed that our decisions come from a mystical soul -- a transcendental something that lives above our base biology and is in control of our will. Wrapped up in this idea is the concept of our importance in the world as free-thinking agents.
Over the past few decades, there has been a growing realization that our notion of "free will" is just the result of a cascade of biochemical processes. There is nothing magical here; there is just a chain of synaptic switches being thrown. And that being the case, if a computer can process things faster than our brains, should we simply relegate our thinking to a machine?
In many ways, this is already happening. We trust Google Maps or our GPS device more than we trust our ability to find our own way. We trust Google Search more than our own memory. We’re on the verge of trusting our wearable fitness tracking devices more than our own body’s feedback.
In all these cases, our trust in tech is justified: these tools are right more often than we are. But looking at the larger issue of humans versus machines, there's a slippery slope we're already well down.
Harari speculates what might be at the bottom: "What really happens is that the self disintegrates. It’s not that you understand your true self better, but you come to realize there is no true self. There is just a complicated connection of biochemical connections, without a core. There is no authentic voice that lives inside you."
When I lie awake worrying about technology, these are the types of things that I think about. Here's the big question: Is humanity an outmoded model?
The fact is, we evolved to be successful in a certain environment. But here's the irony: we were so successful that we changed that environment into one where the tools we've created, not their creators, are the most successful adaptation. We may have made ourselves obsolete.
And that’s why really smart humans, like Bill Gates, Elon Musk and Stephen Hawking, are so worried about artificial intelligence.
"It would take off on its own, and redesign itself at an ever increasing rate," said Hawking in a recent interview with the BBC. "Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."Worried about a machine taking your job? That may be the least of your worries.