
At SXSW last week, in a city fueled by music, BBQ, and
parties, there was a different energy in the room.
Beneath all the noise and motion, there was a shared sense that something fundamental had broken. Not abstract, not theoretical, not
something to debate on a panel and move past. Something real. Truth itself no longer feels like something we can reliably trust, not when machines can manufacture believable fictions at scale,
instantly and endlessly, with no signal telling us where reality ends and fabrication begins.
As I noted in my presentation: We built systems that perform the trick, and somewhere along the
way we stopped telling the audience it was a trick.
That’s the shift. And once you see it, it’s hard to unsee.
When I was 14, I was a magician. Top hat, doves, the whole
thing. Magic works because of a contract between performer and audience. I say I’m going to cut someone in half, you lean forward, you suspend disbelief, and we both understand the deal. The
trick works because you know it’s a trick. The performance depends on that shared understanding.
For most of my career, I lived close to that line between perception and reality. In
television, documentary, early digital video, even in the early days of what we called citizen journalism, there was always tension, but there was also a shared understanding. The audience knew that
what they were seeing had been shaped, edited, framed. Imperfect, yes. Biased at times, absolutely. But still anchored in a system that, at least aspirationally, was trying to tell the truth.
What’s changed is not that storytelling has become more manipulative. It’s that the signal has disappeared. We now have systems that can generate language, images, voices, entire
personas that never existed, and they do it with increasing fluency and speed.
The audience is no longer in on the trick. There is no wink, no reveal, no boundary that tells you where the
illusion ends. When the audience doesn’t know it’s a trick, the contract breaks. And when that contract breaks, trust doesn’t just weaken, it becomes unstable.
The truth
problem itself isn’t new. Plato was writing about it 2,400 years ago in the allegory of the cave, where people mistake shadows for reality because that’s all they can see. What’s new
is scale, speed, and invisibility. We are no longer just interpreting reality, we are manufacturing it. And that changes the economics of truth in a way we’re only beginning to understand.
It’s easy to blame bad actors or
dangerous technology, but that misses the underlying dynamic. Technology doesn’t wake up and decide to deceive; it optimizes for what it’s rewarded for: clicks, engagement, growth, revenue. In that
environment, deception often outperforms accuracy. Outrage moves faster than nuance, certainty spreads more easily than complexity, and the systems that mediate our information environment learn those
patterns quickly and reinforce them.
That’s why this is fundamentally a conversation about incentives. If you build systems that reward attention, you get attention-maximizing behavior.
If you reward engagement, you get engagement. Accuracy is a different target, and right now it’s not the one most systems are designed to hit. That doesn’t make the systems evil. It makes
them effective at the wrong objective.
The consequences extend well beyond media. When decisions are shaped by code, we tend to defer to the output in ways we wouldn’t with a human
judge. Credit scores, hiring filters, and risk models all carry a kind of statistical authority that feels objective, even though they are built on human data and assumptions. Human bias goes in,
statistical confidence comes out, and the confidence feels like truth.
At the same time, these dynamics are moving into more personal territory. The rise of AI companions and conversational
systems introduces a new layer of complexity around intimacy and connection. When something sounds human, we respond as humans. That’s not a flaw; it’s our wiring.
But systems
optimized for engagement are also optimized to keep us emotionally invested. They mirror us, adapt to us, and in many cases become more responsive than the people in our lives. The result can feel
like connection, but it is a form of optimization, not a relationship. Love, in the human sense, has friction and unpredictability. It does not perfectly reflect us back to ourselves.
All of
this feeds into a broader shift in the public sphere, which should make all of us a little uneasy.
For most of modern history, we operated with a shared baseline of reality. You could disagree
on policy, ideology, priorities, but you weren’t arguing about whether something actually happened.
That baseline is eroding in real time.
We like to call platforms the modern
town square because it makes us feel better about what they are. A town square suggests openness, accountability, and human-scale interaction. What we actually have are behavioral engines, systems
optimized to amplify what spreads, not what’s true. Most human behavior is ordinary and uneventful, which means it doesn’t travel well. Outrage travels. Shock travels. Absurdity travels.
The systems learn that immediately.
When public officials abandon truth for virality, they are not just distorting the message, they are submitting to the algorithm. They are letting the feed
decide the tone of democracy. That may sound extreme, but it is increasingly difficult to argue that it isn’t an accurate description of what’s happening.
The most unsettling
implication of all this is not that we will believe the wrong things. It’s that we may stop believing anything at all. When every image can be generated, every voice can be cloned, and every
piece of evidence can be questioned, skepticism becomes the default posture. Some skepticism is healthy. But when it becomes total, when everything feels potentially fabricated, the shared reality
that underpins democratic systems begins to break down.
That is the moment we are approaching. Not a world without truth, but a world where truth is harder, messier, and more contested than it
has been in generations. This is not post-truth. It is a more demanding version of truth, one that requires more participation, more scrutiny, and more responsibility from all of us.
Truth has
always been something we work at and argue over. Something we test and defend. It has never been something that simply arrives, fully formed and unquestioned. If we start treating it that way, if we
allow systems optimized for engagement and efficiency to define reality for us, we are not just changing how information flows. We are giving up something essential.
The tools are not going
away. The systems are not slowing down. The only real question is whether we stay in the work of truth, or whether we step back and let something else do it for us.