I have never been accused of being a Luddite. Typically, I’m on the other end of the adoption curve -- one of the first to adopt a new technology. But when it comes to AI, I am stepping forward
gingerly.
My hesitancy notwithstanding, AI is here to stay. In my world, it is well past the tipping point, from something that exists solely in the domain of tech to a topic of
conversation for everyone, from butchers to bakers to candlestick makers. Everywhere I turn, I see those ubiquitous two letters: AI. That was especially true last week, with the turmoil around Sam Altman and the is-he-or-isn't-he-fired drama at OpenAI.
In 1991, Geoffrey Moore wrote the book “Crossing the Chasm,” which looked at how technologies are
adopted. Moore explained that it depends on the nature of the technology itself. If it’s a continuation of technology we understand, the adoption follows a fairly straightforward bell curve
through the general population.
But if it’s a disruptive technology -- one that we’re not familiar with -- then adoption plots itself out on an S-curve. The tipping point in
the middle of that curve, where it switches from being skinny to being fat, is what he called the “chasm.” Some technologies get stuck on the wrong side of the chasm, never to be adopted
by the majority of the market. Think Google Glass, for example.
There’s often a pattern to the adoption of disruptive technologies (and AI definitely fits this description). To
begin with, we find a way to adapt and use it for the things we’re already doing. But somewhere along the line, innovators grasp the full potential of the technology and apply it in completely
new ways, pushing capabilities forward exponentially.
It’s in that push forward where all the societal disruption occurs. Suddenly, all the unintended consequences make themselves
known.
This is exactly where we seem to be with AI. Most of us are using it to tweak the things we’ve always done. But the prescient among us are starting to look at what might be -- and
many of us are doing so with a furrowed brow. We’re worried -- and, I suspect, with good reason.
As one example, I’ve been thinking about AI and creativity. As someone who
has always dabbled in creative design, media production and writing, this has been top of mind for me. I have often tried to pry open the mystic box that is the creative process.
There are
many creative software developers who will tell you AI will be a game-changer when it comes to creating -- well, just about anything.
“Now and Then,” the final Beatles song,
was made possible by an AI program created by Peter Jackson’s team for the documentary “Get Back.” It allowed Paul McCartney, Ringo Starr and their team of producers (headed by
George Martin’s son Giles) to separate John Lennon’s vocals from the piano background on a demo tape from 1978.
One last Beatles song featuring John Lennon -- that should be a good
thing, right? I guess. But there’s a flip side to this.
Let’s take writing, for example. Ask anyone who has written something longer than a tweet or Instagram post. What you
start out intending to write is never what you end up with. Somehow, the process of writing takes its own twists and turns, usually surprising even the writer. Even these posts, which average only 700
to 800 words, usually end up going in unexpected directions by the time I place the final period.
Creativity is an iterative process and there are stages in that process. It takes time for it
all to play out. No matter how good my initial idea is, if I simply fed it into an AI black box and hit the “create” button, I don’t know if I’d be happy with the final outcome.
“But,” you protest, “what about AI taking the drudgery out of the creative process? What if you use it to clean up a photo, or remove background noise from an audio recording (a la the Beatles single)? That should free up more time and more options for you to be creative, right?”
That promise is certainly what’s being pitched by AI merchants
right now. And it makes sense. But it only makes sense at the skinny end of the adoption curve. That’s where we’re at right now, using AI as a new tool to do old jobs. If we think
that’s where we’re going to stay, I’m pretty sure we’re being naïve.
I believe creativity needs some sweat. It benefits from a timeline that allows for thinking,
and rethinking, over and over again.
Now, I may be wrong. Perhaps AI’s ability to instantly produce hundreds of variations of an idea will prove the proponents right. It may unleash more
creativity than ever.
But I still believe we will lose an essential human element of the process -- one critical to the act of creation.
Time will tell. And I suspect it won’t take very
long.