
AI has put an end to
emotions in marketing, we are told. Just look at what LLMs rely on in making brand recommendations. It’s all about facts. Not about emotions.
An analysis by Digital Bloom found that comparative listicles are far and away the most-cited content format by LLMs.
How-to guides and FAQs were frequently cited as well. Omniscient found that for branded prompts the bulk of LLM citations
come from editorial sites, online forums, review sites and directories.
In other words, AI looks for facts, whether scientific, practical, asserted, discussion-based, evaluative or comparative. AI wants data related to performance and price. AI recommendations are rooted in those facts, not emotions.
There is a
longstanding debate in marketing about heads versus hearts, or thinking versus feeling, as it is often described. Which is to say, facts versus emotions.
The importance of emotions had been on
the comeback trail until AI exploded onto the scene in November 2022. Now, facts are ticking upward in importance as marketers scramble to ensure their brands are fully represented factually in online
knowledge graphs and social forums.
With more and more consumers relying on AI answer engines for buying recommendations, many prognosticators have proclaimed the ascendancy of facts over emotions as the future of marketing. Meaning structured information instead of creative content. No more emotions.
Maybe.
But then again, maybe not.
Giving up on emotions just because the latest technology is not a good fit with them is letting the tail wag the dog. It is surrendering the message to the medium, to paraphrase Marshall McLuhan, and the paraphrase is fair, because that is exactly what McLuhan meant.
Different media are experienced and processed in different ways, to the point that the medium is often itself the
message taken away by the audience. McLuhan’s focus was linear print culture versus oral broadcast culture—print versus TV, or hot versus cool media, in McLuhan’s words.
AI is a new sort of media experience in which certain kinds of content work and others do not, and in which people engage interactively with chatbots that mimic the language and style of human interlocutors. The fear is that this leads inevitably to decisions made only on the basis of facts, the structured information AI relies upon.
This fear is reinforced by the
intensity with which people have quickly become attached to chatbots. An MIT Media Lab study estimates that one in five American adults have had an intimate encounter with a chatbot. The
top use case for generative AI in 2025, according to Filtered.com, was therapy and
companionship.
With such strong relationships to chatbots, consumers will rely significantly, if not wholly, upon what chatbots have to say, which means upon the facts LLMs surface.
People find chatbots very persuasive. Research has found that chatbots can talk people out of their political opinions—even belief in conspiracy theories—in as little as 9 minutes. Chatbots are more convincing than ads, influencers and
storekeepers, using only facts and no emotions.
But the conclusion that AI is all facts and no emotions is belied by the emotional connection between humans and chatbots. If this
strikes you as peculiar, it is no more peculiar than brand love or brand superfans or brand evangelists, all of which are concepts about intimacy and passion between humans and commercial entities.
Emotions are always present.
It is not that emotions have been lost with the rise of AI. It is that emotions have been displaced or shifted from brands to chatbots. The emotional
connections that tie people to the marketplace no longer go just through brands. They now go through chatbots, too, and maybe only chatbots in the near future. But there are still emotions.
The biggest risk for brands is not the loss of emotions to facts, but the loss of emotional connections to chatbots. This risk will grow as AI evolves from shopping assistants to shopping
agents.
As long as humans are making the final decision about what to buy, emotions will always be in the mix. Emotions will be lost to the process only when AI takes over
decision-making. That won’t happen as long as AI provides only recommendations or assistance. However, it could very well happen when AI matures into self-directed agents that take charge of all
decisions. No people, no emotions.
But this scenario presumes that emotions, and the emotional benefits people get from brands, are lost because they are not part of the information
used by AI agents to compare and contrast brands. The further assumption implicit in this is that emotions are too sentimental and inexact to be represented as structured information for LLMs.
That assumption underestimates marketing modelers. The obstacle is simply that we don't yet know how to code emotions into knowledge graphs. We will figure that out, guaranteed.
I
feel confident saying this because we have figured it out before. My friend and colleague Josh McQueen figured it out with an emotional lexicon he developed for testing ads when he ran research
worldwide for Leo Burnett.
My mentor and boss Kevin Clancy figured it out with the "Wheel of Emotions" he compiled from various social psychology sources to use in testing brand
positionings.
Russ Haley, originator of attitudinal segmentation and popularizer of the five-point purchase interest scale, spent the last third of his career at the University of New
Hampshire developing ways of measuring the intangible (often emotional) elements of ads that make them work.
Figuring this out for AI is only a matter of time. And given the speed at
which AI is evolving, it won’t take long.
LLMs are channeling emotional information already. LLMs rely heavily on discussion forums, Reddit and Quora especially. These are not
emotionless online forums. All kinds of emotions can be found in online discussions. Negative emotions have gotten a lot of attention, but the full range of emotions shows up in the arguments and conversations people have online about every topic under the sun, brands included.
Emotions are thus part and parcel of what LLMs scan and learn. It is inaccurate to claim that facts have displaced emotions. Many of the facts drawn from online forums are emotionally laden and shape the nature and direction of the overall online discussion. To the extent that these facts form part of the corpus LLMs scan and learn from, emotions have an impact.
Not to mention that the AI future is likely to see a revival of emotionally driven advertising and positioning.
Today, marketers are investing heavily to ensure their brands are part of the AI evaluation and recommendation loop. Once this initial surge of innovation and updating is complete, though, marketers will face AI feedback loops that are hard to break into, and they will need new ways of breaking in.
I predict a renaissance of
traditional media as marketers look to influence how people interact with AI. That won’t come from AI personalization loops. It will come from TV or billboards or live events or other non-AI
connections outside the loops. The desired behavior will be different. Not consideration or buying; rather, telling AI to do something different or to focus on a particular brand. It’s back to
the future.
However the future of AI unfolds, emotions will be part of it.
Emotions are still in the picture, forcing brands to compete for consumer passions with a new set of
chatbot competitors. And emotions will be a big part of tomorrow as brands lean harder into every kind of consumer connection to sustain relationships in a new technological ecosystem. Which, of
course, is what brands have done every time a new medium has come along. It has never been either/or with heads and hearts, and it won't be with AI.