
How Emotion AI Is Being Used in Marketing


Within a short time, artificial intelligence has become fairly commonplace. If you call a large business’s help line, a system on the other end analyzes your query and determines where to route your call. Many of our cars can detect if we’re tired, help us park, and warn us of dangers of which we weren’t previously aware.

Likewise, AI is increasingly used in marketing to analyze consumer behavior. Emotion AI is a newer twist: it reads consumers’ emotional states and uses those readings to tailor a response to each individual.

A recent report by Gartner outlines the way AI has permeated marketing and how Gartner expects it to continue to evolve over the next decade. Gartner defines Emotion AI as “using AI technologies to analyze the emotional state of a user via computer vision, audio/voice input, sensors and/or software logic.”

As an example, Gartner highlighted Jeep’s use of Affectiva’s Media Analytics capabilities to measure viewers’ emotional responses to its Electrified Wrangler 4XE ad. Gartner claimed Affectiva was able to confirm the ad’s market-shaping potential among viewers who were still far from an active decision to purchase an EV.

The report stated: “By combining EAI with computer vision to analyze the composition of each frame in a video, ML can discover which features (music, dialogue, celebrity appearances, logos, etc.) elicit the strongest responses or lead to tune-out among select contextual audience segments.”

In other words, AI can be trained to pinpoint which elements of an ad draw the strongest reactions from viewers and use those findings to shape future ads.
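
To make that concrete, here is a minimal, hypothetical sketch of the kind of analysis the report describes: given per-frame engagement scores from a viewer panel and labels for which creative features appear in each frame, it ranks features by the average response measured while they are on screen. The frame data, feature labels, and scoring scale are invented for illustration; this is not Affectiva’s API or Gartner’s methodology.

```python
from collections import defaultdict

# Hypothetical per-frame data for a six-frame ad: each frame lists the
# creative features present and the mean emotional-engagement score
# (0-1) measured across a viewer panel. All values are invented.
frames = [
    {"features": ["music", "logo"],         "engagement": 0.42},
    {"features": ["music", "dialogue"],     "engagement": 0.55},
    {"features": ["celebrity", "dialogue"], "engagement": 0.81},
    {"features": ["celebrity", "music"],    "engagement": 0.78},
    {"features": ["logo"],                  "engagement": 0.30},
    {"features": ["dialogue"],              "engagement": 0.50},
]

def rank_features(frames):
    """Average engagement for the frames in which each feature appears."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for frame in frames:
        for feature in frame["features"]:
            totals[feature] += frame["engagement"]
            counts[feature] += 1
    averages = {f: totals[f] / counts[f] for f in totals}
    # Highest-scoring features first; low scorers may signal tune-out risk.
    return sorted(averages.items(), key=lambda kv: kv[1], reverse=True)

for feature, score in rank_features(frames):
    print(f"{feature:10s} {score:.2f}")
```

In practice the engagement numbers would come from a facial-coding or voice-analysis model rather than being entered by hand, but the ranking step captures the core idea: surface the features that coincide with the strongest responses, and flag the ones that coincide with tune-out.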

Of course, Gartner advocates full disclosure for such marketing, citing research in which most consumers say they want full visibility into how AI is being used to recognize them, act as an automated assistant, or suggest purchases based on the profile it has built of them.

Above all, Gartner cautions against freaking consumers out with AI: “Avoid trickery,” the report warns, “and beware the uncanny valley, a term that describes the disquieting reaction people have to simulated realistic entities that fall short of appearing completely natural.”
