On Sunday, during the 94th Academy Awards telecast, Google will debut a cinematic TV spot highlighting Look to Speak, a free Android app released in December 2020 that lets users select from a menu of pre-written phrases using only their eyes, which the phone then speaks aloud.
The spot features Antoinette Fernandes, who has ALS, a motor impairment that has left her unable to speak since June 2020.
It opens with a montage of all the ways that a look can speak. “A look can be honest, and true,” the narrator says. “A look can speak for you. Say what’s on your mind or in your heart.”
Then it shows Antoinette’s first interaction with the app, and how with a mobile phone and a few glances, she can better communicate with her friends and family. “The Wisp Sings” by Winter Aid plays in the background.
The spot demonstrates how the Look to Speak app, which offers customizable phrases, can help the millions of people with speech and motor impairments communicate more easily in their daily lives.
Somesuch produced the TV spot. Kim Gehrig, an Australian director whose work spans commercials, music videos, documentaries, short films, and branded entertainment, directed it.
“A Look Can Say A Lot” builds on Google’s efforts during last year’s Oscars telecast to showcase the technology it is building to help people with disabilities communicate.
KR Liu, head of brand accessibility at Google, says products like Look to Speak embody Google's mission of helping people access the world around them.
Built on TensorFlow, an open-source platform for machine learning, and the Android SDK, Look to Speak joins a suite of other tools Google has developed to help people with disabilities communicate, including Live Transcribe, Project Activate, Sound Amplifier, Live Caption, and Project Relate.