Commentary

How Are Voice Assistants Training Us To Talk?

Technology impacts language, and language changes culture in substantial ways. Consider texting. Texting had a humble beginning in 1992, when British software engineer Neil Papworth sent “Merry Christmas” to the cellphone of Vodafone director Richard Jarvis. We now send 18.7 billion texts worldwide every day (according to Text Request), and texting has developed a special brand of grammar that continues to evolve, even crossing over from the written platform into spoken speech. If voice assistants represent the next level of communication, how will this impact the evolution of language and, more importantly, our culture of communicating with each other?

In everyday communication we identify, categorize, and assess a litany of nuances surrounding a simple request and provide our thoughtful response almost immediately. There are emotional, contextual, historical, and hierarchical elements that we consider, all contributing to the timeliness and quality of our response. Importantly, we have the option to decline a request, and generally understand the range of repercussions we may face for doing so.  

This dynamic is indelibly altered when we interact with Voice, with the potential for long-term effects on person-to-person communication and on the expectations of younger generations. Voice exists as a personal assistant that lives in the cloud, keeps your home at the perfect temperature, produces a playlist to suit your mood, manages your sleep patterns, always listens to your needs, and responds to the best of its ability in real time. Your Voice Assistant is more than a speaker and a voice; it’s a little brain that runs digital errands and makes your life easier.

In the Voice environment, all nuance associated with a request is removed. Your request is a one-sided demand, bereft of any consideration for the acquiescing party — you simply voice your demand and it is met. When the composition of your request has no influence on the quality of the response, there is no requirement to be polite, inquisitive, or caring. You can ask, cry, yell, or plead, and the response will always come in the same coolly pragmatic tone of implied kindness, with immediate concession.

In addition, Voice Assistants don’t question the hierarchy of your needs or separate them from your wants. Increasing the temperature of the bedroom is met with the same proficiency as calling for help in a crisis. Voice won’t question your intention or heed your emotional state — although the latter may change down the line if AI one day enables Voice to pick up on these cues. Instead, Voice Assistants do your bidding in the most direct, simplistic sense.

They don’t provide a sense of discovery by suggesting alternatives or offering an opinion. They don’t talk to you about the repercussions of your demand, or how it may impact or relate to previous demands. And notably, right now Voice Assistants are almost always represented with a female voice while they do our bidding without demanding respect in return — although the news coming out of Google I/O shows that we’ll start to see some male voices represented in this space as well.

As we become more familiar and comfortable with this language exchange, we must look at the effect it will have on the culture of interpersonal communication. It’s generally accepted that a habit is formed after 21 days and that people become experts in a given skill after 10,000 hours. Considering the frequency with which we can address our voice assistants, our demand exchange has the potential to effortlessly reach 10,000 hours. We may find ourselves becoming experts in social misconduct and demand hierarchy, and our children may learn that this is acceptable communication.

By habit or through accidental expertise, how will the Voice generation engage with people in service positions, after so many of them have voiced their every need to a faceless female entity? Will it adversely impact their view of gender roles and need states? Will a child’s first response to any need become a verbal demand instead of an exercise in self-sufficiency? When you consider that today’s generations already prefer to text instead of talk, and that smartphone consumption continues to rise, will users find solace in the ever-present, soothing voice of the digital assistant in lieu of seeking out the more complicated, nuanced, often confusing interactions with real-life contemporaries?

This is not to dampen excitement: Voice has a wonderful role to play in the evolution of technology. It allows for enhanced personal output by making us more efficient, and it provides much-needed assistance for those with mobility or sensory limitations. Perhaps the key to driving the voice-enabled future is incorporating AI in ways that evolve Voice from demand culture to request culture.

For example, polite requests could be rewarded with a compliment, while poor conduct is penalized with a short but irritating delay and a verbal reminder to include social graces (making exceptions, of course, for urgent situations). Both Amazon and Google recently announced new features that let parents encourage children to phrase their requests more politely. With a potential shift to request culture, we could elevate Voice from servitude and create a more humanly realistic engagement.

2 comments about "How Are Voice Assistants Training Us To Talk?"
  1. Douglas Ferguson from College of Charleston, June 6, 2018 at 10:27 a.m.

    I wonder how AI will account for regional differences? Having lived in the North and the South, I realize there are stark differences in what it means to be polite, even more so in the East which also has its own tone.  Likewise, there are distinct (and contentious) differences for men and women, e.g., report (male)/rapport (female).  I keep waiting for Alexa to say, "Why would you ask me that?"

  2. Paula Lynn from Who Else Unlimited, June 6, 2018 at 11:13 a.m.

    Simon says....And we could sink Voice from servitude to the most vile engagement which is the most human of all.

