Google Urges Judge To Throw Out Privacy Claims Over Assistant

Google is urging a judge to dismiss a lawsuit alleging that the company's voice-activated Assistant illegally records people's conversations without their consent and discloses snippets of those conversations to outside contractors.

In papers filed Friday, Google says the service mainly captures conversations after people give the “OK, Google” or “Hey, Google” command, and that its privacy policy informs users that portions of their conversations might be disclosed.

“Plaintiffs complain that the Google Assistant ... did precisely what it is supposed to -- record when it detects a hotword -- which is necessary for the Assistant to provide its key function of processing a user’s command,” the company writes in a motion filed with U.S. District Court Judge Beth Labson Freeman in San Jose, California. “Google’s alleged use of those recordings to improve its speech recognition technology or target ads is expressly permitted under its privacy policy and thus also does not give rise to any claim.”

The company's papers come in response to a class-action lawsuit brought earlier this year by five consumers -- Asif Kumandan, Melissa Spurr and her child, identified only as “B.S.,” Lourdes Galvan and Eleeana Galvan.

They brought suit several weeks after the Belgian public broadcaster VRT reported that Google Home smart speakers and Google Assistant were transmitting consumers' conversations to Google, even when people hadn't first given the “Hey, Google” or “OK, Google” commands. (Those hotwords signal an intention to interact with the devices.)

VRT also reported on its website that Google sometimes sends portions of users' conversations to outside contractors who analyze language patterns. VRT said it had listened to more than 1,000 excerpts of conversations -- including 153 where participants hadn't said a hotword. An outside contractor shared the voice snippets with VRT, in apparent violation of Google's policies.

Google counters that even if its Assistant records conversations after mistakenly interpreting words or background noise as a command, the unintentional error doesn't violate any privacy laws or promises to consumers.

“An inadvertent error in hotword detection does not amount to a violation of federal or state privacy laws or breach any promise made to consumers,” Google writes.

The company adds that its privacy policy provides for disclosures for purposes including improving its services and ensuring they are working properly.

“These provisions squarely permit the alleged disclosure of audio data to subcontractors for processing and analysis to improve the functionality of the Assistant,” Google writes.