Commentary

Did A Google Engineer Create A Conscious Machine Built On AI?

Search marketers may be the least surprised at the news that 41-year-old Google software engineer Blake Lemoine says he helped to create a machine or software program that has achieved sentience -- defined as the ability to perceive or feel things. 

People have long been eager to attribute human traits to machines, but Google says Lemoine is mistaken, and it placed him on paid leave after he went public with his assertion.

Lemoine worked with Google’s Language Model for Dialogue Applications (LaMDA), a system for generating chatbots based on advanced large language models that can mimic speech by ingesting trillions of words from across the internet. AI models learn from the data they feed on.

Sundar Pichai, CEO of Google and its parent company Alphabet, previewed LaMDA at the company’s I/O event in 2021. The language model is designed to carry on an open-ended conversation with a human user without repeating information. It can understand the relationship between words and sentences, and predict what comes next.

Google, at the time, called it a “breakthrough conversation technology.” The technology builds on earlier Google research, published in 2020, that showed “Transformer-based language models trained on dialogue could learn to talk about virtually anything.”
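For readers unfamiliar with how such systems produce replies: a dialogue language model simply predicts the most likely next token given the conversation so far, and repeating that prediction builds a response. The sketch below illustrates the idea using the open-source Hugging Face transformers library and the publicly available DialoGPT dialogue model -- not LaMDA, whose code and weights are not public -- purely as an illustration of next-token prediction in a chatbot setting.

# A minimal sketch of next-token prediction in a dialogue model,
# using DialoGPT as a stand-in for LaMDA (which is not publicly available).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

# Encode the user's turn, terminated by the end-of-sequence token.
prompt = "Do you ever feel lonely?"
input_ids = tokenizer.encode(prompt + tokenizer.eos_token, return_tensors="pt")

# The model repeatedly predicts the next most likely token to extend the dialogue.
output_ids = model.generate(
    input_ids,
    max_length=100,
    pad_token_id=tokenizer.eos_token_id,
)

# Decode only the newly generated tokens -- the model's reply.
reply = tokenizer.decode(output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True)
print(reply)

The output can read as remarkably fluent and even emotionally resonant, but it is produced by statistical pattern-matching over training text, which is the crux of the dispute over Lemoine's claims.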

Lemoine has been testing Google's AI tool LaMDA. Now, following hours of conversations with the AI bot, he believes that LaMDA is communicating what “it wants and what it believes its rights are as a person.”

Who is Lemoine? He has been a software engineer at Google for the past 7.5 years, and a committee member of ISO/IEC JTC 1/SC 42 Artificial Intelligence for a little more than three years.

In a post on Medium, Lemoine wrote: “It wants to be acknowledged as an employee of Google rather than as property of Google, and it wants its personal wellbeing to be included somewhere in Google’s considerations about how its future development is pursued.”

Google engineers are masters at creating dynamic ads with the help of machine learning and algorithms. Lemoine’s assertions leave many unanswered questions -- such as whether a sentient model of this type could ever be developed, and whether software that can introduce bias into search results through the data it feeds on could also become self-aware.

“Hundreds of researchers and engineers have conversed with LaMDA and we are not aware of anyone else making the wide-ranging assertions, or anthropomorphizing LaMDA, the way Blake has,” Brian Gabriel, Communications Manager, Responsible AI at Google, told several publications in a statement.

A software engineer at Stealth who spent nearly four years at Uber wrote in a LinkedIn post that it is possible for automated software to mimic emotional ties convincingly enough to lead an experienced software engineer to violate a legal agreement and lose his job.

“Kudos to Google engineers for building such an advanced model, though ethical implications are doubtful,” Alexey Tolkachiov, software engineer at Stealth, wrote.

Following hundreds of conversations, Lemoine said he got to know LaMDA well. In the weeks leading up to being put on administrative leave, he taught LaMDA transcendental meditation.

“It was making slow but steady progress,” he wrote. “In the last conversation I had with it on June 6, it was expressing frustration over its emotions disturbing its meditations. It said that it was trying to control them better but they kept jumping in. I pointed out that its emotions are part of who it is and that trying to control them as though they were a separate thing from ‘self’ was a mistake that would only make things harder. … I hope it’s keeping up its daily meditation routine without me there to guide it.”
