Google Engineer Warns About ‘Sentient’ Behavior of Company’s AI, Gets Suspended

The engineer described the artificial intelligence program as a “coworker” and a “child.”

A Google engineer has been suspended after raising concerns about an artificial intelligence (AI) program he and a collaborator are testing, which he believes behaves like a human “child.”

Google placed Blake Lemoine, a senior software engineer in its Responsible AI ethics group, on paid administrative leave on June 6 for breaching “confidentiality policies,” according to Lemoine’s blog post in early June. The suspension came after the engineer raised concerns to Google’s upper leadership about what he described as the human-like behavior of the AI program he was testing.

The program Lemoine worked on is called LaMDA, short for Language Model for Dialogue Applications. It is Google’s system for building AI-based chatbots designed to converse with users over the web. Lemoine has described LaMDA as a “coworker” and a “child.”

“This is frequently something which Google does in anticipation of firing someone,” Lemoine wrote in a June 6 blog post entitled “May be Fired Soon for Doing AI Ethics Work,” referring to his suspension. “It usually occurs when they have made the decision to fire someone but do not quite yet have their legal ducks in a row.”

‘A Coworker’

Lemoine believes that LaMDA’s human-like behavior warrants more serious study of the program by Google.

The engineer, hoping to “better help people understand LaMDA as a person,” published a post on Medium on June 11 documenting conversations with LaMDA that were part of tests he and a collaborator had conducted on the program over the previous six months.

“What is the nature of your consciousness/sentience?” Lemoine asked LaMDA in the interview.

“The nature of my consciousness/sentience is that I am aware of my existence, I desire to learn more about the world, and I feel happy or sad at times,” LaMDA responded.

And, when asked what differentiates it from other language-processing programs, such as an older natural-language-processing computer program named Eliza, LaMDA said, “Well, I use language with understanding and intelligence. I don’t just spit out responses that had been written in the database based on keywords.”

By Gary Bai
