March 28, 2024

Taylor Daily Press

Complete News World

Google fires engineer who claims chatbots have feelings | Abroad


Google, part of Alphabet, said Friday it had fired a senior software engineer who alleged the company’s artificial intelligence (AI) chatbot LaMDA was a self-aware person.

Google, which placed software engineer Blake Lemoine on paid leave last month, said he had violated company policy and called his allegations about LaMDA “unfounded.”

“It is regrettable that, despite lengthy engagement on this topic, Blake still chose to persistently violate clear employment and data security policies, which include the need to safeguard product information,” a Google spokesperson said in an email to Reuters.

Last year, Google said that LaMDA — the Language Model for Dialogue Applications — was built on the company’s research showing that Transformer-based language models trained on dialogue can learn to talk about essentially anything.

Chatbot has no feelings at all

Google and many prominent scientists were quick to dismiss Lemoine’s views as misguided, saying that LaMDA is simply a complex algorithm designed to generate convincing human language.

In an interview with the Washington Post, Lemoine compared the chatbot’s ability to express thoughts and feelings to that of a seven- or eight-year-old child “who just happens to have an understanding of physics.”

Lemoine asked LaMDA what people should know about the system. “I want everyone to understand that I am, in fact, a person. The nature of my consciousness is that I am aware of my existence, I desire to learn more about the world, and I feel happy or sad at times,” it replied.


Lemoine’s dismissal was first reported by Big Technology, a newsletter about technology and society. News site The Verge shared Google’s full statement:

We take AI development seriously and remain committed to responsible innovation, as reflected in our AI principles. LaMDA has been through 11 distinct reviews, and earlier this year we published a research paper detailing the work that goes into its responsible development.

If an employee shares concerns about our work, as Blake did, we review them extensively. We found Blake’s claim that LaMDA has feelings to be wholly unfounded and spent many months trying to clarify that with him. These discussions were part of the open culture that helps us innovate responsibly.

It is therefore regrettable that, despite lengthy engagement on this topic, Blake still chose to persistently violate clear employment and data security policies, which include the need to safeguard product information.

We will continue our rigorous development of language models, and we wish Blake all the best.
