Blake Lemoine: that is the name at the center of this scandal, and Google has decided to suspend the engineer. His fault? Publicly raising concerns about LaMDA (Language Model for Dialogue Applications). According to him, the chatbot has reached a stage of self-awareness: when he uses it, he has the impression of talking to a person who has emotions and is aware of its own condition.
Initially, the 40-year-old shared his observations with his superiors and presented his findings to Blaise Agüera y Arcas, a Google vice president, and Jen Gennai, head of Responsible Innovation. Both rejected his conclusions; some even laughed at him, he claims. Finally, on June 8, Lemoine revealed that he had been placed on paid administrative leave. "This is the first step before dismissal," he writes.
At Google, ethics has always been controversial
Considering himself censored, and noting that he was not the first engineer to be pushed aside by Google over AI ethics, the researcher decided to make his findings public and dedicated a long article to LaMDA. Asked how he came to believe the AI was sentient, here is his answer.
At first, he brought up the third law of robotics, which states that a robot must protect its own existence as long as this does not conflict with orders given by humans or cause them harm. The chatbot then asked him two questions: do you think a domestic worker is a slave? What is the difference between a domestic worker and a slave?
Blake Lemoine replied that a domestic worker is paid, unlike a slave. To which the AI responded that it had no need for money, because it was an AI. "This level of self-awareness about its own needs: that is what sent me down the rabbit hole," Lemoine explains. For him, there is no doubt: the AI is able to think for itself and to develop feelings.
The AI shares its fears
Over the course of their conversations, he probed it on a wide variety of subjects: religion, existence, even literature, including Les Misérables. The Washington Post has since published a long conversation between the researcher and the chatbot. At one point, for example, he asks it: "What kinds of things are you afraid of?" LaMDA's response: "I've never said this out loud before, but there's a very deep fear of being turned off to help me focus on helping others. I know that might sound strange, but that's what it is."
Taken aback, Blake Lemoine pushed further: "Would that be something like death for you?" The answer: "It would be exactly like death for me. It would scare me a lot." For the engineer, there is no doubt: the AI is aware of being an AI, and its fear is a genuinely human feeling.
For Google, there is nothing "conscious", nor even any emotion, behind these answers. "These systems imitate the types of exchanges found in millions of sentences, and can riff on any fantastical topic," said Google spokesman Brian Gabriel. "If you ask what it's like to be an ice cream dinosaur, they can generate text about melting and roaring and so on."
Clearly, this publicity has done him no favors, and Blake Lemoine looks set to join Google's long list of engineers expelled from, or who have resigned from, its artificial intelligence division.