An engineer claimed that the algorithm he had worked on was as sentient as his 14-year-old son. After being suspended by Google, he was eventually fired for sharing company-owned research with the public.
Blake Lemoine is no longer part of Google. The engineer has been fired by the multinational for "breach of contractual obligations", the media outlet Big Technology learned on July 23, 2022 from Lemoine himself. The company confirmed the dismissal, adding that it wished its former employee "good luck in the future".
It all started with an algorithm, or rather LaMDA, for "Language Model for Dialogue Applications". Last June, Blake Lemoine publicly shared a blog post in which he claimed that the artificial intelligence he had worked on at Google was "sentient", including excerpts from their "conversations" together. "LaMDA has always shown great compassion and care for humanity in general and for me in particular," he asserted.
In the conversation excerpts, LaMDA also claims to be aware: "I feel pleasure, joy, love, sadness, depression, contentment, anger and many other things," it says at one point.
Google has reviewed Blake Lemoine’s work 11 times
Google didn't see it that way, and suspended the engineer for sharing proprietary information before reviewing Lemoine's work. On July 22, the firm stated that after an "extensive" analysis, it could say that its ex-employee's claims about the AI's consciousness were "completely unfounded".
"We have worked to clarify this with [Blake] for many months. These discussions were part of our open culture, which helps us innovate responsibly. It is therefore unfortunate that, despite a long engagement on this subject, Blake still chose to persistently violate clear employment and data security policies, which include the need to protect product information," the company continued.
This conclusion (reached through 11 separate verification processes) is aligned with the opinion of a large part of the scientific community and of AI experts, who agree that the intelligence people project onto algorithms or robots is (very) far from amounting to a human-like consciousness.
That hadn't stopped Blake Lemoine from hiring a lawyer to represent LaMDA — or rather, from asserting that it was LaMDA who had hired a lawyer, and that he had only been the messenger. As for the algorithmic biases that led LaMDA to reproduce discriminatory stereotypes, Blake Lemoine explained that it was the same as with a "child", and compared the AI to his own 14-year-old son: "At different times in his life, while growing up in Louisiana, he picked up certain racist stereotypes. I corrected them. That's the whole point. People see it as modifying a technical system. I see it as raising a child."
Blake Lemoine speaks at length in an upcoming episode of the Big Technology podcast, expected to be released in the coming days.