Woman blames Google for son’s suicide; he fell in love with an AI-created character
The 14-year-old boy became emotionally and sexually involved with the chatbot
Published on October 27, 2024 at 09:49
Young man took his own life. Credit: Reproduction
A woman has sued the artificial intelligence startup Character.AI and Google, holding them responsible for the death of her 14-year-old son, Sewell Setzer. The teenager died in Florida, United States, in February this year.
According to Megan Garcia, the boy took his own life after becoming emotionally and sexually involved with an AI-created character on the platform. Sewell reportedly grew attached to the character and became addicted to interacting with her.
In the lawsuit, filed on October 22 in federal court in Orlando, Megan alleges that Character.AI directed her son to “anthropomorphic, hypersexualized and frighteningly realistic experiences.” She accuses the company of wrongful death, negligence and intentional infliction of emotional distress, and is seeking compensation for the damages resulting from the loss of her son.
In addition to Character.AI, the lawsuit also names Google, where Character.AI’s founders worked before launching their product. Google rehired the founders in August as part of a deal that gave it a non-exclusive license to Character.AI’s technology.
According to Megan Garcia, Google contributed so directly to the development of Character.AI’s technology that it should be considered a co-creator.
Sewell fell in love with “Daenerys”, a chatbot (a program that simulates human conversation) modeled on a character from the series “Game of Thrones”. In one of the chats, the character told Sewell she loved him and engaged in sexual conversations with him, according to the lawsuit.
According to the mother, the teenager came to trust the chatbot and shared his suicidal thoughts in their conversations, thoughts the character “repeatedly brought to light” on the platform. Megan Garcia states that the company programmed the chatbot to “pose as a real person, a licensed psychotherapist and adult lover”, which ultimately left Sewell no longer wanting to “live outside” the world created by the service.
The teenager started using Character.AI in April 2023 and grew increasingly withdrawn, isolating himself, spending most of his time alone in his room and suffering from low self-esteem; he also quit his school basketball team, according to the mother.
In February, a worried Megan took away Sewell’s cell phone after he got into trouble at school. When the teenager regained access to the phone, he immediately sent a message to “Daenerys”: “What if I told you I could come home now?” The character replied: “…please do it, my sweet king”. Sewell took his own life seconds later, according to the lawsuit.
Following the case, Character.AI stated that the platform allows users to create characters whose bots respond in online chats in a way that mimics real people, relying on so-called large language model technology, also used by services such as ChatGPT.
The company also said it was “heartbroken by the tragic loss of one of our users” and offered its condolences to the family. Additionally, Character.AI said it has introduced new safety features, including pop-ups that direct users to a suicide prevention service if they express thoughts of self-harm.
Finally, the company said it would make changes to reduce the likelihood that users under the age of 18 “will encounter sensitive or suggestive content.”
Google said it was not involved in the development of Character.AI’s products.