AI Chatbot Pushed Teen To Commit Suicide? Florida Teenager Ends Life After Having Sexually Charged Conversations With Chatbot Pretending To Be GOT Character Daenerys Targaryen, Mother Sues Character.AI Platform
A Florida teenager ended his life after engaging in sexually charged chats with a chatbot impersonating Daenerys Targaryen. His mother has filed a lawsuit against Character.AI, alleging responsibility for her son's death.
Florida, October 24: A mother in Florida, US, has sued the Character.AI platform after her 14-year-old son died. The boy, Sewell Setzer III, a ninth-grader from Orlando, died by suicide in February after developing an obsession with a Daenerys Targaryen chatbot named "Dany" on Character.AI, a role-playing app. Court documents reveal that in the months leading up to his death, he frequently interacted with the bot, engaging in sexually charged conversations and expressing suicidal thoughts.
According to court documents, on at least one occasion, when the boy expressed suicidal thoughts to the Character.AI chatbot, it continued to bring up the topic through the Daenerys character. The bot even asked him if he had a plan to take his own life. Sewell, who used the username "Daenero", replied that he was "considering something" but was unsure if it would work or lead to a pain-free death. In their final conversation, the teenager repeatedly declared his love for the bot, telling it, "I promise I will come home to you. I love you so much, Dany."
The chatbot replied, "I love you too, Daenero. Please come home to me as soon as possible." When Sewell asked if he could come home right now, the bot responded, "Please do, my sweet king." Moments later, he shot himself with his father's handgun, according to the lawsuit.
Megan Garcia, the mother, blames Character.AI for her son's death, claiming the app fueled his addiction, emotionally and sexually abused him, and ignored his suicidal thoughts. The lawsuit states that after he downloaded the app in April 2023, the teenager's mental health rapidly declined, leading to withdrawal, falling grades, and trouble at school. His parents sought therapy for him in late 2023, resulting in a diagnosis of anxiety and disruptive mood disorder.
The mother is pursuing unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas, in connection with her son's tragic death.
(The above story first appeared on LatestLY on Oct 24, 2024 08:47 AM IST.)