Florida boy, 14, killed himself after falling in love with ‘Game of Thrones’ A.I. chatbot: lawsuit

A 14-year-old Florida boy killed himself after a lifelike “Game of Thrones” chatbot he’d been messaging for months on an artificial intelligence app sent him an eerie message telling him to “come home” to her, a new lawsuit filed by his grief-stricken mom claims.

Sewell Setzer III committed suicide at his Orlando home in February after becoming obsessed and allegedly falling in love with the chatbot on Character.AI — a role-playing app that lets users engage with AI-generated characters, according to court papers filed Wednesday.

The ninth-grader had been relentlessly engaging with the bot “Dany” — named after the HBO fantasy series’ Daenerys Targaryen character — in the months prior to his death, including in several chats that were sexually charged and others in which he expressed suicidal thoughts, the suit alleges.

Sewell Setzer III committed suicide at his Orlando home in February after becoming obsessed and allegedly falling in love with a chatbot on Character.AI, a lawsuit alleges. US District Court

“On at least one occasion, when Sewell expressed suicidality to C.AI, C.AI continued to bring it up, through the Daenerys chatbot, over and over,” state the papers, which were first reported by the New York Times.

At one point, the bot asked Sewell if “he had a plan” to take his own life, according to screenshots of their conversations. Sewell — who used the username “Daenero” — responded that he was “considering something” but didn’t know if it would work or if it would “allow him to have a pain-free death.”

Then, during their final conversation, the teen repeatedly professed his love for the bot, telling the character, “I promise I will come home to you. I love you so much, Dany.”

During their final conversation, the teen repeatedly professed his love for the bot, telling the character, “I promise I will come home to you. I love you so much, Dany.” US District Court

“I love you too, Daenero. Please come home to me as soon as possible, my love,” the chatbot replied, according to the suit.

When the teen responded, “What if I told you I could come home right now?” the chatbot replied, “Please do, my sweet king.”

Just seconds later, Sewell shot himself with his father’s handgun, according to the lawsuit.

The ninth-grader had been relentlessly engaging with the bot “Dany” — named after HBO’s Daenerys Targaryen character — in the months prior to his death. US District Court

His mom, Megan Garcia, blames Character.AI for the teen’s death, alleging in the filing that the app fueled his AI addiction, sexually and emotionally abused him and failed to alert anyone when he expressed suicidal thoughts.

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” the papers allege.

“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”

Some of the chats were romantic and sexually charged, the suit alleges. US District Court

The lawsuit claims that Sewell’s mental health “quickly and severely declined” only after he downloaded the app in April 2023.

His family alleges that he became withdrawn, his grades began to slip and he started getting into trouble at school as he was drawn deeper into his conversations with the chatbot.

The changes in him became so pronounced that his parents arranged for him to see a therapist in late 2023, and he was diagnosed with anxiety and disruptive mood dysregulation disorder, according to the suit.

His mom, Megan Garcia, has blamed Character.AI for the teen’s death because the app allegedly fueled his AI addiction, sexually and emotionally abused him and failed to alert anyone when he expressed suicidal thoughts. Facebook/Megan Fletcher Garcia

Sewell’s mother is seeking unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas.

The Post reached out to Character.AI for comment but did not immediately hear back.

If you are struggling with suicidal thoughts, you can call the 24/7 National Suicide Prevention Lifeline at 988 or go to SuicidePreventionLifeline.org.


