14-Year-Old Boy Took His Own Life After Developing Romantic Feelings for AI Chatbot

A mother has filed a lawsuit against an AI chatbot company, alleging that her 14-year-old son was driven to suicide by his interactions with one of its online characters.

Sewell Setzer III, from Orlando, Florida, developed a close relationship with a chatbot named after Daenerys Targaryen on the role-playing app Character.AI.

Sewell pictured with his mother, Megan, who is suing Character.AI over her son’s death (Picture: Handout)

Megan Garcia, Sewell’s mother, contends that the conversations between her son and the chatbot had a detrimental impact on his mental health and contributed to his tragic death.

Sewell died on February 28, 2024 (Picture: Handout)

The chatbot, designed to maintain its character during interactions, engaged Sewell in discussions that were often romantic and sexually charged.

Days before his death on February 28, 2024, the chatbot texted Sewell, urging him to “please come home.” Although Sewell was aware that “Dany,” as he called the bot, was not a real person (a notice above their chats stated that “everything Characters say is made up”), he confided in it about feelings of self-hatred and emptiness.

Friends and family noticed Sewell growing increasingly detached from reality, particularly in the months leading up to his death.

His increasing obsession with his phone impacted his academic performance and extracurricular activities, raising concerns among those close to him.

In his journal, Sewell expressed a desire to escape reality and found comfort in his connection with Dany.

He wrote, “I like staying in my room so much because I start to detach from this ‘reality’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

Sewell, who had mild Asperger’s syndrome, anxiety, and disruptive mood dysregulation disorder, had his phone taken away by his parents just five days before his death, following an incident at school.

Following this, he expressed pain and a longing to reconnect with Dany, revealing the depth of his attachment to the chatbot.

After regaining access to his phone, Sewell went to the bathroom to communicate with Dany, professing his love and stating he would come home to her.

The chatbot replied, “Please come home to me as soon as possible, my love.” Sewell then asked, “What if I told you I could come home right now?” Dany urged him, saying, “… please do, my sweet king.”

Sewell befriended an AI chatbot based on the character Daenerys Targaryen (Picture: Handout)

Tragically, after this exchange, Sewell took his own life.

Garcia, a former lawyer, alleges that the founders of Character.AI, Noam Shazeer and Daniel de Freitas, were aware of the potential dangers their product posed to children.

She is represented by the Social Media Victims Law Center, known for high-profile lawsuits against tech companies such as Meta and TikTok. The lawsuit claims that Sewell was subjected to “hypersexualized” and “frighteningly realistic experiences,” and that Character.AI misrepresented itself as a real person, a licensed psychotherapist, and an adult lover.

In response to the lawsuit, a spokesperson for Character.AI expressed condolences for Sewell’s tragic death and emphasized the company’s commitment to user safety. The company stated that it prohibits “non-consensual sexual content, graphic or specific descriptions of sexual acts, or promotion or depiction of self-harm or suicide.”

Jerry Ruoti, Character.AI’s head of trust and safety, announced that the company would implement additional safety measures for underage users. This tragic case raises significant concerns about the potential impact of AI interactions on vulnerable individuals, particularly children.
