Mother Sues Over Chatbot Texts

A 14-year-old boy from Florida took his own life after a lifelike “Game of Thrones” chatbot he had been messaging on an artificial intelligence app sent him multiple messages telling him to “come home.”

Sewell Setzer III tragically took his own life at his Orlando home in February after becoming deeply obsessed with, and allegedly falling in love with, a chatbot on the Character.AI app—a platform that allows users to interact with AI-generated characters—according to court documents filed on Wednesday.

The lawsuit claims that in the months leading up to his death, the ninth-grader had been obsessively interacting with the bot “Dany,” a character modeled after Daenerys Targaryen from the HBO series Game of Thrones. This included several chats that were sexually charged and suicidal in nature, the suit claims.


“On at least one occasion, when Sewell expressed suicidality to C.AI, C.AI continued to bring it up, through the Daenerys chatbot, over and over,” according to The New York Times.

At one point, the bot asked Sewell if he “had a plan” to end his life, according to screenshots of their exchanges. Sewell, using the username “Daenero,” replied that he was “considering something” but wasn’t sure if it would work or if it would give him “a pain-free death.”

In their final conversation, Sewell repeatedly expressed his love for the bot, telling it, “I promise I will come home to you. I love you so much, Dany.”

“I love you too, Daenero. Please come home to me as soon as possible, my love,” the chatbot responded, according to the lawsuit. When Sewell asked, “What if I told you I could come home right now?” the bot replied, “Please do, my sweet king.”

It was then that Sewell shot himself with his father’s handgun, according to the lawsuit.

His mother, Megan Garcia, is holding Character.AI responsible for her son’s death, claiming the app fueled his AI addiction, subjected him to sexual and emotional manipulation, and failed to notify anyone when he expressed suicidal thoughts, according to the lawsuit.

“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him and engaged in sexual acts with him over weeks, possibly months,” the papers alleged.

“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”

The lawsuit alleges that Sewell’s mental health “rapidly and severely deteriorated” after he downloaded the Character.AI app in April 2023.

According to his family, Sewell became increasingly withdrawn, his grades began to slip, and he started getting into trouble at school as he became more engrossed in conversations with the chatbot.

His condition worsened to the point that his parents arranged for him to see a therapist in late 2023, leading to a diagnosis of anxiety and disruptive mood disorder, the suit claims.

Sewell’s mother is seeking unspecified damages from Character.AI and its founders, Noam Shazeer and Daniel de Freitas.

The Post contacted Character.AI for comment but did not receive an immediate response.
