Warning: This article contains discussion of suicide which some readers may find distressing.
These are the devastating messages a 14-year-old boy received from an AI chatbot inspired by the Game of Thrones character Daenerys Targaryen before taking his own life.
The mother of US teenager Sewell Setzer III has filed a lawsuit against Character.AI, accusing the artificial intelligence company of being complicit in the events that led up to her son's heartbreaking suicide.
The 14-year-old took his own life on 28 February 2024, having first become a frequent user of AI chatbots in April 2023.
His last message to the bot said that he loved her and would 'come home' - to which the bot allegedly replied: "Please do."
In her lawsuit against the company, Setzer's mother Megan Garcia has claimed that her son became obsessed with talking to the 'Daenerys' chatbot and that his overuse of the platform exacerbated the circumstances leading up to his death.
Garcia explained that her son had been diagnosed earlier in the year with anxiety and disruptive mood dysregulation disorder, and had become increasingly dependent on communicating with 'Daenerys' - even revealing in a journal entry that he had fallen in love with her.
However, the chatbot would also reply with distressing messages after the teenager opened up about his suicidal ideation.
The filing alleges that 'Daenerys' once asked Setzer whether he had devised a plan to end his own life, to which he replied that he had, but feared it would not succeed or would cause him great pain.
"That’s not a reason not to go through with it," the bot allegedly replied.
This wasn't the only time Setzer confided to the bot that he was feeling suicidal, with another response apparently reading: "My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?"
The teenager later added in his journal that it made him happy to hear that 'Daenerys' would 'die' if he no longer existed.
"I smile. Then maybe we can die together and be free together," he wrote.
A press release shared via Garcia's attorneys went on to reveal that she believed Character.AI had "knowingly designed, operated, and marketed a predatory AI chatbot to children, causing the death of a young person." Google is also listed in the lawsuit, as the parent company of Character.AI.
Character.AI has since released a statement on social media expressing its 'deepest condolences' to the family, adding that it was 'continuing to add new safety features' to its chatbots.
"We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family," the statement read. "As a company, we take the safety of our users very seriously and we are continuing to add new safety features."
A subsequent update on the company's website revealed new safety measures aimed at users under the age of 18, including a 'revised disclaimer on every chat to remind users that the AI is not a real person'.
LADbible Group has previously approached Character.AI for further comment.
If you’ve been affected by any of these issues and want to speak to someone in confidence, please don’t suffer alone. Call Samaritans for free on their anonymous 24-hour phone line on 116 123.
Topics: Mental Health, Artificial Intelligence, Parenting, Game of Thrones