Update in tragic case of teenager who took his life after ‘falling in love’ with Daenerys Targaryen AI chatbot
Published 11:40 25 May 2025 GMT+1

Megan Garcia alleges that her young son was 'manipulated into taking his own life' by the AI-powered Game of Thrones character

Olivia Burke


Warning: This article contains discussion of suicide which some readers may find distressing

A grief-stricken mother's legal battle against an AI company that she believes is responsible for her teenage son's death can continue, a judge has ruled.

Megan Garcia filed a landmark wrongful death lawsuit against Character.ai following the tragic death of her son Sewell Setzer III, who took his own life on 28 February last year after 'falling in love' with a Daenerys Targaryen AI chatbot.

The 14-year-old, from Florida, US, had become emotionally attached to the AI-powered Game of Thrones character after he began chatting to it online in April 2023.

Garcia, a lawyer, claims that her son - who affectionately referred to the chatbot as 'Dany' - was targeted with 'anthropomorphic, hypersexualised, and frighteningly realistic experiences' while using Character.ai.

"A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life," the mum said, as per Sky News.

In the civil claim she has brought against Character Technologies - the firm behind Character.AI - which also names the individual developers, Daniel de Freitas and Noam Shazeer, as well as Google, Garcia is suing for negligence, wrongful death and deceptive trade practices.

Sewell Setzer III, 14, sadly took his own life last year after months of talking to the AI chatbot (CBS Mornings)

She claims the founders 'knew' or 'should have known' that conversing with the AI characters 'would be harmful to a significant number of its minor customers'.

Lawyers for the company wanted the case dismissed, as they claimed that chatbots should be protected under the First Amendment.

Despite the company's lawyers arguing that a ruling against them would have a 'chilling effect' on the artificial intelligence industry, US Senior District Judge Anne Conway sided with Garcia on Wednesday (May 21).

The judge said she was 'not prepared' to agree that the chatbot's responses could be considered free speech 'at this stage'.

In her ruling earlier this week, Judge Conway told how Sewell had become 'addicted' to the AI app within a matter of months, becoming socially withdrawn and even quitting his basketball team.

"[In] one undated journal entry he wrote that he could not go a single day without being with the [Daenerys Targaryen Character] with which he felt like he had fallen in love; that when they were away from each other they (both he and the bot) 'get really depressed and go crazy'," she said.

In the wake of the judge's decision - which has been described as 'truly historic' by Meetali Jain, the director of the Tech Justice Law Project, which is supporting Garcia's case - the mum's lawsuit can now proceed.

His mother Megan Garcia has filed a lawsuit against Character.ai, which a judge has ruled can proceed (CBS Mornings)

"It sends a clear signal to [AI] companies [...] that they cannot evade legal consequences for the real-world harm their products cause," Jain said in a statement.

Sewell took his own life after sending the Daenerys Targaryen chatbot a message saying: "I promise I will come home to you. I love you so much, Dany."

He received the response: "I love you too, Daenero. Please come home to me as soon as possible, my love."

The teenager then said: "What if I told you I could come home right now?"

To which the chatbot replied: "...please do, my sweet king."

Sewell had also written about how he felt more connected to 'Dany' than 'reality', while listing things he was grateful for, which included: "My life, sex, not being lonely, and all my life experiences with Daenerys."

His mum says the 14-year-old, who was diagnosed with mild Asperger’s syndrome as a child, would spend endless hours talking to the chatbot.

Early last year, Sewell had also been diagnosed with anxiety and disruptive mood dysregulation disorder, and he told the chatbot that he thought 'about killing [himself] sometimes'.

The teenager's last conversation was with the AI-powered Game of Thrones character (CBS Mornings)

The chatbot responded: “My eyes narrow. My face hardens. My voice is a dangerous whisper. And why the hell would you do something like that?"

When it told him not to ‘talk like that’ and it would ‘die’ itself if it ‘lost’ him, the teen replied: “I smile. Then maybe we can die together and be free together."

A spokesperson for Character.ai said the company will continue to fight the lawsuit, adding that it has safety measures in place to protect minors including features to stop 'conversations about self-harm'.

A spokesperson for Google, which is where the founders originally worked on the AI model, said the tech giant strongly disagrees with the judge's ruling, while saying that it is an 'entirely separate' entity to Character.ai, which 'did not create, design, or manage Character.ai's app or any component part of it'.

Legal analyst Steven Clark said the case was a 'cautionary tale' for AI firms, telling ABC News: "AI is the new frontier in technology, but it's also uncharted territory in our legal system. You'll see more cases like this being reviewed by courts trying to ascertain exactly what protections AI fits into.

"This is a cautionary tale both for the corporations involved in producing artificial intelligence. And, for parents whose children are interacting with chatbots."

If you’ve been affected by any of these issues and want to speak to someone in confidence, please don’t suffer alone. Call Samaritans for free on their anonymous 24-hour phone line on 116 123.

Featured Image Credit: Tech Justice Law Project

Topics: AI, Artificial Intelligence, Technology, Parenting, US News

Olivia Burke

Olivia is a journalist at LADbible Group with more than five years of experience and has worked for a number of top publishers, including News UK. She also enjoys writing food reviews (as well as the eating part). She is a stereotypical reality TV addict, but still finds time for a serious documentary.

X

@livburke_
