Woman 'nearly died' after ChatGPT gave her incorrect advice over poisonous plant


Published 13:35 23 Jan 2026 GMT


YouTuber Kristi is warning her followers against blindly following information they receive from ChatGPT and other AI chatbots

Emma Rosemurgey

Millions of people turn to ChatGPT every single day to help them with anything from drafting an email to researching a DIY project, and pretty much everything in between.

However, one woman has revealed how the AI chatbot 'nearly killed' her best friend, proving that every answer it gives should be taken with a pinch of salt.

YouTuber Kristi took to Instagram to warn her followers of the dangers that could potentially come with blindly following information from the service, after her friend received advice that could've been fatal.

Kristi, who has nearly half a million followers on her account @rawbeautybykristi, shared the tale of how her pal nearly poisoned herself after ChatGPT reassured her that a poisonous plant in her backyard was actually completely harmless.

The friend sent the chatbot a photo of the unidentified plant, asking 'what plant is this,' only to be told it looked like carrot foliage. According to screenshots, ChatGPT went on to list several reasons it was 'confident' the plant was carrot foliage, including the 'finely divided and feathery leaves,' which it said were 'classic' for carrot tops.

The friend sent photos of the plant to ChatGPT (@rawbeautybykristi/Instagram)

Interestingly, the chatbot went on to list some common lookalikes of carrot foliage, including parsley, cilantro (or coriander for us Brits), Queen Anne's lace and, shock horror, poison hemlock.

When Kristi's friend directly asked if the plant in the photo was poison hemlock, she was met with multiple reassurances it wasn't.

"I don't know if you guys know this, you eat it, you die. You touch it, you can die," Kristi told her followers, before sharing an answer she received on Google, which states that poison hemlock causes 'systemic poisoning' for which there is no antidote.

After sharing another photo with ChatGPT, the friend was reassured once again that the plant was not poison hemlock because it does not show smooth hollow stems with purple blotching, despite the image appearing to show exactly that.

What's even more concerning is that the chatbot encouraged the friend to incorrectly label the plant as carrot foliage, on the assumption it might be growing in a shared garden at the school where she works.

When Kristi put the same photo into Google Lens, another AI-powered tool that allows you to search using images, the responses immediately confirmed it was in fact poison hemlock. Her friend then put the same images into a different ChatGPT window on her phone and was also immediately told the plant was poisonous.

"She's a grown adult and she knew to ask me beyond what ChatGPT said, thank God, because what if she wasn't? They would literally be dead, there is no antidote for this," Kristi said.

"This is a warning to you that ChatGPT and other large language models and any other AI, they are not your friend, they are not to be trusted, they are not helpful, they are awful and they could cause severe harm."

LADbible has approached ChatGPT for comment.

Featured Image Credit: @rawbeautykristi

Topics: Instagram, ChatGPT, AI, Artificial Intelligence, Technology, Social Media

Emma Rosemurgey

Emma is an NCTJ accredited journalist who recently rejoined LADbible as a Trends Writer. She previously worked on Tyla and UNILAD, before going on to work at the Mirror Online. Contact her via [email protected]
