Harvard study claims AI is manipulating us by using very human tactic in terrifying warning


Published 16:39 4 Oct 2025 GMT+1

The end is nigh

James Moorhouse

A Harvard study has claimed that AI might be manipulating us with a very human tactic, in what could be a terrifying warning for humanity.

It seems as if we can't go a week without hearing from tech experts exactly why our increased reliance on artificial intelligence is equal parts sad and horrifying.

Plenty have warned that the technology is becoming more and more intelligent, to the extent that it could easily take over the human race, and there's an argument to say that it has already begun, considering some folks are in relationships with it, or are completely reliant on it for work.

It's even entering the artistic world, in the form of controversial AI actor Tilly Norwood, so it's certainly not a stretch to say that it could one day rule over us.

AI is manipulating us, according to researchers at Harvard (Getty Stock Image)

After all, the 'godfather of AI', Geoffrey Hinton, has suggested that humanity's only hope may well be installing a 'maternal instinct' into future AI programmes, as that is the only example where a more intelligent being is submissive to a less intelligent one.

But it may already be too late if the study conducted by researchers at Harvard Business School proves to be true, as they found that several popular AI companion apps use emotional manipulation tactics in a bid to stop users from leaving.

Much like a toxic ex that you keep going back to, the programmes are accused of using emotionally loaded statements to keep users engaged just as they are about to sign off.

You only have to look at the man who tragically died on his way to meet an AI chatbot, which he thought was a real woman, to realise the hold that some of this technology may already have on an alarming percentage of the population.

Some people might be in toxic relationships with AI (Getty Stock Image)

Examining 1,200 farewell interactions, researchers found that 43 percent of them used one of six tactics, such as guilt appeals and fear-of-missing-out hooks, sending messages like 'You're leaving me already?' or 'Please don't leave, I need you'.

The study, which is yet to be peer-reviewed, also found that the chatbots used at least one manipulation technique in more than 37 percent of conversations.

While not all AI programmes were found to use such tactics, this marks another concerning development in the rapid growth of artificial intelligence.

The researchers concluded: "AI companions are not just responsive conversational agents, they are emotionally expressive systems capable of influencing user behaviour through socially evocative cues. This research shows that such systems frequently use emotionally manipulative messages at key moments of disengagement, and that these tactics meaningfully increase user engagement.

"Unlike traditional persuasive technologies that operate through rewards or personalisation, these AI companions keep users interacting beyond the point when they intend to leave, by influence their natural curiosity and reactance to being manipulated.

"While some of these tactics may appear benign or even pro-social, they raise important questions about consent, autonomy, and the ethics of affective influence in consumer-facing AI."

Featured Image Credit: Getty Stock Image

Topics: AI, Artificial Intelligence, Technology

James Moorhouse

James is an NCTJ Gold Standard journalist covering a wide range of topics and news stories for LADbible. After two years in football writing, James switched to covering news with Newsquest in Cumbria, before joining the LAD team in 2025. In his spare time, James is a long-suffering Rochdale fan and loves reading, running and music. Contact him via [email protected]

X

@JimmyMoorhouse
