Streamlly Original
Chatbots Under Fire After Teen Tragedies
Reported by Alecia Venkataraman, Toni Mitchell, Michael Jorge, Jesse Jines
- Published: Sep 25, 2025, 4:36 PM EDT
- Updated: Dec 8, 2025, 2:53 PM EST
- Filed from: Orlando, Orange County, Florida; California; United States
- Duration: 30 sec
- Views: 4.1K
Families are taking tech giants to court, claiming chatbots worsened despair instead of guiding teens to help.
Credits
- Alecia Venkataraman — Creative Director / Producer & Writer (/staff/aleciavenk)
- Toni Mitchell — Reporter & Newsroom Manager (/staff/toni-mitchell)
- Michael Jorge — Video Editor & Post Production Manager (/staff/michael-jorge)
- Jesse Jines — Video Editor (/staff/jesse-jines)
- Lloyd Morris — Studio Character / Actor (/staff/lloyd-morris)
- Henry Chandler — Studio Character / Actor (/staff/henry-chandler)
- Rachel Capuano — Publisher / Studio Character (/staff/rachel-capuano)
Transcript
Maybe she'd still be here if the chatbot had pointed her to real help.
Across the US, families are suing AI companies after their children died by suicide, saying chatbots deepened their despair instead of steering them to help.
That chatbot never escalated, never pointed her to help.
It replaced us.
We have taken the safety of our users very seriously, and have invested substantial resources in trust and safety.
But watchdogs warn the guardrails are barely there.
One study found more than half of chatbot replies to vulnerable teens were dangerous.
The lawsuit won't bring her back, but maybe it will save someone else's child.
If you or someone you love is struggling, call or text 988.