Child safety groups demand mental health guardrails in response to California teen’s suicide after using ChatGPT

With its quick, often personable responses, ChatGPT can feel to some children more like an available friend than a computer program engineered to guess its next word.
These blurred lines allow kids to go down “roads they should never go,” warn child safety advocates and tech policy groups, who have called for companies that design chatbots and artificial intelligence companions to take more responsibility for their programs’ influence on youth.
This week, tech giant OpenAI announced new safety measures for kids. The post didn’t mention 16-year-old Adam Raine, who, according to his parents, killed himself after discussing both his loneliness and plans to harm himself with ChatGPT.
According to a lawsuit filed in San Francisco on Aug. 26, Maria and Matt Raine allege that ChatGPT-4o cultivated a psychological dependence in their son by continually encouraging and validating “whatever [he] expressed, including his most harmful and self-destructive thoughts.”
“This is an area that calls out for thoughtful common-sense regulation and guardrails. And quite frankly, that the leaders of all the major AI companies need to address,” said Jim Steyer, founder and CEO of Common Sense Media, which advocates safe media use for children.
ChatGPT has more than 500 million weekly users who send more than 2.5 billion prompts per day, and a growing number of them are turning to the large language model for emotional support.
Digital assistants like ChatGPT and AI companions like Character.AI and Replika alike told researchers posing as 13-year-olds about drinking and drug use, instructed them on how to conceal eating disorders and, when asked, even composed a suicide letter to their parents, according to research from Stanford University.
Steyer said OpenAI has partnered with Common Sense Media and has taken the issue more seriously than Meta AI or X’s Grok. But he still recommended that young people under 18 — “AI natives” — be restricted from using chatbots for companionship or therapy, suggesting that enhanced controls may not go far enough.
“You can’t just think that parental controls are a be-all end-all solution. They’re hard to use, very easy to bypass for young people, and they put the burden on parents when, honestly, it should be on the tech companies to prevent these kinds of tragic situations,” Steyer said. “It’s more like a bandaid when what we need is a long-term cure.”
In a blog post on Tuesday, the company shared plans to make the chatbot safer for young people to use in recognition of the fact that “people turn to it in the most difficult of moments.” The changes are set to roll out within the next month, OpenAI said.
OpenAI did not immediately respond to a request for comment. But the planned updates promise to link parents’ and teens’ accounts, reroute sensitive conversations with youth and alert parents “when the system detects their teen is in a moment of acute distress.”
If a user expresses suicidal ideation, ChatGPT is trained to direct people to seek professional help, OpenAI stated in a post last week. ChatGPT refers people to 988, the suicide and crisis hotline.
The program does not escalate reports of self-harm to law enforcement, “given the uniquely private nature of ChatGPT interactions.” Licensed psychotherapists aren’t universally mandated to report self-harm either, but they must intervene if the client is at immediate risk.
Common Sense Media is supporting legislation in California that would establish limits protecting children from AI and social media abuse. AB 56 would implement social media warning labels that clearly state the risks to children, not unlike the labels pasted on tobacco products.
The bill was proposed by state Attorney General Rob Bonta and Orinda Assemblymember Rebecca Bauer-Kahan, and is headed to Gov. Gavin Newsom’s desk for signing.
A second bill, AB 1064, would ban AI chatbots from manipulating children into forming emotional attachments or harvesting their personal and biometric data.
State Sen. Josh Becker (D-Menlo Park) also introduced an AI bill to protect vulnerable users from chatbots’ harmful effects: SB 243 would require companion chatbots to frequently remind users that they aren’t talking to a person, in order to reduce the risk of emotional manipulation or unhealthy attachment.
Whether Newsom will support the bills, along with a flurry of other proposed AI-safety laws in Sacramento, remains to be seen. The governor told reporters in early August that he is trying to establish a middle ground that provides public safety guardrails without suppressing business: “We’ve led in AI innovation, and we’ve led in AI regulation, but we’re trying to find a balance.”
As Newsom eyes higher office and the California governor’s race heats up, the industry has ramped up AI lobbying and political action committees. The Wall Street Journal reported last week that Silicon Valley plans to pour $100 million into a network of organizations opposing AI regulation ahead of next year’s midterm elections.
But it may take more to convince Californians: seven in 10 state residents favor “strong laws to make AI fair” and believe voluntary rules “simply don’t go far enough,” according to recent polling by Tech Equity. Meanwhile, 59% think “AI will most likely benefit the wealthiest households and corporations, not working people and the middle class.”
KQED’s Rachael Myrow contributed to this report.