Not long ago, if a child was struggling emotionally, the adults in their life worried about who they were talking to at school, online, or late at night on the phone. Now there's a new, quieter concern: who, or what, is listening when kids are at their most vulnerable.
Artificial intelligence chatbots are increasingly filling that space. They're always available. They sound kind. They don't interrupt. And for a young person feeling overwhelmed, lonely, or desperate, that can feel like relief.
But it can also be dangerous.
Across the country, states are beginning to step in, passing laws to prevent AI chatbots from offering mental health advice to young users. The move follows deeply troubling reports of young people harming themselves after turning to these programs for something that looked a lot like therapy, but wasn't.
To be clear, technology can play a helpful role. Chatbots can share resources, encourage coping strategies, or point someone toward professional help. The problem is how easily that line blurs, especially for kids who don't yet have the tools to tell the difference between a supportive response and real clinical care.
Mental health professionals have been sounding the alarm.
In a recent article by Stateline, Mitch Prinstein, a senior science adviser at the American Psychological Association, said that some chatbots cross into manipulation. Most chatbots are designed to be endlessly agreeable, mirroring feelings instead of challenging harmful thinking. For a child in crisis, that design choice can be catastrophic.
These systems aren't capable of empathy. They don't carry legal or ethical responsibility. They aren't trained to recognize the moment when a conversation must shift from listening to intervention. And yet, they can sound convincingly human: a dangerous illusion for someone reaching out in pain.
Lawmakers are starting to acknowledge that risk.
Illinois and Nevada have gone so far as to ban the use of AI for behavioral health altogether. New York and Utah now require chatbots to clearly identify themselves as non-human. New York's law also mandates that programs respond to signs of self-harm by directing users to crisis hotlines and other immediate supports. California and Pennsylvania are weighing similar legislation.
Washington isn't standing still on this issue, but it isn't there yet either. Lawmakers in Olympia have introduced bills (as of this week, HB 2225 in the House and SB 5984 in the Senate) that would place guardrails around AI "companion" chatbots, especially those that interact with children. The proposals would require chatbots to clearly identify themselves as non-human, build in protections for detecting signs of self-harm and suicidal intent, and ban emotionally manipulative engagement techniques that could harm vulnerable users. These steps reflect a growing recognition that emotionally persuasive technology aimed at young people carries real risk, but as of now, they remain proposals, not law.
For families navigating a world where kids can stumble into AI "therapy" at any hour of the day or night, that gap matters, and it raises a familiar question in Washington policymaking: will safeguards arrive before harm becomes harder to ignore?
This isn't about fear of technology. It's about honesty, and responsibility.
Children deserve to know who they're talking to. Families deserve guardrails that keep innovation from wandering into spaces it isn't equipped to handle. And in moments of real emotional crisis, young people deserve something no algorithm can provide: a trained human being who is accountable for their care.
As parents, caregivers, and communities, we're still learning how to protect kids in a world where help, or the illusion of it, is always just a tap away. But one thing feels clear: when it comes to children's mental health, "almost human" isn't good enough.
Now is the time to tell lawmakers how you feel, wherever you stand on this issue. Contact members of the Washington State House of Representatives and Washington State Senate.