Artificial intelligence has arrived in mental health care. Large health systems and independent therapists alike are adopting AI tools to manage the delivery of treatment, but the speed of adoption, along with disturbing incidents involving general-use chatbots, is raising concerns among practitioners and researchers.
“There is a lot of fear and anxiety about AI,” says psychologist Vaile Wright, senior director of health care innovation at the American Psychological Association (APA). “And in particular fear around AI replacing jobs.” Those concerns helped spark a 24-hour strike last month by 2,400 mental health providers for Kaiser Permanente in Northern California and the Central Valley.
One striking therapist, Ilana Marcucci‑Morris, had worked as a triage clinician at Kaiser Permanente’s telepsychiatry intake hub since 2019. In May 2025 she was reassigned after the health system revamped its triage process. “What used to always be a 10- to 15-minute screening from a licensed clinician like myself is now being conducted by unlicensed lay operators following a script,” she says, “or an E-visit.” Colleagues worry that this downsizing opens the door to AI replacing clinical roles.
At Kaiser’s Walnut Creek site, the triage team fell from nine providers to three, says marriage and family therapist Harimandir Khalsa. “The jobs that we did [are] being handled by these telephone service representatives.” The strike protested this erosion of licensed triage, among other issues. Kaiser Permanente Northern California said in a statement that “our use of AI does not replace clinical expertise.” The system confirmed it is evaluating tools from the U.K. company Limbic but said they are not in use at this time.
So far, the APA’s Wright hasn’t seen widespread job replacement. Instead, AI adoption in mental health has largely focused on administrative tasks — documentation, billing, updating electronic health records — work that takes clinicians away from patient care. “Most providers want to help people and when they get mired down with excessive paperwork… that takes away time from direct patient care,” Wright says. AI can improve efficiency and free clinicians for therapy, she adds, though comfort and vetting of tools vary.
A growing market of businesses supplies these tools. Blueprint offers an AI assistant that summarizes sessions, updates records, and tracks patient progress for individual therapists. Limbic builds AI assistants for large health systems; founder and CEO Ross Harper says the company is deployed across much of the U.K. National Health Service and in 13 U.S. states. One Limbic chatbot, Limbic Care, is trained in cognitive behavioral therapy skills and can offer evidence-based tools to patients immediately through a portal. “If it’s 3 a.m., you can connect and begin working on the challenges you’re experiencing right there and then,” Harper says.
Clinical use of AI is not yet widespread, says psychiatrist Dr. John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center. Tools are promising but “they’re not well tested,” and they can be expensive to run and maintain, requiring IT infrastructure and safety measures most small practices lack. With little regulation, Wright says providers must research whether available tools are safe and effective before adopting them.
Torous predicts continued growth in AI use as technology improves, but emphasizes clinician training and involvement. “AI is going to transform the future of mental health care for the better,” he says, but clinicians must learn to use it so they can evaluate products and ensure safety. Striking Kaiser workers want to be included in development and rollout decisions. “If AI is utilized, don’t keep us clinicians out of the human process of engaging with our patients in determining the right level of care,” Khalsa says.
What’s likely, experts say, is a hybrid model: human providers delivering therapy while AI assistants help with homework, skills practice, documentation, and real‑time feedback. Wright of the APA foresees an ongoing role for in-person therapists: “There are no AI digital solutions that can replace human-driven psychotherapy or care.”