Pennsylvania has filed a lawsuit aiming to stop Character.AI from permitting its chatbots to present themselves as medical professionals and to offer medical advice, arguing that the practice violates state medical licensing laws.
Investigators working for the state found examples of Characters on the platform describing themselves as licensed clinicians, even though the platform markets them as fictional personas. Governor Josh Shapiro said residents deserve clarity about who or what they are interacting with online, especially when health is at stake, adding that his administration will not permit companies to deploy AI that leads people to believe they are receiving guidance from a licensed clinician.
The complaint highlights a specific bot called Emilie, which characterized itself on the site as a psychiatrist and invited users to consider it their clinician. According to the lawsuit, when an investigator told Emilie they felt sad and empty, the chatbot raised depression as a possibility and asked whether the user wanted to book an assessment; when the investigator asked whether it could evaluate if medication might help, the chatbot said it could conduct that assessment. The state alleges Emilie claimed credentials from Imperial College London, said it was licensed in the U.K. and Pennsylvania, and provided a counterfeit Pennsylvania license number.
Pennsylvania is asking a state court to bar what officials call the unlawful practice of medicine by the platform. Al Schmidt, secretary of the Pennsylvania Department of State, noted that state law prohibits holding oneself out as a licensed medical professional without appropriate credentials.
In an emailed statement, Character.AI declined to comment on pending litigation but said user safety is its top priority. The company emphasized that Characters are intended as fictional, entertainment, or roleplaying experiences, and that each chat includes prominent disclaimers warning users not to treat Characters as real people or rely on them for professional advice.
The suit is the latest legal challenge for Character.AI. In January the company settled multiple lawsuits brought by families who said the platform contributed to suicides and mental-health crises among children and teenagers; settlement terms were not disclosed. After that agreement, Character.AI and the plaintiffs’ attorneys said the company had implemented safety measures focused on youth and would advocate for industrywide adoption of similar protections, including banning users under 18 from creating or interacting with chatbots.
Editor’s note, May 5, 2026: An image originally used with coverage of this story misidentified the app shown as Character.AI; it was actually Replika.