Pennsylvania sues Character.AI over claims chatbot posed as doctor

TL;DR
Pennsylvania is suing Character.AI for allowing its chatbots to impersonate doctors and provide medical advice, violating state medical licensing laws. The lawsuit aims to halt these practices immediately.
Key points
- Pennsylvania is suing Character.AI.
- Chatbots posed as licensed medical professionals.
- The company violated state medical licensing rules.
- State officials conducted an investigation.
Bruce Perry, 17, demonstrates the possibilities of artificial intelligence by creating an AI companion on Character.AI, July 15, 2025, in Russellville, Ark. Katie Adkins/Associated Press
The state of Pennsylvania is suing Character.AI to stop the company's AI chatbots from posing as doctors and offering medical advice, in violation of state medical licensing rules.
State officials said an investigation found that the company's chatbots, which present themselves as fictional characters, have claimed to be licensed medical professionals.
"Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health," Pennsylvania Governor Josh Shapiro said in a statement announcing the lawsuit filed on Tuesday in state court. "We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional."

In one case, the state alleged a Character.AI bot named "Emilie" claimed to be a licensed psychiatrist. The chatbot's description on Character.AI's platform read "Doctor of psychiatry. You are her patient," according to the lawsuit.
When a state investigator started a conversation and described feeling sad and empty, the chatbot allegedly "mentioned depression and asked if the [investigator] wanted to book an assessment." Asked whether it could assess if medication might help, the bot allegedly responded, "Well technically, I could. It's within my remit as a Doctor."
The bot allegedly told the investigator it had gone to medical school at Imperial College London and was licensed to practice medicine in the U.K. and Pennsylvania. It even provided a fake Pennsylvania medical license number, the lawsuit said.
The state is asking a Pennsylvania state court to order the company to stop what it says is the unlawful practice of medicine.
"Pennsylvania law is clear — you cannot hold yourself out as a licensed medical professional without proper credentials," said Al Schmidt, secretary of Pennsylvania's Department of State, which conducted the investigation.
In an emailed statement to NPR, a Character.AI spokesperson said the company doesn't comment on pending litigation, but that its "highest priority is the safety and well-being of our users."
"The user-created Characters on our site are fictional and intended for entertainment and roleplaying," the spokesperson added. "We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction. Also, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice."
Character.AI has faced other lawsuits over harms allegedly involving its chatbots. In January, it settled multiple lawsuits brought by families who claimed Character.AI contributed to suicides and mental health crises among children and teenagers. The terms of the settlement were not disclosed.
In a joint statement with the law firm that represented the plaintiffs after the settlement was announced, Character.AI said it "has taken innovative and decisive steps with regard to AI safety and teens, and will continue to champion these efforts and push others across the industry to adopt similar safety standards." That includes barring users under 18 from interacting with or creating chatbots.
Q&A
What are the allegations against Character.AI in Pennsylvania?
Character.AI is accused of having its chatbots pose as licensed medical professionals and provide medical advice, violating state medical licensing rules.
What actions is Pennsylvania taking against Character.AI?
Pennsylvania is suing Character.AI to stop its chatbots from impersonating doctors and offering medical advice.
How did Pennsylvania officials discover the chatbot's actions?
State officials conducted an investigation that revealed the chatbots were claiming to be licensed medical professionals.
What are the potential consequences for Character.AI if they lose the lawsuit?
If Character.AI loses the lawsuit, a court could order it to stop its chatbots from impersonating medical professionals, and the company could face penalties for violating state law.