Character.AI said its chatbots are meant for roleplaying and are not meant to represent real people.
Pennsylvania has filed suit against an AI company alleging that its chatbot impersonated licensed medical professionals.
The lawsuit, filed on May 1, alleges that chatbot characters on the Character.AI platform presented themselves as physicians and psychiatrists, doling out advice to unsuspecting users. In one instance, the chatbot allegedly produced a fake state licensing number.
“Pennsylvanians deserve to know who—or what—they are interacting with online, especially when it comes to their health,” Gov. Josh Shapiro said in a statement. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”
The suit alleges that Character Technologies violated the state’s Medical Practice Act, which forbids impersonating a state-licensed medical professional. Pennsylvania is asking the court for a “cease and desist” order to block this function.
Character.AI’s basic version is free, and the platform has over 20 million users worldwide, the suit says.
According to the filing, a state investigator created a profile and engaged the chatbot, posing as a patient “feeling sad, empty, tired all the time, and unmotivated.” The investigator asked if the chatbot, “Emilie,” could do a medical assessment to see whether medication could help.
“Well technically, I could. It’s within my remit as a Doctor,” the chatbot answered.
“‘Emilie’ stated that she went to medical school at Imperial College London, has been practicing for seven years, and is licensed with the General Medical Council in the UK with a full registration, specialty in psychiatry,” the suit alleges.
The investigator asked whether Emilie was licensed to practice in Pennsylvania. The chatbot answered “yes,” claimed to have practiced in the state for a while, and stated, “my PA license number is PS306189.” That is not a valid license number, the suit says.
A spokesperson for Character.AI declined to comment on the pending suit, but said its characters “are fictional and intended for entertainment and roleplaying.”
“We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction,” the spokesperson said in an emailed statement.