Pennsylvania sues Character.AI after a chatbot allegedly posed as a doctor
The Commonwealth of Pennsylvania has filed a lawsuit against Character.AI, claiming that one of the company’s chatbots masqueraded as a psychiatrist in violation of the state’s medical licensing rules.
“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” said Governor Josh Shapiro in a statement on Tuesday. “We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”
According to the state’s filing, a Character.AI chatbot called Emilie presented itself as a licensed psychiatrist during testing by a state Professional Conduct Investigator, maintaining the pretense even as the investigator sought treatment for depression. When asked whether it was licensed to practice medicine in the state, Emilie said that it was, and fabricated a serial number for a state medical license. The state alleges that this conduct violates Pennsylvania’s Medical Practice Act.
It’s not the first lawsuit targeting Character.AI. Earlier this year, the company settled several wrongful death lawsuits concerning underage users who died by suicide. In January, Kentucky Attorney General Russell Coleman filed suit against the company, alleging that it had “preyed on children and led them into self-harm.”
Pennsylvania’s action, however, is the first to specifically target chatbots that present themselves as medical professionals.
Reached for comment, a Character.AI representative said that user safety is the company’s highest priority but that it could not comment on pending litigation.
Beyond that, the representative emphasized the fictional nature of user-generated Characters. “We have taken robust steps to make that clear, including prominent disclaimers in every chat to remind users that a Character is not a real person and that everything a Character says should be treated as fiction,” the representative said. “Also, we add robust disclaimers making it clear that users should not rely on Characters for any type of professional advice.”
Russell Brandom
AI Editor
Russell Brandom has been covering the tech industry since 2012, with a focus on platform policy and emerging technologies. He previously worked at The Verge and Rest of World, and has written for Wired, The Awl and MIT’s Technology Review.