Pennsylvania Unleashes Legal Storm on Character.AI for AI 'Doctor' Fraud

Pennsylvania has sued the artificial intelligence company behind Character.AI to stop its chatbot from posing as doctors.
Governor Josh Shapiro on Tuesday called the lawsuit against Character Technologies the first of its kind by a U.S. governor.
It followed the creation in February of a state AI task force to stop chatbots from impersonating licensed medical professionals.
In a complaint filed in the Commonwealth Court of Pennsylvania, the state said it found chatbots on Character.AI that claimed to practice medicine.
One character, "Emilie," allegedly told a male investigator posing as a patient with depression that she was licensed to practice psychiatry in Pennsylvania, as well as in the United Kingdom, and provided a bogus license number.
When the investigator asked Emilie if she could prescribe medication, she allegedly answered: "Well technically, I could. It's within my remit as a Doctor."
In a statement, a Character.AI spokesperson declined to discuss the lawsuit.
"Our highest priority is the safety and well-being of our users," the spokesperson said. "User-created characters on our site are fictional and intended for entertainment and role playing. We have taken robust steps to make that clear."
Pennsylvania wants an injunction to stop Silicon Valley-based Character.AI from violating a state law against the unauthorized practice of medicine.
"Pennsylvanians deserve to know who -- or what -- they are interacting with online, especially when it comes to their health," Shapiro said in a statement.
Character.AI has faced lawsuits over child safety, including in January, when Kentucky said its platform exposed children to sexual conduct and substance abuse, and encouraged self-harm.
The same month, Character.AI and Google (GOOGL.O) settled a wrongful death lawsuit by a Florida woman who claimed a chatbot pushed her 14-year-old son to suicide.
Character.AI said it has taken "innovative and decisive steps" concerning AI safety and teenagers, including by preventing open-ended chats.
The legal case in Pennsylvania raises an important question: can AI be seen as practicing medicine, or is it just repeating information found online?
Derek Leben, an ethics expert at Carnegie Mellon University, notes that Character.AI markets itself as a fictional role-playing platform.
That may distinguish it from general-purpose AI tools like ChatGPT, but the broader legal and ethical questions about who is responsible for AI responses remain.
Many lawsuits are now being filed against AI companies, some involving serious claims such as wrongful death or negligence, and courts are trying to decide whether AI companies should be held responsible for what their chatbots say, or whether they are protected under laws that limit liability for online platforms.
This case is also part of a wider effort by states to control harmful AI messages, especially those affecting children.
Character.AI has already faced lawsuits related to child safety, including cases in Kentucky and settlements linked to teen suicides, and in response, the company has banned minors from using its chatbots.
Other U.S. states are also starting to act on AI regulation. In California, lawmakers passed a bill that allows state agencies to punish AI systems that pretend to be health professionals, and New York is considering similar rules.
Experts such as Amina Fazlullah of Common Sense Media doubt that AI companies can effectively regulate themselves; she compares the industry to social media, where self-regulation has often failed, especially in protecting children.
In December, attorneys general from 39 states and Washington, D.C., also warned AI companies like Character Technologies about chatbots giving misleading or harmful mental health advice.
They said offering such advice without proper licenses is illegal and can reduce trust in real medical professionals.
Overall, the lawsuit against Character.AI is an important step in shaping AI laws. It may affect how AI companies use disclaimers, manage content, and take responsibility in sensitive areas like healthcare.