A health-systems-backed tech company has reached a deal with Texas’ attorney general to settle allegations that a generative AI product used by “at least four major Texas hospitals” was less accurate than claimed.
Dallas-based Pieces Technologies’ tech drafts AI-generated clinical summaries and documentation within the electronic health record for multidisciplinary care teams. The company recently announced that it had generated more than 5.4 million inpatient clinical summaries across its health system clients.
In marketing its products to clients, Pieces Technologies advertised several metrics and benchmarks suggesting its generative AI products were highly accurate, Attorney General Ken Paxton’s office wrote in the assurance of voluntary compliance unveiled Wednesday. Key among these were claims that the products had a critical hallucination rate of less than 0.001% and a severe hallucination rate of less than 1 per 100,000.
However, the Texas attorney general's office alleged that its investigation “found that these metrics were likely inaccurate and may have deceived hospitals about the accuracy and safety of the company’s products.”
Its announcement also noted that the hospitals relying on the AI tools have been providing their patients’ data to the company in real time to generate the summaries and that any deceptive claims regarding accuracy are “putting the public interest at risk.”
Per the agreement shared by Paxton’s office, Pieces Technologies denies any wrongdoing and contends that it has neither violated Texas’ consumer protection laws nor inaccurately represented its hallucination rate.
The agreement is not a financial settlement and does not impose any financial penalties related to the company’s alleged conduct. However, it does require Pieces Technologies to more clearly disclose how it describes its tools’ performance and to provide more explicit disclosures to its customers and users about how the products should be used and how they could potentially lead to harm.
“AI companies offering products used in high-risk settings owe it to the public and to their clients to be transparent about their risks, limitations, and appropriate use. Anything short of that is irresponsible and unnecessarily puts Texans’ safety at risk,” Paxton said in a release announcing the settlement. “Hospitals and other healthcare entities must consider whether AI products are appropriate and train their employees accordingly.”
In a statement, Pieces Technologies said the attorney general office’s press release "misrepresents" and is "wholly inconsistent with" the parties' assurance of voluntary compliance. The company reasserted that the safety and accuracy of its products were correctly communicated and said it has been focused on developing a risk classification system for generative AI hallucinations that the broader industry so far lacks.
"Pieces strongly supports the need for additional oversight and regulation of clinical generative AI, and the company signed this [assurance of voluntary compliance] as an opportunity to advance those conversations in good faith with the Texas [Office of Attorney General]," it said in a statement.
"Despite the disappointing and damaging misrepresentation of this agreement in the Texas [Office of Attorney General]’s press release, Pieces will continue to work collaboratively at both state and national levels with organizations that share a common commitment to advancing the delivery of high quality and safe health care across communities."
Word of the company’s deal with Paxton’s office comes about a week after it announced a $25 million growth financing round. Of note, health systems Children’s Health and OSF HealthCare were prominent participants in the raise.
"We are honored to receive support from major health systems and leading healthcare investors at such an exciting inflection point,” CEO Ruben Amarasingham, M.D., said at the time.
A representative for Pieces Technologies told Fierce Healthcare that it "has been and will continue to be transparent with our stakeholders, including on the matter of this [assurance of voluntary compliance]."