As the digital health industry continues to grow, two federal agencies have developed a complementary working relationship to ensure that data collected by apps and mobile devices remains private and secure.
The Department of Health and Human Services’ Office for Civil Rights (OCR) is the primary regulator when it comes to protecting health data and enforcing HIPAA provisions. But given the constraints of a 20-year-old law within a rapidly evolving industry, the Federal Trade Commission (FTC) has stepped in with its own enforcement role.
Both regulatory bodies have distinct oversight responsibilities. OCR stays within the limited confines of HIPAA, which regulates covered entities handling protected health information. The FTC, meanwhile, takes its own nuanced approach, focusing on mobile health applications that don't live up to their privacy promises.
Representatives from OCR and the FTC provided insights into each agency’s enforcement strategy during a session at Health Datapalooza on Friday, responding to several mock scenarios that digital health startups might face. Their responses offered a perspective on the common concerns and issues that regulators encounter in the digital health industry.
OCR’s oversight is fairly constrained. Digital apps that are totally consumer facing aren’t considered covered entities under HIPAA. It’s only when those apps enter into agreements with providers or insurance companies that they fall under OCR’s enforcement purview, according to Deven McGraw, deputy director of health information privacy at OCR.
But even those partnerships are rife with ambiguity. A manufacturer that sells 100,000 wearable devices to a health plan is likely considered a business associate. Under HIPAA, that designation requires an agreement establishing parameters for using member data. But a provider that simply recommends a device or app may fall outside OCR's limited reach.
During a speech at Datapalooza on Thursday, OCR Director Roger Severino expressed the need to balance privacy with innovation, adding that he wants the agency to provide “assistance on new and emergent technology so it can actually work within the rules to share information and be interoperable, but still protect safety and privacy.”
The FTC takes a slightly different enforcement approach by focusing on situations in which companies make deceptive statements or actions related to information privacy. Section 5(a) of the Federal Trade Commission Act prohibits business practices that mislead consumers—like a company that claims to encrypt user data but leaves some information unencrypted.
Although privacy policies are generally the first place regulators will look, the FTC's enforcement goes beyond that to cover deceptive practices within the user interface, privacy settings or potentially deceptive statements, said Cora Han, a senior attorney in the FTC's division of privacy and identity protection. That includes notifying patients of any changes to the way their data is used if a digital startup is acquired or pivots its business line.
“We have had an enforcement case involving a company that changed course and did something different, and that was problematic,” she said. She recommended that companies issue clear, concise notification of any changes to avoid confusion.
If a digital health company partners with a health plan or a provider, its business associate agreement will dictate how it can use that data, McGraw said. But she added that companies that shirk transparency often face more backlash from consumers than from regulators.
“I always feel like if you’ve got something you’re worried customers will find out about and you frame it in a way they won’t know, … you’re already headed down a bad path,” she said.