Since the fall of Roe v. Wade, concerns about digital tracking have grown as consumers abandon their usual femtech apps in search of better privacy. Prosecutors can weaponize health app data, alongside other digital trails, in cases related to abortion.
In Nebraska, in a case involving a teenager allegedly mishandling fetal remains, additional abortion-related charges were brought after the Dobbs ruling once police obtained the girl’s Facebook messages. According to one estimate, police and prosecutors carried out at least 50,000 extractions of digital data between 2015 and 2019 for various crimes.
Following the Supreme Court decision, some femtech companies have moved to bolster their security. But recent studies, such as those from Mozilla and Consumer Reports, reveal that not all privacy claims hold true. Despite the popular consensus that personal health information should be protected, U.S. law does little to guarantee it. Until regulations catch up to the latest push for greater privacy, experts say, companies may continue to exploit health data for profit.
At the same time, some are innovating toward a better end. Data encryption company Virtru debuted a prototype of an encrypted period-tracking app at the annual hacker convention DEF CON in August, where reproductive health was a topic of focus. The prototype, SecureCycle, was built on OpenTDF, an open-source Virtru project that gives software developers a framework for building end-to-end encrypted apps. Its stated goal was to showcase the technology and prove that a user can own their data.
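To make the end-to-end idea concrete, here is a minimal sketch of client-side encryption: a cycle entry is encrypted on the user's device before it is stored or synced, so only the holder of the key, the user, can read it. The sketch uses the standard Web Crypto API purely for illustration; it is not the OpenTDF API, and the data shape and function names are hypothetical.

```typescript
// Minimal illustration of on-device (end-to-end) encryption for a cycle entry.
// Uses the standard Web Crypto API; this is NOT the OpenTDF API, and the
// CycleEntry shape and function names are hypothetical.

interface CycleEntry {
  date: string;                                 // ISO date, e.g. "2023-08-14"
  flow: "none" | "light" | "medium" | "heavy";
  notes?: string;
}

async function encryptEntry(entry: CycleEntry, key: CryptoKey) {
  const iv = crypto.getRandomValues(new Uint8Array(12)); // fresh nonce per entry
  const plaintext = new TextEncoder().encode(JSON.stringify(entry));
  const ciphertext = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, plaintext);
  // Only the ciphertext and IV ever leave the device; the key stays with the user.
  return { iv, ciphertext: new Uint8Array(ciphertext) };
}

async function demo() {
  // The key is generated and held locally and is never uploaded.
  const key = await crypto.subtle.generateKey(
    { name: "AES-GCM", length: 256 },
    false,                 // not extractable
    ["encrypt", "decrypt"],
  );
  const sealed = await encryptEntry({ date: "2023-08-14", flow: "light" }, key);
  console.log(`Stored ${sealed.ciphertext.byteLength} bytes of ciphertext`);
}

demo();
```

A framework like OpenTDF goes further than this sketch by binding access policies to the encrypted object itself, which is how the user, rather than the service, keeps control over who can decrypt the data.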
“A lot of the market doesn’t know what’s possible,” Rob McDonald, Virtru's senior vice president of platform, told Fierce Healthcare. “Consumers are demanding a different standard, which will shape some of these incumbents.”
The current evaluative framework for health apps, the Digital Standard, is important but lacking, McDonald argues. Now, consumers are demanding a standard that goes beyond it. While this expectation may feel overwhelming to companies, “what the market demands is progress,” McDonald noted. “The key is to start, get moving.”
Through SecureCycle, users own and control access to their personal data. Any attempt by a third party to access that data triggers a notification, giving the app user the option to release it. Developers interested in leveraging the technology powering SecureCycle for other health apps can find a blueprint in the OpenTDF GitHub repository.
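The consent gate that behavior implies can be sketched as a default-deny flow: a third-party request triggers a notification, and nothing is released unless the user explicitly approves. The names below (AccessRequest, notifyUser, and so on) are illustrative and are not taken from SecureCycle or OpenTDF.

```typescript
// Hypothetical sketch of a default-deny consent gate: a third-party request
// notifies the user, and data is released only on explicit approval.
// Names are illustrative, not taken from SecureCycle or OpenTDF.

interface AccessRequest {
  requester: string;   // e.g. "research-partner.example"
  fields: string[];    // the specific data being requested
}

type Decision = "granted" | "denied";

// Stand-in for an in-app prompt asking the user to decide on the request.
async function notifyUser(req: AccessRequest): Promise<Decision> {
  console.log(`${req.requester} is requesting: ${req.fields.join(", ")}`);
  return "denied"; // default-deny unless the user opts in
}

async function handleAccessRequest(req: AccessRequest): Promise<Decision> {
  const decision = await notifyUser(req);
  if (decision !== "granted") {
    console.log("Request denied; no data released.");
    return "denied";
  }
  console.log("User approved; releasing only the requested fields.");
  return "granted";
}

handleAccessRequest({ requester: "research-partner.example", fields: ["cycle length"] });
```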
As consumers abandon apps over privacy concerns, the pool of diverse data necessarily shrinks, argues Bethany Corbin, senior counsel at Nixon Gwilt Law, who focuses on femtech and privacy.
“If we’re not training our algorithms up on diverse and inclusive data, we’re not going to get predictions that are going to be reflective of the entire population that these apps are trying to serve,” she told Fierce Healthcare. The broader implication? Less inclusive women’s health research.
The same concerns plaguing digital apps extend to cookies and similar tracking technologies online. In response, the Office for Civil Rights has reportedly ramped up its enforcement of HIPAA and, last month, released educational information on recognized security practices under the HITECH Act. Entities providing care should be prepared for heavier scrutiny of how they use or disclose health data.
The problem with apps promising encryption is that there is no ready way for consumers to verify those claims, according to Leah Fowler, research director and an assistant professor at the University of Houston Law Center's Health Law & Policy Institute.
“Obviously, people are looking for apps that are marketing their privacy promises,” Fowler said. “Your average person on the street is not going to be able to do that sort of in-depth analysis.”
Fowler was part of a 2020 study that examined menstrual tracking apps’ privacy policies. If the study were repeated today, she said, she would want to interrogate the apps’ claims more closely. Privacy violations happen, even when unintentional, and consumers need to be aware of them.
The private sector has an important role in shaping protections, Fowler believes. Some apps, like Euki and Drip, are differentiating themselves from competitors with innovative privacy settings such as local data storage, she said. Fowler hopes the scrutiny consumers now apply to health data privacy reflects “the types of cost-benefit analyses that will apply to other technology that they use in their lives.”
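Local data storage, in this context, simply means cycle entries are written to the device and never posted to a server. The sketch below illustrates that pattern in a browser setting; it is a hypothetical example, not how Euki or Drip actually implement it.

```typescript
// Hypothetical sketch of local-only storage: entries stay on the device and
// no network call is made. This is not Euki's or Drip's actual implementation.

const STORE_KEY = "cycle-entries";

interface LocalEntry {
  date: string;
  flow: string;
}

function saveEntryLocally(entry: LocalEntry): void {
  const existing: LocalEntry[] = JSON.parse(localStorage.getItem(STORE_KEY) ?? "[]");
  existing.push(entry);
  // The data never leaves the device in this path.
  localStorage.setItem(STORE_KEY, JSON.stringify(existing));
}

function loadEntriesLocally(): LocalEntry[] {
  return JSON.parse(localStorage.getItem(STORE_KEY) ?? "[]");
}

saveEntryLocally({ date: "2023-08-14", flow: "light" });
console.log(loadEntriesLocally());
```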
The bigger threat to consumers likely comes from social media, direct messaging platforms and internet search history. “Data disclosure to law enforcement isn’t just a risk with femtech apps,” Corbin cautioned.
Some investors have pulled back from the women’s health space because of the rapidly changing regulatory landscape and the uncertainty it creates, Corbin said. But those focused on femtech have doubled down on the importance of continued funding for the space. In the meantime, company founders should examine their privacy and security policies.
“That’s ultimately what the survival of femtech is going to come down to—it’s going to be whether or not consumers trust these applications to input their sensitive health data,” Corbin said.