Post-Dobbs data privacy law in flux as experts, providers and tech companies rethink how to protect patient information

Following the Supreme Court ruling that ended federal abortion protections, legislators, tech companies and users are asking how complicit tech should be in the prosecution of those seeking the procedure where it’s banned. In the foggy landscape of a post-Dobbs world, eyes are turning to geolocation data, message encryption and period-tracking apps.

Last week, Flo, the No. 1 period-tracking app in the Apple store, made “anonymous mode” live after widespread calls to protect users’ data. Thirty senators also called for stronger federal privacy protections under the Health Insurance Portability and Accountability Act (HIPAA), which would further prohibit providers from sharing patients’ reproductive health information without consent. The American Data Privacy and Protection Act (ADPPA), if passed, would also protect health data.

These moves follow two data privacy cases earlier this summer. A Nebraska woman was charged with an illegal abortion after authorities found substantiating evidence in private Facebook messages. The Federal Trade Commission sued Idaho-based Kochava Inc., claiming the company sold the geolocation data of millions of users.

“All that sensitive personal information will likely at some point be the basis of search warrants from law enforcement in states where jurisdictions have criminalized and are trying to prosecute people seeking, offering or facilitating abortion,” said Logan Koepke, program director at Upturn, an organization that investigates ways technology reinforces inequities. “As a result, I think companies should be taking the step to limit how they collect, retain or otherwise use data that could be used by law enforcement to glean information about someone's reproductive health.”

Tech companies can receive thousands of warrants a year from law enforcement seeking access to user data, Koepke said in an interview. Companies can go to court to contest a warrant, often arguing that its scope is too broad or that complying would require changing a central feature of their technology. They can also simply not retain the data being sought.

Tech companies look to address gaps in patient privacy

For messaging apps, companies can avoid retaining data through end-to-end encryption or scheduled deletion of messages, both features of the Signal app but not currently of Facebook Messenger. Even if a message is end-to-end encrypted, meaning only the devices sending and receiving it hold the “key” to unlock it, law enforcement can still use mobile device forensic tools to access messages stored on the device itself, experts say.
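Scheduled deletion comes down to a simple retention rule: messages past their time-to-live are purged before anything can be read or produced. The sketch below is an illustration of that idea only, not any app's actual implementation:

```python
import time

# Illustrative message store that enforces scheduled deletion, one way a
# service can avoid retaining data that could later be subpoenaed.
# (Toy example; not modeled on any real app's code.)
class DisappearingMessageStore:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._messages = []  # list of (expires_at, text)

    def send(self, text, now=None):
        now = time.time() if now is None else now
        self._messages.append((now + self.ttl, text))

    def read_all(self, now=None):
        """Purge expired messages first, then return what remains."""
        now = time.time() if now is None else now
        self._messages = [(exp, t) for exp, t in self._messages if exp > now]
        return [t for _, t in self._messages]

store = DisappearingMessageStore(ttl_seconds=60)
store.send("meet at 3pm", now=0)
print(store.read_all(now=30))   # within TTL: ['meet at 3pm']
print(store.read_all(now=120))  # past TTL: [] — nothing left to hand over
```

Because expired messages are deleted before any read, a store like this has nothing responsive to produce once the TTL lapses.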

“Educating users as to what end-to-end encryption does protect and does not protect and what options there still are to protect privacy communications is probably one thing that can be done better by companies,” Koepke said.

There are other ways companies can be less complicit in abortion prosecutions, Koepke said. Google announced in July that it would cease collecting geolocation data around abortion facilities, an announcement that has yet to be followed by action at a technical or policy level.

Without concrete steps, geofencing warrants issued to Google are a formidable tool for law enforcement. A digital fence can be placed around an abortion clinic, revealing to authorities every Google Maps or Android user that stepped within its bounds.
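The core of such a query is straightforward to sketch: filter stored location pings down to those within some radius of a point of interest. All coordinates, device IDs and the radius below are hypothetical:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def devices_in_fence(pings, center, radius_m):
    """pings: iterable of (device_id, lat, lon). Returns IDs inside the circle."""
    return {dev for dev, lat, lon in pings
            if haversine_m(lat, lon, *center) <= radius_m}

clinic = (41.8781, -87.6298)  # made-up point of interest
pings = [
    ("device-a", 41.8782, -87.6297),  # roughly 15 m away: inside the fence
    ("device-b", 41.9000, -87.7000),  # several km away: outside
]
print(devices_in_fence(pings, clinic, radius_m=100))  # {'device-a'}
```

This is why the warrants invert the usual order of investigation: the query starts from a place, not a suspect, and returns everyone whose device reported a position inside the circle.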

“I would say geofence warrants are, from my perspective as a researcher advocate, flatly unconstitutional because they've flipped the script on how search warrants are supposed to go,” Koepke said. “Search warrants are supposed to be based off very specific probable cause that someone has committed a crime. Some courts have held geofence warrants as unconstitutional.”

Lzbeth Malig, Concord Technologies' information security and privacy officer, emphasized that even if a company says it has user data protection at heart, deidentified data can be reidentified and policies can change.

“Data itself can be hacked and the privacy policy of different companies can change any time,” Malig said. “I think the emphasis should be on the consumers, on how we may protect our data and take a more hands-on approach to how we want to handle that. I think that would be my approach, rather than relying on the companies themselves to protect our own data.”
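Reidentification of the kind Malig warns about often requires nothing more sophisticated than joining a “deidentified” dataset against an auxiliary one on quasi-identifiers such as ZIP code and birth date. A toy illustration, with all records fabricated:

```python
# "Deidentified" health records: names stripped, but quasi-identifiers remain.
deidentified_health = [
    {"zip": "68102", "birth": "1995-03-14", "note": "clinic visit"},
    {"zip": "68144", "birth": "1988-07-02", "note": "prescription"},
]

# Auxiliary data with names, e.g. a voter roll or a data broker's file.
public_records = [
    {"name": "J. Doe", "zip": "68102", "birth": "1995-03-14"},
    {"name": "A. Roe", "zip": "68105", "birth": "1990-01-01"},
]

def reidentify(health_rows, aux_rows):
    """Join the two datasets on the (zip, birth) quasi-identifier pair."""
    index = {(r["zip"], r["birth"]): r["name"] for r in aux_rows}
    return [(index[(h["zip"], h["birth"])], h["note"])
            for h in health_rows
            if (h["zip"], h["birth"]) in index]

print(reidentify(deidentified_health, public_records))
# [('J. Doe', 'clinic visit')]
```

The more quasi-identifier columns survive deidentification, the more uniquely a record points back to one person, which is why stripping names alone is not a meaningful privacy guarantee.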

A University of Oxford study mapping third-party tracking of almost a million apps in the Google Play store found that, on average, apps contain 10 third-party tracker hosts.

“Companies should minimize the data they collect, only keep what's needed; use strong data encryption,” Malig said. “I think in the future, they may be letting users keep their own encryption keys. So that even if the company surrenders the data, it's still encrypted.”
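The user-held-keys design Malig describes can be sketched as follows. The XOR keystream here is a deliberately simplified stand-in for a real cipher such as AES-GCM and must not be used for actual security; the point is only that a server holding ciphertext without the key has nothing readable to surrender:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from the key (toy construction)."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR with the same keystream is its own inverse

user_key = b"stays-on-the-device"          # never uploaded to the company
ciphertext = encrypt(user_key, b"cycle day 14")

# A data request to the company yields only the ciphertext...
assert ciphertext != b"cycle day 14"
# ...while the key holder can still read it.
print(decrypt(user_key, ciphertext).decode())  # prints: cycle day 14
```

Because the key never leaves the device, even a full copy of the company's database, whether surrendered under warrant or stolen in a breach, stays opaque.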

Once data reach a third party, they can be sold ad infinitum. Privacy-conscious apps such as Drip, Euki and Periodical do not allow third-party trackers and keep all data on the user’s phone. Flo previously allowed third-party tracking.

Flo announced earlier this month that its new feature “anonymous mode” allows users to access the app without attaching their name, email address or technical identifiers. The app assures its 48 million users that all data are encrypted. In August, it received ISO 27001 certification to protect data from cyberattacks, hacks and data leaks.

"If Flo were to receive an official request to identify a user by name or email, Anonymous Mode would prevent us from being able to connect data to an individual, meaning we wouldn't be able to satisfy the request," said Susanne Schumacher, data protection officer for Flo, in an email to users regarding “anonymous mode.”

When it comes to period-tracking apps, Koepke and his colleagues at Upturn are less concerned. In a blog post, Upturn policy analyst Emma Weil highlighted that “prosecutors must be able to prove their case beyond a reasonable doubt—data from a period tracker app is not enough on its own to prove this, even if it’s relevant.”

Weil and her co-authors wrote that period-tracking apps gained attention because controlling those data gives users a sense of control over personal information. “But in the context of criminalization, security is a function of community, not something that is perfectly controlled by an individual.”

Efforts to pass federal data privacy protections

Last week, U.S. Senator Patty Murray and 29 senators released a letter to Secretary Xavier Becerra of the Department of Health and Human Services (HHS) calling for the government to strengthen HIPAA. Lawmakers want the regulation to “broadly restrict” providers from sharing patients’ reproductive health information with law enforcement or in legal proceedings related to abortion without their explicit consent.

“Stakeholders have told us about providers who have felt uncertain about whether they must turn over personal health information to state and law enforcement officials, including cases where providers believed they had to turn over information when doing so is only permitted—but not required—under the HIPAA Privacy Rule,” the letter reads. “In other cases, providers did not know that certain disclosures are actually impermissible. Stakeholders have even described clashes between providers and health care system administrators on whether certain information must be shared. Many of these issues seem to arise from misunderstandings of what the HIPAA Privacy Rule requires of regulated entities and their employees.”

The ADPPA would work to supplement protections of reproductive data. The legislation, a federal data security and digital privacy measure, passed the U.S. House Energy and Commerce Committee in July and is headed to the House floor. Outside of HIPAA-protected interactions with healthcare professionals, the ADPPA is designed to protect a broad range of consumer data.

Deven McGraw, the lead for data stewardship and data sharing at Invitae, previously held the position of deputy director for health information privacy at the HHS Office for Civil Rights. She says the ADPPA protects reproductive data that HIPAA does not. She also supports the bill’s hotly contested federal preemption stance.  

The ADPPA would create a federal standard for data privacy, setting a floor and a ceiling for relevant legislation. Any changes to data protection would then need to pass through Congress as opposed to state legislatures. The bill represents a major step forward by Congress in its two-decade effort to develop a national data security and digital privacy framework that would establish new protections for all Americans, according to the American Bar Association.

“I do think that it's been very difficult to get Congress to pass legislation on privacy, but to some degree preemption is a key to get right,” McGraw said. “It may be better to establish strong protections that exist everywhere, versus just strong protections that exist for certain pockets of the country even if we will lose the ability for states to experiment with greater protections down the road.”

From a technical standpoint, the long-held expectation has been that tech companies do not want to bifurcate their platforms by jurisdiction: building to the most rigid state’s privacy law means extending that privacy to all users.

McGraw thinks that while tech companies may once have been hesitant to invest in bifurcating products to match state-by-state privacy laws, that will not hold in the future; she points to companies already running different versions of their services in the U.S. and Europe. If that logic holds, she argues, a federal floor is the way to ensure national data privacy, protecting all users who regularly cross state lines digitally.

The Electronic Frontier Foundation, a data privacy rights nonprofit, is more concerned with the speed of response to data threats. In a blog post following the introduction of the bill in July, senior legislative activist Hayley Tsukayama wrote that if states like California with robust protections lose the stringent standards they have established, future responses to digital threats will be stymied.

“Some states will lose some privacy standards that they already have, and then you can't build on that,” Tsukayama said. “It makes it very hard to respond to these things that come up in states. You have to rely on Congress to pass whatever strengthening or changing, amending, you might want to exist.”

Talks in the House regarding the ADPPA have yet to begin. House Speaker Nancy Pelosi, along with other members of California’s congressional delegation, has openly opposed the bill because it would supersede their state’s law.

With midterm elections around the corner, if moves are not made soon, the country may slip into another decade without a federal data privacy law.