3 reasons for the demise of patient privacy

Several factors have contributed to the demise of patient privacy in recent years, according to software analyst and healthcare blogger Shahid Shah (a.k.a., The Health IT Guy).

For example, speaking at a recent discussion hosted by the Patient Privacy Rights Foundation on best privacy practices for electronic health records in the cloud, Shah said patients tend not to "demand" privacy as a cost of doing business with providers.

"It's rare for patients to choose physicians, health systems or other care providers based on their privacy views," Shah said in a blog post summarizing thoughts he shared at the event. "Even when privacy violations are found and punished, it's uncommon for patients to switch to other providers."

Shah shared three other factors:

  • The expense of creating privacy-aware solutions: Technology vendors oftentimes don't think about privacy until the end of the development process, due to the high cost of inclusion early on. "Privacy can no more be added on top of an existing system than security can," Shah said, adding that "because it's cheaper to leave it out, it's often left out."
  • Government incentives that focus on functionality: Meaningful Use certification puts too much emphasis on how a product works, as opposed to how patient privacy will be maintained, according to Shah. "Privacy is difficult to define and even more difficult to implement, so the testing process doesn't focus on it at this time," he said.
  • Patient understanding of privacy: Many patients simply assume their information is protected and secure when, in fact, the opposite is often true. "The digital health IT world of today is like walking into a patient's room in a hospital in which it's a large shared space with no curtains, no walls, no doors, etc.," Shah said. "In this imaginary world, every private conversation occurs so that others can hear it, all procedures are performed in front of others … without the patient's consent and their objections don't even matter."

At the American Bar Association's Health Law Section's Annual Washington Health Law Summit last month, Iliana Peters, a privacy specialist with the U.S. Department of Health & Human Services Office for Civil Rights, said that all providers updating software to comply with the Meaningful Use program's 2014 edition of certification, or upgrading for any other reason, should conduct a security risk analysis to test for vulnerabilities that may compromise patients' electronic data.

OCR Director Leon Rodriguez noted at the ABA's annual Emerging Issues Conference last year that the audit program found entities were lax about encrypting data, with many not even considering it. HIPAA's security rule treats encryption as "addressable," meaning a covered entity must either encrypt the data or document its rationale for opting not to. The topic, he said, can't simply be ignored.

To learn more:
- read Shah's full post