Dr. Keely Cofrin Allen: 'Strict protocols' needed for data security to work

In a privacy survey conducted by the Michigan-based Ponemon Institute, many healthcare IT professionals admitted that their organizations may have failed to comply with regulations such as the Health Insurance Portability and Accountability Act (HIPAA). Specifically, more than half (51 percent) of the professionals said they do not protect patient data used in software development and testing, and 78 percent said they were not confident, or were unsure, that their organization could detect the theft or accidental loss of real patient data during development or testing.
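Protecting data used in development and testing usually means giving developers de-identified or masked copies of production records rather than the records themselves. As a rough sketch only--the field names, the hard-coded salt and the masking choices below are illustrative assumptions, not anything drawn from the survey or the interview--a masking step applied before records reach a test environment might look like this in Python:

```python
import hashlib

# Hypothetical salt; in a real system it would come from a secrets
# manager, never from source code.
SALT = b"example-salt"

def mask_record(record: dict) -> dict:
    """Return a copy of a patient record safe for a test environment."""
    masked = dict(record)
    # Pseudonymize the patient ID with a salted one-way hash so test
    # data can still be joined across tables without exposing the
    # real identifier.
    masked["patient_id"] = hashlib.sha256(
        SALT + record["patient_id"].encode()
    ).hexdigest()[:16]
    # Drop fields that identify the patient outright.
    for field in ("name", "ssn", "address", "phone"):
        masked.pop(field, None)
    # Generalize date of birth to the year, a common de-identification step.
    if "dob" in masked:
        masked["dob"] = masked["dob"][:4]
    return masked

if __name__ == "__main__":
    raw = {
        "patient_id": "U-100492",
        "name": "Jane Doe",
        "ssn": "000-00-0000",
        "dob": "1984-07-19",
        "diagnosis": "J45.40",  # clinical fields stay usable for testing
    }
    print(mask_record(raw))
```

The salted hash preserves record linkage for testing while making the original identifier unrecoverable without the salt, and generalizing dates to the year is in the same spirit as HIPAA's Safe Harbor de-identification method.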

For more insight on the matter, FierceEMR talked with Dr. Keely Cofrin Allen, director of the Office of Health Care Statistics (OHCS) at the Utah Department of Health and executive secretary of the Utah Health Data Committee. OHCS is responsible for managing nine administrative rules requiring statewide collection and dissemination of healthcare data from Utah hospitals, ambulatory surgical centers, emergency rooms, health maintenance organizations (HMOs), preferred provider organizations (PPOs), the Children's Health Insurance Program (CHIP) and state Medicaid. Cofrin Allen oversees the development of annual and special ad hoc reports from the facility datasets, as well as health plan satisfaction surveys and HEDIS reporting.

FEMR: Why do you think so many health IT professionals aren't protecting data used in development or testing?

Cofrin Allen: I think that part of it might just be an overly high sense of trust. Nobody wants to think that anybody is out to do nefarious things and so there's this feeling that somehow health data are different. It's sort of baffling to me.

I think the other thing is that people feel as if they're standing behind a mandate of improving people's health, which of course is a very worthy cause and very important. But that mission seems to me to somehow overshadow the importance of securing the data. As a health entity you may have a right to the data, but that doesn't mean you have a right to house it in a manner that's not secure. And I think that sometimes in the follow-up--because people are hesitant to question the collection of data for such an important mission, improving health--some of the details get missed.

FEMR: How could these findings affect the push for the digitization of health records?

Cofrin Allen: One of my reactions to the results of this survey was fear that this is going to swing the pendulum too far the other way. I think it is entirely possible to do the types of data analysis that we need to do with the types of identified data that we need, and to do so in a completely secure manner.

[However] I think that one possible result of this is a kind of unfortunate, more draconian approach to data security, one that says to various entities, 'if you can't secure the data, then we'll just simply see to it that you don't have it.' That's one way to protect the data, but it has unfortunate consequences.

FEMR: With the consequences of data breaches already so high, what more is necessary to ensure that data security isn't continually overlooked? What needs to be done to motivate health IT pros to keep data safe?

Cofrin Allen: I think building [consequences] into the mandates would be important. Many states are moving toward creating administrative rules or legislative mandates to create various IT tools--an all-payer claims database is one of them, and the one with which I'm most familiar, but I think my comments would apply to others.

In Utah, it was always part of the conversation. It wasn't something that came up late, it wasn't something that we felt we had to scramble to address, and we never felt threatened. It was an expectation put on us up front, and it was an expectation that we met. And while we've been asked some hard questions, I've never felt like there was a threat that the data would be pulled or that we would be prevented from getting it.

I think that's because we--the agency that I represent--have a 15-year history of collecting and securely housing these data. We've never had a breach, we've never had to apologize, we've never had to write letters. So that creates confidence.

I would welcome anyone to come in here and see how it is we house data [through regular audits]. Obviously we can always do it better, and there may be something we haven't thought of. But I would welcome that, and any entity that deals with data at this level should welcome it, too.

I think because we're the health department, because we're a government entity, it seems that has given us a little bit more trust up front, which is nice. But you also have to make sure that you don't take advantage of it or forget how important it is.

FEMR: How frequent do you think those regular audits would need to be, if that were to be the case?

Cofrin Allen: Again, it would depend on the entity and the level of trust everybody had. I really think it should be up to the stakeholders how often that would be, but I'd say yearly. More than that would probably be a bit burdensome.

FEMR: Do you think regulations regarding medical information are too relaxed?

Cofrin Allen: I don't think so. At least, speaking purely from my point of view here in Utah, I feel like we have the right balance of freedom to collect the data and responsibility to report to people how we are going to house it. You may get a completely different perspective from somebody else, though.

I think it's important to continually have conversations [about data regulations], so that security is not something thought of only once a year when an auditor shows up, or greeted with an eye roll when somebody questions how the data are going to be secured. It should become a regular part of your daily operations; it has to.

FEMR: Interesting that you point out how well your state is doing. Why do you think some other states have struggled?

Cofrin Allen: I think that it's very easy to get lax, and it's very easy to think, 'well, this is an exception.' I think you need to have strict protocols in place, and you can't just simply trust that things will work out OK.

You just simply don't transport sensitive data in your private vehicle, or on an unsecured piece of equipment. You just don't do it. And you need to have protocols in place that say you just don't do it. You need to have your employees sign contracts and confidentiality agreements, and have rules in place that make it so those kinds of very common, very human mistakes simply aren't available as an option. Data that gets lost on a laptop or left in somebody's personal vehicle--which is then stolen--that to me is just an inexcusable breach of data handling. A lot of these mistakes are utterly avoidable by simply never allowing the situation to come about in the first place.

It's very easy to think of yourself as an entity that has a right to this data, and then suddenly forget that with that right comes a sacred trust to see to it that those data are used in a way that is appropriate, protected and responsible.

My fear is that this kind of thing is going to start shutting down some very important projects. It's unfortunate that people feel like the only way to secure the data is to see to it that nobody has access to it. That's as harmful to everybody as the data breaches are, so it's a matter of finding the right balance.

This interview has been condensed and edited.