Google’s “Project Nightingale” with the Ascension healthcare network, first revealed by The Wall Street Journal, gave the tech firm access to millions of people’s medical information without their knowledge.
Separately, a lawsuit against Google and the University of Chicago accuses them of violating patient privacy rights through their data-sharing partnership.
Across the country, hospitals are increasingly granting firms like Google, Microsoft, IBM and Amazon access to their valuable data. Even smaller providers are being courted constantly. As the CEO of one 14-hospital system recently said, executives right now are “inundated” with requests.
The deals stay on the right side of privacy laws and have an obvious allure for hospitals that are eager to start taking advantage of data without spending much. But they’re exposing hospitals to a different sort of crisis: one of trust among patients who are rightly worried about the fate of their personal health information in the hands of big tech companies.
Moreover, the partnerships highlight the fact that hospitals and health systems don’t understand the value their own data could deliver if used correctly.
The whole reason hospitals are being “inundated” with data-sharing proposals is that Big Tech understands exactly how valuable those data are. Data are the source of a massive disruption underway in the healthcare industry. By leaping into data-sharing partnerships without first defining their own data strategy, hospitals are effectively giving away the keys to the kingdom.
If hospitals aren’t using their data to improve outcomes, raise revenues and reduce costs, someone else will.
Hospitals’ problem isn’t a lack of data; it’s finding the right data, putting the information in front of the right people at the right time and knowing how to use it to get results. Despite the growing centrality of data to the healthcare industry, a surprising number of hospitals haven’t put in place the processes, infrastructure and systems needed to identify what’s relevant to their goals. Their data tend to be siloed in separate systems rather than combined into a coherent whole. Hospitals need to chart a path for using their data assets so they can become truly “data driven.”
A lot of institutions have been looking at the same data for years and expecting different results, meeting Einstein’s definition of insanity. That leaves hospitals stuck in a rut on operational efficiency and patient outcomes. There are big, dark holes in their data landscape, and you can’t innovate or improve upon what you can’t see.
The key is to build systems and expertise that allow you to look at data in a different and more comprehensive way. The silos need to be broken down so that the data can be correlated and used to empower a team. Often hospitals have a plethora of “dark data” sitting around unnoticed on their systems that can provide valuable insights into improving operational efficiency and quality outcomes; establishing systems that surface the value in these data is one key to success.
When we ask hospital executives how much of their data they use, the response is often around 25% or even less. Simply moving that utilization up above 50% can yield transformative gains, but getting there is a challenge.
One critical access hospital we’ve worked with in the past was having difficulty understanding the productivity levels of its staff. Once they started using their data in a way that correlated staff movements with patient care, they were able to reallocate 5% of their budget in the right direction.
Data used the right way can sometimes shine a harsh light on situations, exposing problems that were flying under the radar. At one long-term care facility, that meant discovering that nurses were spending only 30% of their time with residents and the rest at their stations—a shocking dose of reality for an institution that advertised its main value proposition as an added level of care that its competitors couldn’t offer.
Hospitals’ approach to data may have been adequate when the payer system was based on volume and transactions. But the transition to a value-based model is making it essential for institutions to develop more sophisticated data systems. The focus on quality metrics has left many hospital executives feeling overwhelmed and dependent on assessments from the Centers for Medicare & Medicaid Services or their accountable care organizations.
Hospitals tend to lack the expertise to go into their electronic health record systems and pull the data that will help them address blind spots and better predict their quality numbers. Providers often have a hard time understanding their practice through a quality lens. What percentage of their patients haven’t come in for wellness tests, for example? Most could probably answer that question but may not have a way to act on the answer. How many times have those patients been contacted, and what are their average wait times when they do come in?
Partnerships with big tech firms indeed have the potential to provide value to hospitals. But in order to choose wisely, hospitals need to unlock and leverage the value of their data from within. Too many healthcare institutions are using systems that are built around yesterday’s numbers, and those aren’t going to help them solve today’s problems.
Jon Ault is a principal in the technology division of Eide Bailly LLP.