Crowdsourced hospital ratings from sites like Yelp can provide patients with insights into the care experience, but fall short of effectively ranking clinical quality, according to a new study.
Researchers at Indiana University compared crowdsourced rankings from Facebook, Google and Yelp to data from the Centers for Medicare & Medicaid Services’ Hospital Compare site and found that the two data sets often aligned on patient experience.
However, that wasn’t the case for clinical quality. The team found that 20% of hospitals listed as the best on public ratings sites were marked as the worst in their markets for outcomes by Hospital Compare.
Since these online reviews may be the first place a patient looks, it's crucial that patients know the scope of what that information can actually represent, Victoria Perez, Ph.D., an assistant professor in the IU School of Public and Environmental Affairs and one of the study's authors, told FierceHealthcare.
“You only go to Hospital Compare or Nursing Home Compare to look at potential care, but people are going to these other sites for other things, so naturally, that would be their first stop,” Perez said. “And then maybe you learn about Hospital Compare along the way.”
A website like Facebook, Google or Yelp can paint a useful picture of patient experience—in fact, some research suggests these sites do better than the federal Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey—but the reviews are too individualized to paint a complete picture of quality or safety, Perez said.
For example, many of Hospital Compare’s measures are related to specific conditions that patients writing reviews may not mention in their posts, she said. Or, a patient’s symptoms may improve more quickly than expected, boosting their experience but creating an outlier scenario.
“It’s hard to say what the clinical quality of that service was,” Perez said. “The data may be too limited to really capture patient safety or quality.”
The study comes on the heels of another report released by RAND Corporation that calls on Hospital Compare and other public ratings systems to make the data more useful and digestible for patients.
A customizable tool, according to RAND, would provide a better picture of how regional hospitals perform relative to each patient’s needs—making the quality data more usable for people who may not be able to fully parse out what the measures mean.
In the meantime, Perez said that the sites included in the study can offer valuable information to providers. Her team was surprised to find that many people posted average reviews, rather than the highly “bimodal” data set they expected—meaning, mostly extremely satisfied or dissatisfied reactions.
“We didn’t see the same pattern of people reporting really awful or really good experiences,” she said. “Most people were willing to report an average experience.”