Insurers must address a new dimension of data quality, data credibility, according to a recent blog post published to Insurance Networking News by enterprise information management expert Malcolm Chisholm. Data credibility, he says, means having data that can be relied on to represent what it is supposed to represent.
Chisholm points to insurance companies doing business in Europe, where insurers must produce reports using the same data they use to run their businesses, and must also show evidence that the reports are really based on that data.
To prove credible in the "golden age of data," Chisholm says, insurers must implement data management practices that include checks and balances and robust processes. Even without any intent to misrepresent the data, sloppy data management practices or under-resourced data teams could damage confidence in the data, he adds.
With insurers facing increasing reporting needs, regulatory compliance and special requests, New York-based MVP Health Care sought to improve its data management. It initiated a "data turnaround" to centralize all of its information and implement rigorous data governance standards.
"Poor data quality is the number one reason business intelligence projects fail. Not only is user satisfaction directly tied to data quality, but the better the data, the more it will be used," Linda McCann, MVP's vice president of business intelligence, told FierceHealthPayer in a previous interview.
Data sharing is the foundation upon which accountable care organizations can meet their goals of reducing costs and increasing payer-provider collaboration. However, some healthcare providers are reluctant to share data with payers, increasing the need for data credibility.
To ease those concerns, some insurers already are making their systems more transparent and available to other industry players. Aetna, for example, has taken a forward-thinking approach to sharing its information, granting its provider partners a high level of access to its claims data.
- read Chisholm's blog post