Keeping score with revenue: The 2-step-back feedback

We want value. Radiologists, that is.

We want it for our patients, we want it for our referring clinicians and we want it for our hospital administrators. You know the buzz-phrases: volume-to-value, Imaging 3.0, utilization optimization and so on. This verbiage outlines radiology's collective beliefs.

There is no arguing that, as a profession, our phrases, campaigns and calls-to-action are sincere. National radiology organizations, led by the American College of Radiology, are insistently supporting clinical decision support software at the federal legislative level. Our leaders are pushing radiologists to be immersed in their local accountable care organizations and to help determine the methodologies for distributing shared-savings reimbursement. Radiology is aligned with wellness and screening, as demonstrated most recently by our efforts to validate the value of CT lung cancer screening. Many practices have promoted increased patient contact and consult lines for referring clinicians. Radiation awareness and dose-reduction efforts are at an all-time high.

These are just a few demonstrable ways that radiology is buying in to a value-focused health care system. Better, faster, safer, cheaper. Radiology is all-in.

But are radiologists all-in? And if not, why?

What is our feedback?

Anyone who has done work in the field of quality improvement is painfully familiar with feedback. When performance needs to be improved, first a goal (directly related to the desired performance) must be set by all involved stakeholders. Then, feedback must be delivered regularly throughout the project with increasing individual accountability until the goal is reached, and then extended throughout the sustainability phase of the project. Early in quality improvement projects, feedback is intended for learning: common errors are addressed by project leaders at a group level. As group performance improves (but some individuals lag behind), more individualized, customized feedback is used to raise the performance of substandard performers. The details vary, but the essentials do not: feedback is directly related to the performance goal and delivered regularly and promptly.

What is our feedback in radiology at a department level? At the hospital level? Is it related to the areas in which we want to improve? Are our performance metrics directly related to improving value-focused care?

I know many academic departments have salary structures based partially on academic and educational contributions. And I realize our government has constructed (very) small quality improvement incentives with PQRS and the value-based payment modifier. As feedback mechanisms, these are good things overall, though not without faults. Baby steps in the right direction, if you will.

But on a large scale, what is the feedback we receive composed of? Relative value units? Turnaround time? Service-line growth? Return on investment? Net revenue? Who is thought of as the most productive radiologist in your group? Yep, the one who reads the most studies, or the one with the most RVUs. Extra money in the budget for an equipment upgrade? Who gets it? Yep, the section that has been the most profitable or has the greatest potential for expansion.

Even our conversations reflect this historically embedded, misaligned, revenue-focused feedback:

"Did you know nuclear medicine is credited with the revenue for radioembolization administration even if IR does the whole case? It really bails out their bottom-line."

"No, I didn't. But I heard that IR does post-TACE non-con CTs using their flat-panel detectors in their suites so that they're credited with the RVUs."

"Really? How many RVUs is a TACE worth, anyway?"

"It's like more than 50. It's really not fair. They get all those RVUs for one case, and we're busting our humps here in fluoro doing barium enemas all day at 1-2 RVUs per pop."

Sound familiar? It is no fault of individuals. This is the environment in which we trained and the feedback we receive. I recognize that this type of feedback, and these kinds of conversations, are largely a by-product of data that are readily available, easily comparable and effortlessly analyzed. No one person, or group of people, is to blame. "Produce, produce, produce," we are told. Our everyday vernacular includes phrases such as "churn and burn," "minimize interruptions" and "keep the list clean." Feedback like this hampers our profession's quest for value. Imagine, instead, if our feedback answered questions like these:

• How many in-person referring clinician consults per radiologist did our department have last month?
• Are we getting better?
• Have we optimized the radiation dose to patients on our CT scanners? If not, are we improving?
• As a group, are we decreasing the incidence of the most common "misses" identified by our education-focused peer-review process?
• How much have we decreased costs for patients?
• Have we improved communication with our patients beyond the "point-of-imaging"?
• How often did our referring clinicians feel that we actually helped?

If we want value, we need to be investigating and answering these questions, discussing them amongst ourselves, and measuring our performance against them. If we want value, department leaders have to shun the all-too-tempting feedback traps laid by the incorrigible fee-for-service model. For radiology groups and individual radiologists to improve our performance and learn to deliver value-focused care, we have to give ourselves appropriate feedback: consistent, reliably available feedback tied to value-centered questions.

Only then will we begin to actually improve the value that we, as a profession, exhaustively promote and so genuinely want to provide.

Matt Hawkins, M.D., is a vascular interventional radiology fellow at the University of Washington/Seattle Children's Hospital. Follow him on Twitter at @MattHawkinsMD.
