From competition to collaboration
Sharing data for quality improvement
A conversation with Stephen Plume, MD, Professor of Surgery, of Community and Family Medicine, and of The Dartmouth Institute; co-founder, the Northern New England Cardiovascular Disease Study Group (NNE)
In an unprecedented initiative, competing hospitals in Maine, New Hampshire, and Vermont came together in 1987 to pool their data, study the results and learn from each other. The result was reduced cardiac surgery mortality and improved outcomes throughout the region. We talked with Dr. Plume about how comparative variation data can be used to study practice patterns and improve quality of care.
Q: You co-founded NNE (www.nnecdsg.org), which today serves as a national model for improving cardiac care. Tell us about the launch of this innovative group over 25 years ago.
A: In 1986, the Health Care Financing Administration (HCFA) issued the first report that actually identified and ranked hospitals by mortality. In reviewing the report, we thought the rankings were unfair because they failed to adjust for differences in severity of patient comorbidity, or “case mix.” So we set out to gather our own data and prove that HCFA’s back-room calculations were wrong.
Q: What initiatives did you take to improve outcomes? How did the data reflect improvements?
A: We took trips to each other’s institutions to observe processes — from admission to the OR to transfer to the ICU. We also observed every part of a procedure, from where medications were placed to how IVs were prepared.
Then, we’d meet and describe variations in how we performed cardiac surgery and in outcomes statistically associated with those variations. We developed prediction rules for CABG risk factors and mortality and worked to integrate them into our clinical practices for shared decision-making with patients and their families. We also trained teams in continuous quality improvement techniques.
By 2000, the mortality rate for CABG had fallen by 24% throughout the region, and there was no statistically significant difference among participating institutions. Over the 25 years since NNE’s launch, we estimate our efforts contributed to the survival of a few hundred patients.
Q: Your approach was unique, given that those invited to the NNE consortium were essentially your competitors. How did you get competing organizations to collaborate and share their experience?
A: We realized each of us had something to teach the others. No one was overburdened with ego. We were more concerned about the well-being of our patients than about our rankings in the region. We knew we could accomplish much more as a group than as individuals in addressing quality concerns and saving lives.
Q: What guidance can you offer health care leaders in learning from the NNE experience?
A: Point 1 is that there is an ongoing, pressing clinical need for the kind of data the NNE collects. Point 2 is that none of us trusts externally generated data that may not be representative of our daily work. Point 3 is that the work of understanding the clinical processes that generate the observed data, especially the observed differences, has the benefit of exposing people to data analysis, to tests of change and to system interdependencies.
I have come to agree with those who say that the ability to work on work, that is, to understand what is happening and how we might change it, is a core competency of any organization hoping to survive in a changing world.