Page 2 of 21 Carr et al. Vessel Plus 2020;4:12 | http://dx.doi.org/10.20517/2574-1209.2020.01
Keywords: Coronary artery bypass graft surgery, risk assessment, outcomes research, survival, mortality
INTRODUCTION
Over the past 60 years, much has changed in the healthcare field. Increasingly, attention is being paid to
healthcare quality with the goals of improving clinical outcomes and increasing value of care delivered. A
special emphasis in quality improvement has been placed on high volume procedures such as coronary
artery bypass grafting (CABG). Although CABG volumes have declined from ~213,700 procedures (2011) to
~156,900 procedures (2016), it remains the most common cardiac surgical procedure performed in the United
States[1-3]. To evaluate the true value of CABG, longer-term outcomes are necessary to establish the durability
of the procedure. Accordingly, the baseline patient risk factors associated with short-term (< 1 year) and
longer-term (≥ 1 year) CABG mortality were compared.
Interpreting CABG clinical outcomes data can often be challenging: patients vary widely in pre-CABG
severity of coronary disease and in comorbidity-related disease complexity, CABG operative techniques
and post-CABG pre-discharge care management differ, and providers vary in the annual CABG volumes
they perform. In 1972, the Department of Veterans Affairs (VA)
healthcare system began internally reporting national unadjusted outcome rates (e.g., “observed” in-
hospital mortality rates) for patients undergoing cardiac surgery at its institutions; these first VA reports
focused upon observed CABG mortality and post-CABG complication rates[4].
After US hospitals’ CABG mortality reports were made publicly available by the Department of Health
and Human Services in 1985, Congress in 1986 mandated that the VA report risk-adjusted cardiac
surgery mortality rates and compare these VA rates to national standards[5]. Given these legislation-driven
mandates, VA clinicians and scientists began looking for ways to “level the playing field” using statistical
risk models to permit more meaningful comparisons between centers and surgeons; these risk-adjusted
outcome reports were used in their local VA medical centers’ quality improvement endeavors.
The VA Continuous Improvement in Cardiac Surgery Program (CICSP) was founded in April 1987;
CICSP was one of the first registries to report risk-adjusted CABG 30-day operative mortality
and major morbidity across all participating VA hospitals[4]. The VA CICSP identified a set of Veteran
risk characteristics associated with CABG adverse outcomes; from 54 gathered patient risk, cardiac
surgical procedural, and hospital-related outcome variables, the VA CICSP calculated the "expected"
mortality occurrence for each Veteran undergoing a CABG procedure. "Observed" and "expected"
outcome rates were then compared across providers and "high-risk" patient sub-groups to identify
opportunities to improve local VA cardiac surgical care[6].
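The observed-to-expected comparison described above can be sketched as follows. This is a minimal illustration only; the function name and the toy figures are hypothetical and do not come from the VA CICSP:

```python
# Minimal sketch of an observed-to-expected (O/E) mortality comparison.
# In practice, the per-patient "expected" risks come from a risk model
# fit on registry data; the numbers below are illustrative only.

def oe_ratio(observed_deaths, expected_risks):
    """O/E ratio: observed deaths divided by the sum of model-predicted risks."""
    expected_deaths = sum(expected_risks)
    return observed_deaths / expected_deaths

# Hypothetical provider: 100 patients, 3 observed deaths, and
# model-predicted 30-day mortality risks averaging 2.5% per patient.
risks = [0.025] * 100
ratio = oe_ratio(3, risks)
print(round(ratio, 2))  # prints 1.2
```

An O/E ratio above 1 flags worse-than-expected outcomes for that provider or sub-group, which is what prompted the local quality-improvement reviews.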
Some of the earliest lists of pre-CABG patient risk factors associated with mortality were developed entirely
based on expert consensus. As different national, regional, and state-wide databases originally gathered
different sets of patient risk factors, an early consensus conference was held to identify the minimal set of
"core" risk variables required to be captured[7,8]. Given challenges encountered with CABG records' data
completeness, however, these earliest mathematical approaches to calculate risk-adjusted outcome rates
made use of Bayes theorem[9]. Since the VA's programmatic expansions in 1992, dramatic improvements
have been made in the completeness of the VA's captured CABG data; thus, logistic regression emerged as the most
common analytical approach used. Other approaches have been reported, including applications of neural
networks and Cox regression[10,11]. Given both the ease of clinical interpretation and superior statistical
model performance, however, logistic regression remains the standard analytical approach used to predict
post-CABG short-term (ST) and longer-term (LT) mortality[12-14].
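To make the logistic-regression approach concrete, the sketch below shows how a fitted model converts a patient's risk factors into a predicted mortality probability. The intercept, coefficients, and risk factors are hypothetical placeholders for illustration, not values from any published VA or STS model:

```python
import math

# Minimal sketch of a fitted logistic risk model producing a predicted
# post-CABG mortality probability. All coefficients are hypothetical.

def predicted_mortality(age, ejection_fraction, prior_cabg,
                        intercept=-6.0, b_age=0.05, b_ef=-0.03, b_prior=0.8):
    """Logistic model: p = 1 / (1 + exp(-(b0 + b1*x1 + b2*x2 + b3*x3)))."""
    logit = (intercept + b_age * age + b_ef * ejection_fraction
             + b_prior * prior_cabg)
    return 1.0 / (1.0 + math.exp(-logit))

# Hypothetical 68-year-old with ejection fraction 35% and a prior CABG.
p = predicted_mortality(age=68, ejection_fraction=35, prior_cabg=1)
print(f"{p:.3f}")
```

Summing such per-patient probabilities over a provider's caseload yields that provider's "expected" mortality count, the denominator of the O/E comparison; the model's coefficients are what the "level the playing field" risk adjustment estimates from registry data.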