Statistical profiling of hospital performance using acute coronary syndrome mortality

Cardiovasc J Afr. 2012 Nov;23(10):546-51. doi: 10.5830/CVJA-2011-064.

Abstract

Background: To improve the quality of care delivered to patients and to enable patient choice, public reports comparing hospital performance are routinely published. Robust hospital 'report card' systems for performance monitoring and evaluation are therefore crucial in medical decision making. In particular, such systems should effectively account for and minimise systematic differences in definitions and data quality, in the quality of care and treatment, and in 'case mix'.

Methods: Four methods for assessing hospital performance on mortality outcome measures were considered. They combined Bayesian fixed- and random-effects models with risk-adjusted mortality rate and rank-based profiling techniques, and were compared empirically using 30-day mortality in patients admitted with acute coronary syndrome. Agreement was first assessed between the median estimates of each hospital's risk-adjusted mortality rate and between the ranks associated with those rates. Agreement was then assessed on the classification of each hospital as low, normal or high performing, based on its risk-adjusted mortality rate and rank.
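To illustrate the risk-adjusted mortality rate and rank-based profiling described above, the sketch below computes indirectly standardised 30-day mortality rates from a patient-level risk model and ranks the hospitals. It is not the paper's code: the column names, covariates and use of an ordinary logistic regression for case-mix adjustment are assumptions, and the Bayesian fixed- and random-effects formulations compared in the study are not reproduced here.

```python
# Minimal sketch of risk-adjusted mortality profiling (not the paper's code).
# Assumes a patient-level table with a hospital identifier, case-mix covariates
# and a 30-day mortality indicator.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

def risk_adjusted_rates(df, covariates, hospital_col="hospital", death_col="died_30d"):
    """Indirectly standardised (observed/expected) 30-day mortality rates per hospital."""
    X, y = df[covariates].to_numpy(), df[death_col].to_numpy()

    # Patient-level risk model: expected probability of death given case mix.
    risk_model = LogisticRegression(max_iter=1000).fit(X, y)
    df = df.assign(expected=risk_model.predict_proba(X)[:, 1])

    overall_rate = y.mean()
    per_hospital = df.groupby(hospital_col).agg(
        observed=(death_col, "sum"),
        expected=("expected", "sum"),
        n=(death_col, "size"),
    )
    # Risk-adjusted mortality rate: (observed / expected) x overall mortality rate.
    per_hospital["ramr"] = per_hospital["observed"] / per_hospital["expected"] * overall_rate
    # Rank-based profiling: rank hospitals from lowest to highest adjusted rate.
    per_hospital["rank"] = per_hospital["ramr"].rank(method="min")
    return per_hospital.sort_values("rank")
```

Under the Bayesian random-effects (hierarchical) approach the per-hospital estimates would additionally be shrunk toward the overall rate; that shrinkage step is omitted from this sketch.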

Results: There was poor agreement between the point estimates of the risk-adjusted mortality rates, but better agreement between the ranks. For categorised performance, however, the observed agreement between the methods' classifications of hospital performance ranged from 90 to 98%. Agreement was reasonable, as reflected by the kappa statistic, in only two of the six possible pair-wise comparisons: 0.71 between the methods identifying outliers with the fixed-effects model and 0.77 between those using the hierarchical model. In the remaining four pair-wise comparisons, agreement was at best moderate.
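The kappa statistic quantifies agreement between two methods' low/normal/high classifications beyond what would be expected by chance from each method's category frequencies. The sketch below shows a minimal Cohen's kappa calculation for one pair-wise comparison; the category labels and example data are illustrative and are not taken from the study.

```python
# Cohen's kappa for agreement between two methods' hospital classifications
# (low / normal / high performing). Illustrative only; data are not from the study.
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two classifications of the same hospitals."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    categories = sorted(set(labels_a) | set(labels_b))

    # Observed agreement: proportion of hospitals given the same category by both methods.
    p_observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n

    # Expected chance agreement, from each method's marginal category frequencies.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)

    return (p_observed - p_expected) / (1 - p_expected)

# Example: two methods classifying ten hypothetical hospitals.
method_1 = ["normal"] * 7 + ["high", "low", "normal"]
method_2 = ["normal"] * 7 + ["high", "normal", "normal"]
print(round(cohens_kappa(method_1, method_2), 2))
```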

Conclusions: Although the inconsistencies among the studied methods raise questions about which hospitals performed better or worse than others, the choice of the definition of outlying performance appears less critical than the choice of statistical approach. There is therefore a need for robust systems of 'regulation' or 'performance monitoring' that are meaningful to health service practitioners and providers.

MeSH terms

  • Acute Coronary Syndrome / mortality*
  • Bayes Theorem
  • Delivery of Health Care
  • Diagnosis-Related Groups
  • Hospital Mortality*
  • Hospitals / statistics & numerical data*
  • Humans
  • Models, Statistical
  • Outcome Assessment, Health Care / methods
  • Outcome Assessment, Health Care / statistics & numerical data*
  • Quality Improvement
  • Quality Indicators, Health Care / statistics & numerical data*
  • South Africa / epidemiology