A new study in Health Affairs shows that hospital quality comparison tools, even from apparently reputable sources, are wildly inconsistent.
Payors are increasingly considering giving patients rating tools to compare hospital quality, but who is checking these rating systems for accuracy and transparency?
The study looked at U.S. News & World Report’s Best Hospitals; Healthgrades’ America’s 100 Best Hospitals; Leapfrog’s Hospital Safety Score; and Consumer Reports’ Health Safety Score.
“To better understand differences in hospital ratings, we compared four national rating systems,” study authors stated in the abstract. “We designated ‘high’ and ‘low’ performers for each rating system and examined the overlap among rating systems and how hospital characteristics corresponded with performance on each. No hospital was rated as a high performer by all four national rating systems. Only 10 percent of the 844 hospitals rated as a high performer by one rating system were rated as a high performer by any of the other rating systems.”
The researchers concluded that the underlying problem is the lack of a consistent, reliable definition of quality: different rating systems focus on different metrics. They stated that "the lack of agreement among the national hospital rating systems is likely explained by the fact that each system uses its own rating methods, has a different focus to its ratings, and stresses different measures of performance."
For hospitals, though, the inconsistency between lists may actually be a benefit: no hospital has to end up at the bottom of every list.
To read the study's abstract, click here.