Tampa General Hospital was rated the No. 1 hospital in Florida, and No. 38 nationally, on U.S. News & World Report's 2014 list of Best Hospitals. Yet TGH didn't make Healthgrades' annual list of America's 100 Best Hospitals.
Out of roughly 2,500 hospitals evaluated, Bolivar Medical Center in Cleveland, MS, landed at the bottom of the 2014 Consumer Reports Hospital Safety Report. That same year, the hospital earned The Joint Commission's Gold Seal of Approval, recognized nationwide as a symbol of quality and safety. And in 2013, it received an "A" Hospital Safety Score from the Leapfrog Group, which grades hospitals on how well they protect patients from errors, injuries, accidents and infections.
A multitude of ratings
In an article in Modern Healthcare, Dr. Rusty Holman, Chief Medical Officer for LifePoint, which owns Bolivar, said the hospital has seen significant quality-of-care improvements since it began participating in the federal Partnership for Patients program in 2012. He also said Consumer Reports rated the hospital using old data that did not reflect those recent improvements. Overall, Holman said, the proliferation of different hospital ratings is complex and confusing for hospital leaders. "One can only imagine how dizzying it is for consumers," he said.
A recent survey of more than 230 healthcare executives conducted by Modern Healthcare found that 53% of respondents said their facilities had received a poor rating from at least one rating entity and a high rating from another group on similar measures over the same time period. So how do the hospital rating systems come up with such vastly different grades for the same hospitals? According to the Association of American Medical Colleges (AAMC), the differences in measures, data sources and scoring methodologies between the rating systems often produce contradictory results that confuse the public, health providers and governing boards. "The issue is that the number of sites that are publicly reporting data continues to grow," Jennifer Faerberg, MHSA, AAMC director of clinical transformation, told AAMC Reporter. "There is incredible variability across these sites. You can do well on one and not the other and not necessarily know why. It's an area not necessarily easily understood."
Making sense of the ratings
To help hospitals make sense of their ratings, the AAMC has developed a set of guiding principles that can be used to evaluate quality reports. The principles are organized into three broad categories:
1. Purpose
Each website that reports performance data should explicitly state its target audience and the intended purpose of the report. The data, measures and data display should fit the report's stated purpose.
2. Transparency
Transparency requires that all information necessary to understand the data be available to a reader; this information includes measure specifications, data collection methods, data sources, risk adjustment methodologies and their component parts, composite score methodologies and reporting methods used to translate results into graphical displays. Details should be sufficient for independent replication of the results. Limitations in the data collection and methodology and relevant financial interests also should be disclosed.
3. Validity
Ideally, measures, as well as composite and scoring methodologies, should be supported by clinical evidence, field-tested and, where appropriate, endorsed by the National Quality Forum.
Vinita Bahl, DMD, MPP, Assistant Professor of Surgery at the University of Michigan Health System and a member of the AAMC workgroup, says these principles will help hospitals more quickly identify valuable rankings and spend less time on flawed or unclear data. She also hopes the rating organizations will use the principles to assess their own ranking processes.