
From the 2019 CDI Summit: Understanding Methodology Behind Quality Report Cards Is Key

The numerous quarterly or annual quality report cards can evoke angst, celebration, and every reaction in between from hospital chief financial officers, boards of directors, health information management (HIM) departments, coders, and clinical documentation improvement (CDI) specialists. But if results on the Leapfrog, Hospital Compare, or US News and World Report report cards are less than wonderful, it’s the HIM departments, coders, and CDI specialists who often bear the brunt of a provider’s reaction.

In her Sunday presentation, “Navigating Public Quality Report Cards,” at AHIMA’s two-day CDI Summit, held July 14-15 in Chicago, Kristen Geissler, MS, PT, CPHQ, of the Berkeley Research Group, broke down the differences between healthcare quality report cards, how CDI can impact specific report card quality measures, and the factors that can help put report card results in context for stakeholders.

Over the last several years, the Affordable Care Act and market forces have made publishing hospital safety ratings a lucrative business for public and for-profit entities alike. While these report cards generate a lot of press for hospitals, the methodology behind the ratings gets less scrutiny, Geissler said.

For example, a huge part of Leapfrog’s ratings methodology relies on self-reported surveys completed by each hospital. One attendee in Geissler’s session noted that her facility jumped from a “D” rating to a “B” rating simply because it had not completed the survey the first time it participated, important context for hospital leadership.

Another factor that the media and providers should consider is that most of these ratings are based on Centers for Medicare and Medicaid Services (CMS) claims data, which has a lag time of one to two years. In fact, the oldest CMS data can be four or five years old by the time a rating is published.

“Each hospital board has its sweet spot—what it was embarrassed by in the media and becomes their laser focus. Make sure your leadership knows the context. They may say ‘that’s just an excuse,’ but give them just enough information that they know the subtleties. Probably start with where you’re falling down—mortality, hospital-acquired infections, etc. Start there and create a story about ‘how we’re going to improve that.’ It doesn’t fall just to coding and CDI—quality also belongs to physicians and nurses,” Geissler said.

Regardless of the methodology used by the main players, including Healthgrades, Leapfrog, Hospital Compare, Truven/IBM Watson, and the Joint Commission, CDI specialists need to keep the following in mind:

  • Document and code completely
  • Document to the highest level of specificity
  • Ensure every clinical indicator has a related diagnosis
  • Ensure every documented diagnosis is validated by clinical indicators
  • Make certain that documentation supports the timing of onset of each condition
Mary Butler is associate editor at the Journal of AHIMA.