By Karen G. Youmans, MPA, RHIA, CCS
Collaboration between clinical documentation integrity (CDI) specialists and health information management (HIM) coding professionals is key to a successful CDI program and to evaluating its revenue impact on the healthcare facility. When a CDI specialist’s final MS-DRG (or APR-DRG) does not agree with the coder’s final MS-DRG (or APR-DRG), there is a “mismatch.” Many organizations struggle to find a balance between these two teams, but with collaboration, communication, and education, performing mismatch reviews on a concurrent, pre-bill basis can have a significant revenue impact.
Organizations need the right data to accurately validate their CDI program and to account for their coders’ post-discharge pre-bill queries. This challenge goes hand in hand with determining the financial impact of the concurrent CDI program, bill holds for coders’ post-discharge pre-bill queries, and alignment with the healthcare facility’s goals. An effective analysis of the data trends and statistics surrounding CDI and coding mismatches, and of the effectiveness of the physician query process, is essential in determining the positive effect on the organization’s documentation quality and financial performance.
To identify the revenue impact of a CDI and coding mismatch program, as well as of physician queries, key data must be analyzed, including trends in MS-DRGs, principal diagnoses (PDx), complications and comorbidities (CCs), major complications and comorbidities (MCCs), and ICD-10-PCS code assignments. In addition, the coders’ post-discharge pre-bill queries, physician query compliance, and turnaround time for answered queries must be analyzed. These data elements provide deeper insights and support an accurate validation of a CDI and coding mismatch program, its impact on the bottom line, and the analysis of the physician queries themselves.
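For illustration, these data elements could be captured per reviewed account in a structure like the following sketch. The field names are hypothetical, not a standard schema or a specific vendor’s data model:

```python
# Hypothetical record of the data elements named above for one reviewed
# account; field names are illustrative, not a standard schema.
from dataclasses import dataclass, field
from datetime import date
from typing import Optional


@dataclass
class QueryRecord:
    account_id: str
    ms_drg_initial: str            # MS-DRG before the query/mismatch review
    ms_drg_final: str              # MS-DRG after resolution
    principal_dx: str              # ICD-10-CM principal diagnosis code
    cc_added: bool                 # a CC was added as a result of the query
    mcc_added: bool                # an MCC was added as a result of the query
    pcs_codes: list = field(default_factory=list)  # ICD-10-PCS codes
    query_sent: Optional[date] = None
    query_answered: Optional[date] = None          # None while unanswered

    def turnaround_days(self) -> Optional[int]:
        """Query turnaround time in days, or None if still unanswered."""
        if self.query_sent is None or self.query_answered is None:
            return None
        return (self.query_answered - self.query_sent).days
```

Aggregating records like this makes the reports in the examples below (mismatch categories, CC/MCC trends, query turnaround time) straightforward to produce.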
Examples of key findings from CDI and coding mismatch reviews, as well as physician query outcomes, are provided below. Each example is followed by questions that will help decision makers find insights toward continuous improvement and program validation.
Example #1: CDI vs. Coder DRG Mismatch Summary Report
When the CDI specialist’s and coder’s final DRGs do not match, the account is placed on hold for an auditor (internal or external) to make the final DRG determination.
CDI vs. Coder DRG Mismatch: Q1 2020 Summary Report

|Reviewer Findings|Total Cases|% of Total|DRG $ (estimated)|
|---|---|---|---|
|Agreed with Coder DRG|3,800|74.51%|$7,600,000|
|Agreed with CDI DRG|800|15.69%|$2,400,000|
|Neither – different DRG|500|9.80%|$1,250,000|
In analyzing the above CDI vs. coder DRG mismatch summary report, the organization should:
- Examine the reasons why the reviewer agreed with coding/disagreed with CDI. Is there additional ICD-10-CM/PCS coding training or official coding guideline education needed for the CDI team?
- Evaluate the specific MS-DRGs where the reviewer agreed with CDI and/or did not agree with either (the “neither” category). Is there an opportunity for additional clinical training for the coders? Are additional queries required? Is re-education on official coding guidelines needed?
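The percentages in a mismatch summary like Example #1 are simple ratios of reviewer findings to total reviewed cases. A minimal sketch, using the case counts and estimated dollars from the Q1 2020 table above:

```python
# Reviewer findings from the Q1 2020 example: (cases, estimated DRG dollars).
findings = {
    "Agreed with Coder DRG": (3_800, 7_600_000),
    "Agreed with CDI DRG": (800, 2_400_000),
    "Neither - different DRG": (500, 1_250_000),
}

total_cases = sum(cases for cases, _ in findings.values())
for category, (cases, dollars) in findings.items():
    pct = cases / total_cases * 100  # share of all reviewed mismatches
    print(f"{category}: {cases:,} cases ({pct:.2f}%), ${dollars:,} estimated")
```

Running the same calculation by MS-DRG or by reviewer, rather than by agreement category, supports the drill-down questions listed above.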
Example #2: CDI Query Outcomes Summary
CDI Queries MS-DRG Changes: Q1 2020

|CDI Query Outcomes|# Queries|% of Queries|DRG $ (estimated gain)|
|---|---|---|---|
In analyzing this CDI query outcomes summary report, the organization should:
- Assess the reasons why there was no change in the MS-DRG 47.64 percent of the time. There could be many reasons for a query with no financial impact, including queries for patient safety indicators (PSIs), hospital-acquired conditions (HACs), or severity of illness/risk of mortality (SOI/ROM). Is additional clinical training or coding guideline education needed for the CDI team on writing an appropriate query? Were there specific MS-DRGs in this grouping category that need further analysis? Was a specialty group more involved in the queries (e.g., cardiology, neurology, internal medicine, general surgery)? Was a specific physician more involved in the queries (e.g., not answering queries in a timely manner)?
- Identify specific principal diagnosis changes. Were there any specific MS-DRGs in this grouping category that warrant a deeper dive? Is documentation training needed for specific specialty groups (e.g., hospitalists, cardiology, nephrology, internal medicine)?
- Examine the specific CCs/MCCs added for opportunities to provide additional clinical training to the coders or additional documentation education to physicians.
Example #3: MS-DRG Query Distribution – Original MS-DRG
Date Range: 10/01/18 – 09/30/19

|Original DRG|Total Queries|Total Impact|Average Impact|
|---|---|---|---|
In analyzing this original MS-DRG query distribution summary report, the organization should:
- Analyze the top original MS-DRGs. Is there additional clinical training or coding guideline education needed for the CDI or coding teams regarding specialties? Was there a specialty group that was more involved in the queries (e.g., urology, internal medicine, hospitalists, etc.)? Is there a need for a more focused study on a clinical topic (e.g., sepsis)?
- Identify specific MS-DRG triplets/pairs for further analysis (e.g., 291/292, 689/690).
- Examine the specific CCs/MCCs added for opportunities to provide additional clinical training to the coders or additional documentation education to physicians.
Example #4: MS-DRG Query Distribution – Revised MS-DRG
Date Range: 10/01/18 – 09/30/19

|Revised DRG|Total Queries|Total Impact|Average Impact|
|---|---|---|---|
In analyzing this revised MS-DRG query distribution summary report, the organization should:
- Analyze the top revised (finalized) MS-DRGs after query. Is there a need for a more focused study on a clinical topic (e.g., sepsis)? Is there an opportunity for additional clinical training for the CDI or coding teams, or additional documentation education for physicians?
- Identify specific MS-DRG triplets/pairs for further analysis (e.g., 871/872, 682/683).
Example #5: Post-Discharge Pre-Bill Query Summary
Coder Query Results: First Quarter 2020

|Facility|Total Queries|Agreed|% Agreed*|Dollars Gained|Canceled or Not Answered|% Canceled or Not Answered|Potential Dollars Lost|Average Days to Respond|
|---|---|---|---|---|---|---|---|---|
|A|40|39|97.50%|$32,600|1|2.50%|$600|4.08|
|B|291|261|89.69%|$223,500|30|10.31%|$6,903|4.93|
|C|62|52|83.87%|$108,471|10|16.13%|$2,297|5.08|
|D|472|399|84.53%|$404,726|73|15.47%|$124,341|6.66|
|E|102|83|81.37%|$65,894|19|18.63%|$11,224|7.26|
|F|398|374|93.97%|$412,662|24|6.03%|$26,896|4.30|
|G|183|179|97.81%|$152,235|4|2.19%|$5,320|3.76|
|H|441|421|95.46%|$508,316|20|4.54%|$8,073|3.21|
|I|113|109|96.46%|$195,260|4|3.54%|$1,400|3.61|
|J|167|161|96.41%|$176,244|6|3.59%|$2,100|2.99|
|K|26|22|84.62%|$19,432|4|15.38%|$1,349|3.73|
|L|242|211|87.19%|$223,066|31|12.81%|$23,262|4.67|
|Total|2,537|2,311|91.09%|$2,522,404|226|8.91%|$213,766|4.63|

*Agreed = physician answered and $ changed, or physician answered but no $ change (i.e., POA or undetermined)
In analyzing this post-discharge pre-bill coder query summary report, the organization should:
- Examine the reasons for the volume of coders’ post-discharge pre-bill queries. Was the coder raising additional queries that the CDI team did not originally present during the patient’s hospitalization, prompted by documentation received post-discharge (e.g., discharge summary, pathology reports)? Or was the coder following up on the initial CDI query post-discharge?
- Drill down to the specific hospitals and physicians whose queries are answered appropriately and those whose queries go unanswered. Is additional training needed for the coding team on writing an appropriate query? Is there a specialty group, or are there individual physicians, who need coaching from a physician advisor? Should a process improvement plan be implemented to reduce the average days to respond per facility or per specialty group?
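The per-facility rates in Example #5 follow directly from the query counts. A sketch, using the first two facility rows from the table above as sample data:

```python
# Facility rows from the Q1 2020 coder query table:
# (total queries, agreed, dollars gained, canceled/not answered, dollars at risk)
facilities = {
    "A": (40, 39, 32_600, 1, 600),
    "B": (291, 261, 223_500, 30, 6_903),
}

for name, (total, agreed, gained, canceled, lost) in facilities.items():
    pct_agreed = agreed / total * 100      # answered with/without $ change
    pct_canceled = canceled / total * 100  # canceled or never answered
    print(f"Facility {name}: {pct_agreed:.2f}% agreed (${gained:,} gained), "
          f"{pct_canceled:.2f}% canceled/not answered (${lost:,} at risk)")
```

The same loop extended with a physician or specialty key supports the drill-down suggested above.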
Example #6: Query Response Time
Physician Response Time to Queries (Days)

|Facility|2018|2019|Change (%)|
|---|---|---|---|
|120-bed hospital|14.94|5.86|9.08 (61%)|
|465-bed hospital|4.85|4.13|0.72 (15%)|
|2,460-bed healthcare system|6.57|5.04|1.53 (23%)|
In analyzing this physician response time to queries summary report, the organization should:
- Examine the reasons for the 61 percent change in physician response time in the 120-bed hospital. Was this due to a positive change in processes? Was additional training provided to all coding professionals, CDI staff, and/or physicians/physician advisors? Was an electronic system implemented?
- Analyze the dollar impact of the bill hold for each day the physician does not respond to a concurrent or pre-bill query.
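The response-time change and a rough bill-hold estimate can be computed as in this sketch. The figures for the 120-bed hospital come from the Example #6 table; the average held charges per account is a hypothetical assumption, not a figure from the article:

```python
# 120-bed hospital figures from the Example #6 table.
days_2018, days_2019 = 14.94, 5.86
change_days = days_2018 - days_2019
change_pct = change_days / days_2018 * 100
print(f"Improvement: {change_days:.2f} days ({change_pct:.0f}%)")

# Rough bill-hold estimate: dollars sitting in accounts receivable while a
# query awaits a response. AVG_HELD_CHARGES is a hypothetical assumption.
AVG_HELD_CHARGES = 10_000  # assumed average charges per held account
dollar_days_held = days_2019 * AVG_HELD_CHARGES
print(f"~${dollar_days_held:,.0f} in dollar-days of delayed billing per queried account")
```

Multiplying the dollar-days figure across a quarter’s query volume makes the cash-flow cost of slow query response concrete for decision makers.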
These examples, statistics, and trends can be used to identify missed opportunities to enhance a CDI and coding DRG mismatch program, prove the positive effect on your organization’s quality documentation, create process improvement plans for physician queries, and measure financial impact.
Karen Youmans (email@example.com) is president at YES HIM Consulting, Inc.
Continuing Education Quiz
Review quiz questions and take the quiz based on this article, available online.
- Quiz ID: Q2039108
- Expiration Date: August 1, 2021
- HIM Domain Area: Revenue Cycle Management