Revenue Cycle

The Revenue Impact of CDI and Coding Mismatches and Physician Query Analysis

Collaboration between clinical documentation integrity (CDI) specialists and health information management (HIM) coding professionals is key to a successful CDI program and to evaluating its revenue impact on the healthcare facility. When the CDI specialist’s final MS-DRG (or APR-DRG) does not agree with the coder’s final MS-DRG (or APR-DRG), there is a “mismatch.” Many organizations struggle to find a balance between these two teams, but with collaboration, communication, and education, performing mismatch reviews on a concurrent, pre-bill basis can have a significant revenue impact.

Organizations need the appropriate data to accurately validate their CDI program as well as to account for coders’ post-discharge pre-bill queries. This challenge goes hand in hand with determining the financial impact of the concurrent CDI program, the bill holds created by coders’ post-discharge pre-bill queries, and alignment with the healthcare facility’s goals. An effective analysis of the data trends and statistics surrounding CDI and coding mismatches, along with the effectiveness of the physician query process, is essential in determining the program’s positive effect on the organization’s documentation quality and finances.

To identify the revenue impact of a CDI and coding mismatch program and of physician queries, key data must be analyzed, including trends in MS-DRGs, principal diagnoses (PDx), complications and comorbidities (CCs), major complications and comorbidities (MCCs), and ICD-10-PCS code assignments. In addition, coders’ post-discharge pre-bill queries, physician query compliance, and turnaround time for answered queries must be analyzed. These data elements provide deeper insights and support an accurate validation of a CDI and coding mismatch program and its impact on the bottom line, as well as analysis of the physician queries themselves.

Examples of key findings from CDI and coding mismatch reviews, as well as physician query outcomes, are provided below. Each example includes questions that will help decision makers draw insights for continuous improvement and validation of the program.

Example #1: CDI vs. Coder DRG Mismatch Summary Report

When the CDI specialist’s and the coder’s final DRGs do not match, the account is placed on hold for an auditor (internal or external) to make the final DRG determination.

CDI vs. Coder DRG Mismatch

Q1 2020 Summary Report

 
Reviewer Findings | Total Cases | % of Total | DRG $ (estimated)
Agreed with Coder DRG | 3,800 | 74.51% | $7,600,000
Agreed with CDI DRG | 800 | 15.69% | $2,400,000
Neither – different DRG | 500 | 9.80% | $1,250,000
Total | 5,100 | 100.00% | $11,250,000

In analyzing the above CDI vs. coder DRG mismatch summary report, the organization should:

  • Examine the reasons why the reviewer agreed with coding and disagreed with CDI. Is additional ICD-10-CM/PCS coding training or official coding guideline education needed for the CDI team?
  • Evaluate the specific MS-DRGs where the reviewer agreed with CDI or did not agree with either (the “neither” category). Is there an opportunity for additional clinical training for the coders? Are additional queries required? Is re-education on official coding guidelines needed?
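
To tie the summary back to the underlying data, the distribution and estimated dollars can be recalculated directly from the review findings. The following is a minimal Python sketch, assuming the case counts and estimated DRG reimbursement per category are exported from the organization’s mismatch review workflow; the category names and figures simply mirror the Q1 2020 example above.

```python
# Minimal sketch: summarize DRG mismatch review findings by reviewer decision.
# The counts and estimated DRG dollars mirror the Q1 2020 example above; in
# practice they would be exported from the CDI/coding reconciliation workflow.

findings = {
    "Agreed with Coder DRG":   {"cases": 3800, "drg_dollars": 7_600_000},
    "Agreed with CDI DRG":     {"cases": 800,  "drg_dollars": 2_400_000},
    "Neither - different DRG": {"cases": 500,  "drg_dollars": 1_250_000},
}

total_cases = sum(f["cases"] for f in findings.values())
total_dollars = sum(f["drg_dollars"] for f in findings.values())

for name, f in findings.items():
    pct = f["cases"] / total_cases * 100
    print(f"{name:<26} {f['cases']:>6,} {pct:>7.2f}%  ${f['drg_dollars']:>12,}")
print(f"{'Total':<26} {total_cases:>6,} {100.0:>7.2f}%  ${total_dollars:>12,}")
```
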
Example #2: CDI Query Outcomes Summary
CDI Queries MS-DRG Changes – Q1 2020

CDI Query Outcomes | # Queries | % of Queries | DRG $ (estimated gain)
PDx Changed | 598 | 32.08% | $1,794,000
CC Added | 187 | 10.03% | $504,900
MCC Added | 158 | 8.48% | $489,800
PCS Changed | 33 | 1.77% | $95,700
No change | 888 | 47.64% | $0
Total | 1,864 | 100.00% | $2,884,400

In analyzing this CDI query outcomes summary report, the organization should:

  • Assess the reasons why there was no change in the MS-DRG 47.64 percent of the time. There can be many reasons for a query with no financial impact, including queries related to patient safety indicators (PSIs), hospital-acquired conditions (HACs), or severity of illness/risk of mortality (SOI/ROM). Is additional clinical training or coding guideline education needed for the CDI team on writing an appropriate query? Were there specific MS-DRGs in this category that need further analysis? Was a specialty group more heavily involved in the queries (e.g., cardiology, neurology, internal medicine, general surgery)? Was a specific physician more involved in the queries (e.g., not answering queries in a timely manner)?
  • Identify specific principal diagnosis changes. Were there specific MS-DRGs in this category that warrant a deeper dive? Is documentation training needed for specific specialty groups (e.g., hospitalists, cardiology, nephrology, internal medicine)?
  • Examine the specific CCs/MCCs added for opportunities to provide additional clinical training for the coders or additional documentation education for physicians.
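
The query outcome mix and the share of queries with no MS-DRG change can be tracked the same way. The following Python sketch is illustrative only: the outcome counts mirror the Q1 2020 table above, and the 40 percent follow-up threshold is an assumption, not a published benchmark.

```python
# Minimal sketch: roll up CDI query outcomes and flag a high "no change" rate.
# Counts and estimated gains mirror the Q1 2020 example; the 40 percent
# threshold for follow-up review is an illustrative assumption.

outcomes = [
    ("PDx Changed", 598, 1_794_000),
    ("CC Added",    187,   504_900),
    ("MCC Added",   158,   489_800),
    ("PCS Changed",  33,    95_700),
    ("No change",   888,         0),
]

total_queries = sum(count for _, count, _ in outcomes)
total_gain = sum(gain for _, _, gain in outcomes)

for name, count, gain in outcomes:
    print(f"{name:<12} {count:>5} {count / total_queries:>8.2%}  ${gain:>10,}")
print(f"{'Total':<12} {total_queries:>5} {1:>8.2%}  ${total_gain:>10,}")

no_change = next(count for name, count, _ in outcomes if name == "No change")
if no_change / total_queries > 0.40:  # illustrative threshold
    print("Follow up on the drivers of queries with no MS-DRG change.")
```
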
Example #3: MS-DRG Query Distribution – Original MS-DRG
Date Range: 10/01/18 – 09/30/19
Original DRG | Total Queries | Total Impact | Average Impact
690 | 147 | $284,666.65 | $1,936.51
871 | 129 | $97,344.01 | $754.60
193 | 108 | $263,273.45 | $2,437.72
689 | 106 | $462,376.91 | $4,362.05
603 | 98 | $269,014.43 | $2,745.05
190 | 98 | $143,982.52 | $1,469.21
392 | 89 | $160,315.17 | $1,801.29
872 | 86 | $312,900.75 | $3,638.38
194 | 77 | $291,647.73 | $3,787.63
189 | 70 | $173,337.29 | $2,476.25
292 | 53 | $137,784.89 | $2,599.71
191 | 51 | $76,905.98 | $1,507.96
682 | 48 | $47,529.09 | $990.19
291 | 48 | $123,880.42 | $2,580.84
192 | 47 | $139,707.27 | $2,972.50
378 | 47 | $159,825.63 | $3,400.55
683 | 45 | $98,068.01 | $2,179.29
379 | 45 | $102,333.44 | $2,274.08
313 | 41 | $106,410.95 | $2,595.39
948 | 40 | $49,365.33 | $1,234.13

In analyzing this original MS-DRG query distribution summary report, the organization should:

  • Analyze the top original MS-DRGs. Is additional clinical training or coding guideline education needed for the CDI or coding teams for particular specialties? Was a specialty group more heavily involved in the queries (e.g., urology, internal medicine, hospitalists)? Is there a need for a more focused study on a clinical topic (e.g., sepsis)?
  • Identify specific MS-DRG triplets/pairs for further analysis (e.g., 291/292, 689/690).
  • Examine the specific CCs/MCCs added for opportunities to provide additional clinical training for the coders or additional documentation education for physicians.
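
Ranking original MS-DRGs by query volume and average impact per query is a simple way to pick education targets. The sketch below uses a few rows from the table above as sample data; in practice the full export from the organization’s query tracking tool would be loaded instead.

```python
# Minimal sketch: rank original MS-DRGs by query volume and compute the average
# impact per query. Only a few rows from the table above are used as sample data.

rows = [
    # (original MS-DRG, total queries, total impact in dollars)
    (690, 147, 284_666.65),
    (871, 129,  97_344.01),
    (193, 108, 263_273.45),
    (689, 106, 462_376.91),
]

for drg, queries, impact in sorted(rows, key=lambda r: r[1], reverse=True):
    print(f"DRG {drg}: {queries} queries, total ${impact:,.2f}, "
          f"average ${impact / queries:,.2f} per query")
```
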
Example #4: MS-DRG Query Distribution – Revised MS-DRG
Date Range: 10/01/18 – 09/30/19
Revised DRG | Total Queries | Total Impact | Average Impact
871 | 449 | $1,178,830.96 | $2,625.46
872 | 208 | $111,412.30 | $535.64
698 | 116 | $190,472.30 | $1,642.00
291 | 113 | $280,999.21 | $2,486.72
853 | 109 | $1,411,966.59 | $12,953.82
189 | 76 | $87,314.51 | $1,148.88
699 | 71 | $102,258.34 | $1,440.26
177 | 69 | $233,081.30 | $3,377.99
193 | 62 | $49,833.21 | $803.76
280 | 57 | $153,328.68 | $2,689.98
194 | 50 | $9,828.73 | $196.57
854 | 47 | $175,227.36 | $3,728.24
190 | 46 | $62,180.20 | $1,351.74
378 | 42 | $65,448.44 | $1,558.30
392 | 39 | -$19,885.57 | -$509.89
191 | 38 | -$10,740.03 | -$282.63
314 | 37 | $81,000.69 | $2,189.21
683 | 36 | -$22,024.35 | -$611.79
917 | 34 | -$48,360.72 | -$1,422.37
682 | 34 | $101,379.09 | $2,981.74

In analyzing this revised MS-DRG query distribution summary report, the organization should:

  • Analyze the top revised (finalized) MS-DRGs after query. Is there a need for a more focused study on a clinical topic (e.g., sepsis)? Is there an opportunity for additional clinical training for the CDI or coding teams, or additional documentation education for physicians?
  • Identify specific MS-DRG triplets/pairs for further analysis (e.g., 871/872, 682/683).
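
One way to act on the second bullet is to roll the revised MS-DRGs up into their triplet/pair families and flag any that contain a negative-impact DRG. The Python sketch below uses a handful of rows from the table above; the family groupings shown (871/872 and 682/683) are taken from the bullet’s examples, and the rest of the mapping would come from the organization’s own DRG reference data.

```python
# Minimal sketch: group revised MS-DRGs into triplet/pair families and flag
# families containing a DRG with a negative total impact. Sample rows come from
# the table above; the family mapping is illustrative.

rows = {  # revised MS-DRG: (total queries, total impact in dollars)
    871: (449, 1_178_830.96),
    872: (208,   111_412.30),
    682: ( 34,   101_379.09),
    683: ( 36,   -22_024.35),
}

families = {
    "871/872": [871, 872],
    "682/683": [682, 683],
}

for family, drgs in families.items():
    queries = sum(rows[d][0] for d in drgs)
    impact = sum(rows[d][1] for d in drgs)
    flag = "  <- review negative-impact DRG" if any(rows[d][1] < 0 for d in drgs) else ""
    print(f"DRGs {family}: {queries} queries, net impact ${impact:,.2f}{flag}")
```
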
Example #5: Post-Discharge Pre-Bill Query Summary
 
CODER QUERY RESULTS
First Quarter 2020

Facility | Total Queries | Agreed | % Agreed* | Dollars Gained | Canceled or Not Answered | % Canceled or Not Answered | Potential Dollars Lost | Average Days to Respond
A | 40 | 39 | 97.50% | $32,600 | 1 | 2.50% | $600 | 4.08
B | 291 | 261 | 89.69% | $223,500 | 30 | 10.31% | $6,903 | 4.93
C | 62 | 52 | 83.87% | $108,471 | 10 | 16.13% | $2,297 | 5.08
D | 472 | 399 | 84.53% | $404,726 | 73 | 15.47% | $124,341 | 6.66
E | 102 | 83 | 81.37% | $65,894 | 19 | 18.63% | $11,224 | 7.26
F | 398 | 374 | 93.97% | $412,662 | 24 | 6.03% | $26,896 | 4.30
G | 183 | 179 | 97.81% | $152,235 | 4 | 2.19% | $5,320 | 3.76
H | 441 | 421 | 95.46% | $508,316 | 20 | 4.54% | $8,073 | 3.21
I | 113 | 109 | 96.46% | $195,260 | 4 | 3.54% | $1,400 | 3.61
J | 167 | 161 | 96.41% | $176,244 | 6 | 3.59% | $2,100 | 2.99
K | 26 | 22 | 84.62% | $19,432 | 4 | 15.38% | $1,349 | 3.73
L | 242 | 211 | 87.19% | $223,066 | 31 | 12.81% | $23,262 | 4.67
Total | 2,537 | 2,311 | 91.09% | $2,522,404 | 226 | 8.91% | $213,766 | 4.63
*Agreed = physician answered and $ changed, or physician answered but no $ change (i.e., POA or undetermined)
 

In analyzing this post-discharge pre-bill coder query summary report, the organization should:

  • Examine the reasons for the volume of coders’ post-discharge pre-bill queries. Were coders initiating queries that the CDI team did not present during the patient’s hospitalization, based on items such as documentation received post-discharge (e.g., the discharge summary or pathology reports)? Or were coders following up on initial CDI queries post-discharge?
  • Drill down to the specific hospitals and physicians whose queries are answered appropriately and those whose queries go unanswered. Is additional training needed for the coding team on writing an appropriate query? Is there a specialty group, or are there individual physicians, who need coaching from a physician advisor? Should a process improvement plan be implemented to address the average days to respond per facility or per specialty group?
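
The per-facility drill-down can also be scripted: agreement rate, dollars at risk, and average days to respond are all derivable from the same export. The following sketch uses three facilities from the table above as sample data; the five-day response-time target is an illustrative assumption, not a benchmark.

```python
# Minimal sketch: score post-discharge pre-bill query performance by facility and
# flag facilities whose average response time exceeds a target. Sample rows come
# from the table above; the 5-day target is an illustrative assumption.

facilities = {
    # facility: (total queries, agreed, dollars gained, potential dollars lost, avg days to respond)
    "D": (472, 399, 404_726, 124_341, 6.66),
    "E": (102,  83,  65_894,  11_224, 7.26),
    "H": (441, 421, 508_316,   8_073, 3.21),
}

TARGET_DAYS = 5.0  # illustrative response-time target

for name, (total, agreed, gained, lost, days) in facilities.items():
    flag = "  <- above response-time target" if days > TARGET_DAYS else ""
    print(f"Facility {name}: {agreed / total:.2%} agreed, ${gained:,} gained, "
          f"${lost:,} at risk, {days:.2f} avg days to respond{flag}")
```
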
Example #6: Query Response Time
Physician Response Time to Queries (average days)
Facility | 2018 | 2019 | Change in Days (%)
120-Bed Hospital | 14.94 | 5.86 | 9.08 (61%)
465-Bed Hospital | 4.85 | 4.13 | 0.72 (15%)
2,460-Bed Healthcare System | 6.57 | 5.04 | 1.53 (23%)

In analyzing this physician response time to queries summary report, the organization should:

  • Examine the reasons for the 61 percent improvement in physician response time at the 120-bed hospital. Was this due to a positive change in processes? Was additional training provided to coding professionals, CDI staff, and/or physicians and physician advisors? Was an electronic system implemented?
  • Analyze the dollar impact of the bill hold for each day the physician does not respond to a concurrent or pre-bill query.
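
Both bullets lend themselves to a quick calculation: the year-over-year change in response time, and an estimate of the dollars sitting in accounts receivable while queries wait for an answer. The sketch below uses the 120-bed hospital’s response times from the table above; the open-query count and average dollars held per query are illustrative assumptions.

```python
# Minimal sketch: compute the year-over-year change in physician response time
# and estimate receivables held while queries await answers. Response times
# mirror the 120-bed hospital above; the open-query volume and average dollars
# held per query are illustrative assumptions.

days_2018, days_2019 = 14.94, 5.86
change_days = days_2018 - days_2019
change_pct = change_days / days_2018 * 100
print(f"Response time improved by {change_days:.2f} days ({change_pct:.0f}%)")

open_queries = 120        # assumed number of concurrent/pre-bill queries awaiting answers
avg_dollars_held = 8_500  # assumed average reimbursement held per open query
print(f"Estimated ${open_queries * avg_dollars_held:,} held in accounts receivable "
      f"for each day those queries remain unanswered")
```
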

These examples, statistics, and trends can be used to identify missed opportunities, enhance a CDI and coding DRG mismatch program, demonstrate the positive effect on the organization’s documentation quality, create process improvement plans for physician queries, and measure financial impact.


 

Karen Youmans (kyoumans@yes-himconsulting.com) is president at YES HIM Consulting, Inc.

[box type="info"]

Continuing Education Quiz

Review quiz questions and take the quiz based on this article, available online.

  • Quiz ID: Q2039108
  • Expiration Date: August 1, 2021
  • HIM Domain Area: Revenue Cycle Management

[/box]