The revenue cycle is ripe for innovation, and new artificial intelligence (AI) technologies continue to draw the attention of health leaders. While the various applications promise to improve clinical care as well as billing and coding efficiency, integrating AI into already complex and data-rich workflows is no easy feat.
Despite these challenges, many health systems are moving forward with revenue cycle AI tools out of necessity. In a 2023 AHIMA survey, 83 percent of respondents reported an uptick in persistent staffing shortages at their workplaces over the past year, contributing to lower patient data quality and increased claims denials. These shifting workforce dynamics contributed to nearly half of respondents saying their departments had already adopted AI and machine learning (ML).
AI will affect how medical coders and health information (HI) professionals do their jobs, but experts say they can leverage their knowledge to ease AI adoption roadblocks for their organizations. By partnering with physicians and other key stakeholders, they can overcome documentation challenges, grow their understanding of AI technology, and contribute to meaningful data collection that enhances reimbursement and patient care.
Dial In Documentation
AI-based systems depend on predictive modeling, so if the current quality of coding and documentation is subpar, it will drag down the efficiency of any AI program, says Barbara Auguste, MS, RHIA, CPC, CPMA, healthcare operations and compliance manager at Weill Cornell Medicine in New York City. Certified medical coders are instrumental in establishing better baseline coding accuracy and, together with HI colleagues, can spearhead education initiatives.
"You definitely need to have a collaborative education effort among doctors and revenue cycle, plus others like information technology (IT), because everyone looks at data from a different angle," she says. IT may primarily focus on the "legwork" of how raw data translates to ML algorithms. Meanwhile, Auguste says that medical coders want to ensure billing information populates correctly within the electronic health record (EHR), and physicians want to know they can access patient care data.
According to Auguste, an interdisciplinary team can get everyone on the same page regarding the need for consistent, clear, concise documentation. "Because if I can't see it on paper, I'm going to have a hard time accurately coding and billing for the services rendered, and AI can't learn from information that isn't there, either."
"Thinking in ink" is a prompt she refers to often during education sessions to remind physicians and other clinicians to document more thoroughly. Auguste, who also has experience training AI systems, says when HI professionals understand what the machine must learn to code claims correctly, they can engineer that to guide documentation education and ensure the desired data is fed into the EHR.
Become a Pro at Free AI Tools
Although discussions of AI in healthcare are everywhere right now, it's not always easy for HI professionals and providers to visualize what that means for their jobs and patient care, says Megan Pruente, MPH, RHIA, director of professional services at Harris Data Integrity Solutions in Carlsbad, CA.
"Many people are wary of AI and uncertain about how and from where data is captured, especially in large language models (LLMs) that are trained on significant subsets of data and can communicate back and forth with you," she says.
Pruente predicts AI will soon touch every health professional's daily work, whether in direct patient care, administration, IT, or HI, so now is the time to upskill and expand AI proficiency. ChatGPT, given its free and widespread availability, is an excellent place to start, she says. The application is not secure and, therefore, not suitable for protected health information (PHI), so Pruente suggests using it to streamline administrative tasks, such as drafting department policies, emails, and appeal letters. Providers may explore how detailed prompts can create a script for difficult patient conversations or support clinical decision-making by quickly summarizing available medical literature.
These experiences allow users to:
- Identify AI's benefits and shortfalls;
- Critically analyze the quality of the output;
- Contribute to internal governance oversight conversations within their organizations; and
- Assess vendors' AI solutions.
"Tools similar to ChatGPT are eventually going to become safe for PHI, like Microsoft Copilot, so anyone who has already utilized the free version can translate those skills into a secure setting and be miles ahead," says Pruente.
Don't Replace Human Knowledge and Skill
Physicians and HI professionals may feel frustrated by evolving workflows, so introducing yet another technology may not receive a warm welcome. Complicating these feelings is the reality that AI adoption decisions often originate in the C-suite, leaving employees to wonder how the implementation will impact their jobs, says Michael Gao, MD, co-founder and CEO of New York-based SmarterDx, which makes AI prebill documentation and coding review software.
Many organizations may find autonomous coding for outpatient and professional fee billing very effective, Gao says, because only spot-checks are necessary once AI reaches a certain confidence level. However, the complexity of inpatient coding and clinical documentation improvement still requires consistent human involvement.
"Our tool is designed to help clinical documentation improvement and coders do their jobs even better, not disrupt or automate them," Gao says. "After the encounter has been reviewed, our AI does another scan through the chart, like a spell-checker, and finds potential additional diagnoses."
With payers already using AI to deny claims, he says hospitals and other health organizations can counter with human-assisted AI. "If a payer says the pneumonia diagnosis isn't justified, our algorithms can parse clinical data like labs, medications, orders, and vitals, and automatically pull the chest x-ray, oxygen levels, and respiratory rate to template out a response for the denial team to review," he says.
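The human-assisted workflow Gao describes — parsing structured clinical data and templating an appeal for a human reviewer — can be sketched roughly as follows. This is a hypothetical illustration, not SmarterDx's actual implementation; the field names, evidence map, and template text are all assumptions made for the example:

```python
# Hypothetical sketch: gather structured chart evidence that supports a
# denied diagnosis and draft a templated appeal for the denial team.

# Illustrative map of which chart items tend to support a pneumonia
# diagnosis (not a clinical standard).
PNEUMONIA_EVIDENCE = {
    "labs": {"wbc"},                  # elevated white blood cell count
    "vitals": {"spo2", "resp_rate"},  # oxygen saturation, respiratory rate
    "imaging": {"chest_xray"},        # infiltrate on chest x-ray
}

def draft_denial_response(denied_dx: str, chart: dict) -> str:
    """Collect chart items relevant to the denied diagnosis and render
    a draft appeal; a human reviews it before anything is submitted."""
    evidence = []
    for category, wanted in PNEUMONIA_EVIDENCE.items():
        for item in chart.get(category, []):
            if item["name"] in wanted:
                evidence.append(f"- {item['name']}: {item['value']}")
    if not evidence:
        return f"No structured evidence found for {denied_dx}; route to coder."
    lines = "\n".join(evidence)
    return (f"Appeal draft for denied diagnosis '{denied_dx}':\n"
            f"Supporting clinical indicators:\n{lines}\n"
            f"[Pending human review before submission]")

# Example chart with illustrative values.
chart = {
    "labs": [{"name": "wbc", "value": "14.2 K/uL"}],
    "vitals": [{"name": "spo2", "value": "88% on room air"},
               {"name": "resp_rate", "value": "24/min"}],
    "imaging": [{"name": "chest_xray", "value": "right lower lobe infiltrate"}],
}
print(draft_denial_response("pneumonia", chart))
```

The key design point mirrors Gao's description: the algorithm only assembles and templates the evidence; the denial team still reviews and approves every response.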
Given their significant impact on revenue cycle AI, HI professionals and providers are in a prime position to leverage their expertise in holding vendors accountable. "Vendors promise a half dozen things and then offer relatively weak measurements for success that are easily confounded," Gao says. "So the health information or clinical people might say to leadership, 'These are the measurements I would use instead to assess if this AI program is working at the level I'd expect.' "
Looking ahead, physicians and coders will need to reimagine their roles and envision the future of their work. For example, Gao explains that ambient clinical AI may capture the physician's notes during patient encounters and populate within the EHR in real time. When an HI professional reviews those notes, they'll need to consider that something may have been misheard or misinterpreted and query the physician accordingly. He says coders will likely take on more supervisory roles, utilizing AI tools to strengthen documentation and data collection and maximize claim reimbursement.
Algorithm accuracy is an essential component, though Gao warns that even the best algorithms may not pick up all the necessary information. Merging the data-processing capabilities of AI with human decision-making can be the best of both worlds.
Pruente agrees: "We can't trust everything that AI produces. There still has to be an adult in the room analyzing the output and making sure the technology has the right guardrails."
Steph Weber is a Midwest-based freelance journalist specializing in healthcare and law.