Health Data, Workforce Development
What Machines Can’t Do without the Human Touch
Imagine that a health information (HI) director is handed a vendor demo for an artificial intelligence (AI)-assisted coding tool. The model promises 95 percent accuracy. The training data is proprietary. The audit trail is thin. She has two weeks to make a recommendation to the C-suite.
That’s not a technical problem. It’s a judgment call. And no algorithm is making it for her.
AI is rapidly reshaping healthcare operations: natural language processing supports coding and clinical documentation integrity (CDI) programs, predictive analytics drive population health and fraud detection, and automation is embedded across the data lifecycle. For HI professionals, long recognized as stewards of data quality, privacy, compliance, and governance, this evolution does not diminish the profession’s value. It sharpens it. As automation expands, the competencies that remain distinctly human become more visible and more essential. They are not soft skills. They are the essence of HI practice.
Emotional Intelligence: The Human Layer of Data
HI professionals sit at the intersection of technology, clinicians, administrators, and patients, a position that requires reading more than what’s on the screen. AI can flag documentation anomalies, but it cannot sit with a clinician exhausted by a new electronic health record (EHR) workflow and understand what’s actually driving her resistance.
Healthcare data is only valuable when it’s trusted. That only happens when people have confidence in the individuals managing the data. And that confidence is not built through dashboards; it is built through consistency: the release-of-information manager who never bends a process under pressure, the coding director who flags a discrepancy that would have gone unnoticed. AI governance committees, vendor negotiations, regulatory advisory roles: all of them turn, at some point, on whether the people in the room are trusted. That reputation is not assigned. It is earned, slowly, through principled action.
Ethical Judgment When the Rules Run Out
Regulations evolve more slowly than technology, which means HI professionals regularly navigate terrain no policy fully covers. How should a health system govern a generative AI model trained on its own patient records? What do you do when two compliance frameworks point in different directions? Systems can surface options. Humans weigh those options against professional codes, organizational mission, and community expectations. The ability to make a defensible decision in the presence of ambiguity is central to HI leadership. It always has been.
Inspiring Change, Not Just Managing It
Digital transformation runs on belief. HI professionals translate technical requirements into patient-centered stories. Strengthening a patient identity matching program isn’t an IT upgrade; it’s fewer wrong-patient errors in the operating room. Rebuilding a data quality workflow isn’t process improvement; it’s the integrity of the record a physician trusts when making a clinical decision. When teams understand those stakes, change moves faster and sticks harder.
HI work is inherently interdisciplinary: privacy officers partner with legal counsel, informaticists work alongside IT, and coding leaders coordinate with clinical departments whose documentation habits directly affect reimbursement and quality metrics. AI can summarize meeting notes. It cannot tell when a physician’s silence during a CDI discussion signals confusion rather than agreement, or when a long-standing interdepartmental tension is about to derail a governance initiative. Asking the right question at the right moment, and genuinely listening, often surfaces the risk no system would have caught.
Mentorship: How Judgment Gets Passed On
As entry-level tasks become more automated, preparing the next generation of HI professionals becomes more consequential, not less. Emerging competencies such as AI auditing, algorithmic governance, and advanced privacy analytics require more than technical training. They require judgment, and judgment is developed through relationships. Mentorship is how institutional knowledge survives system transitions and leadership turnover, and how the profession’s values reach practitioners who will face challenges we haven’t encountered yet. No algorithm teaches that.
Personal Courage in High-Stakes Decisions
Not every call is popular. Delaying a system go-live because data integrity concerns haven’t been resolved. Flagging an algorithmic output that clinical leadership is eager to trust. Raising a compliance issue when the revenue pressure is real and the timeline is tight. These moments require something no system can supply: the willingness to act in the best interest of patients when the evidence is incomplete and the easier path is silence. That courage is what defines trusted HI leadership.
AI will continue to transform how health information is captured, analyzed, and exchanged. The HI director facing that coding tool decision will face harder versions of it next year, and more of them. The profession’s value will not be measured by how much it automates, but by how well it governs what automation cannot.
Emotional intelligence. Ethical judgment. Mentorship. Courage. These are not residual human traits waiting to be automated. They are the core of what HI professionals do, and in an era shaped by algorithms, precisely what makes the profession indispensable.
The machine can process the record, but it can’t be accountable for it.
Jennifer Mueller, MBA, RHIA, SHIMSS, FACHE, FAHIMA, FACHDM, is Senior Vice President: Health Information Career Advancement & Academic Affairs at AHIMA.