By Michael Gleeson
Our industry has made significant recent progress towards adopting common data and interoperability standards, but we still have a long way to go before we can seamlessly share data to improve the health of populations and reduce the cost of healthcare.
Interoperability standards for data exchange are helping improve care for individual patients in use cases that range from better transition-of-care management to emergency department visit follow-up by primary care. However, our collective thinking about interoperability is still immature when it comes to the population-level use cases needed to drive outcomes for value-based care, especially those related to reducing the cost of care. Interoperability needs to be about populations, and not just individual patients.
Current approaches to interoperability break down when we start talking about sharing information for tens of thousands of patients, let alone millions. To build a truly interoperable healthcare information ecosystem that supports even low-complexity population-level use cases, we need to consider three things:
- Are we merely transporting data, or curating it to ensure it is fit for use?
- Can our data exchange mechanisms support analytics?
- Are we sharing information to downstream apps from a single source, or from a comprehensive, 360-degree longitudinal medical record?
We need to keep these three questions in mind as we think about the requirements of new federal interoperability rules and the broader opportunities that they offer for interoperability investment.
The New Federal Interoperability Rules: A Primer
Three new federal interoperability rules will soon come into play:
- The Trusted Exchange Framework and Common Agreement (TEFCA) is a voluntary framework that allows health information networks to exchange information.
- The ONC Cures Act Final Rule implements requirements of the 21st Century Cures Act regarding information blocking and interoperability.
- The CMS Interoperability and Patient Access Final Rule extends the spirit of the 21st Century Cures Act to CMS-regulated payers and participating providers.
The ONC and CMS rules will hasten the widespread adoption of Fast Healthcare Interoperability Resources (FHIR), the data-exchange standard that will govern how APIs structure and transport data that healthcare organizations share with patient-facing apps. (Application programming interfaces, or APIs, are essentially software components that enable the exchange of data between two systems.)
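To make the FHIR standard concrete, the sketch below parses a minimal, hypothetical FHIR R4 Patient resource of the kind a third-party app might receive from a Patient Access API. The resource content is invented for illustration; the value of the standard is that any conformant application can read the same fields in the same places.

```python
import json

# A minimal, hypothetical FHIR R4 Patient resource, as JSON.
patient_json = """
{
  "resourceType": "Patient",
  "id": "example-123",
  "name": [{"family": "Smith", "given": ["Jane"]}],
  "birthDate": "1980-04-12"
}
"""

patient = json.loads(patient_json)

# Because FHIR defines where each element lives, a consuming app
# can extract fields without source-system-specific logic.
family_name = patient["name"][0]["family"]
birth_date = patient["birthDate"]
print(family_name, birth_date)  # Smith 1980-04-12
```

In practice an app would retrieve such a resource over a FHIR REST endpoint rather than from an inline string, but the structure it receives is the same.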
For payers, the new rules require them to implement FHIR APIs when sharing patient data. For providers, the regulations call for the adoption of electronic health record (EHR) systems that have been certified to align with FHIR data-exchange standards.
The first major deadline for payers looms in July 2021. By then, they must implement the “Patient Access API,” which will allow patients to access their data through any third-party application they choose to connect to the API and could also be used to integrate a health plan’s information into a patient’s electronic health record. In 2022, the regulations require health plans to share member data with other plans as members change insurers. Providers also face a 2022 deadline to use certified EHRs that support FHIR. However, full EHR upgrades may roll into the following year.
Among the most important factors driving the new interoperability rules is the expectation that patients will increasingly use smartphone apps to connect with their healthcare providers and health plans to access their own healthcare data. We will likely see significant innovation in the consumer application space. However, patients may not realize that their health records are no longer protected by HIPAA once they download them to an app, and they may not have all the information they need to evaluate the security of any given application. Hospitals and health systems may want to educate patients about safely using applications and develop a preferred list of apps for patients to choose from as they look to gain more control over their own health information.
These new rules help lay the groundwork for important advancements in healthcare data exchange that will benefit patient care and access to information. However, HIM professionals will need to go further to enable the care of populations by ensuring that our interoperability initiatives are supported by a foundation of complete data.
Incomplete Data Impedes Healthcare Organizations
We are still at the beginning of our healthcare interoperability journey. Most healthcare organizations remain hindered by the lack of a consistent view of patient data across their many clinical, financial, and demographic information systems. This lack of a 360-degree view of patient data may have consequences that include uncontrolled chronic conditions, avoidable utilization, escalating costs, and poorer patient care outcomes.
Providers with 360-degree insight into patient care can catch problems that would otherwise lead to negative outcomes. For example, a primary care provider sees a patient for a routine check-up and asks if the patient has had any health issues since the last encounter. The patient says that nothing has changed. However, the provider sees data from an external source indicating a recent hospital visit. When the provider asks about it, the patient reveals a major surgery with no follow-up appointments scheduled with the appropriate specialists. The provider can intervene to avoid a poor outcome for this patient.
And on a systemic level, having complete 360-degree visibility into the care of patient populations is the only way to address variability in the quality and cost of care at scale.
Bringing Populations into Interoperability Thinking
So how do we push ourselves to think about interoperability in a way that supports the critical need for complete data about individual patients and populations at large?
First, care about data curation. Often, we focus on the mechanisms of data transport and neglect data curation. Simply transporting data more efficiently is not enough. We need to think about how that data will be interpreted by both downstream systems and human users. As a simple example, we might expect the documentation of pregnancy test results to be rather clear-cut—either a patient is pregnant, or not. In reality, a single source system might contain hundreds of test results ranging from numerical values of HCG levels to free text (my favorite example is “pregnant, but not really”) that need to be normalized.
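A toy sketch of what such normalization involves is below. The threshold and rules are invented for illustration; real curation pipelines rely on terminology services and clinically validated logic, not ad hoc string matching.

```python
import re

def normalize_pregnancy_result(raw: str) -> str:
    """Map heterogeneous source-system result strings to one normalized value.

    Illustrative only: the 25 mIU/mL threshold and keyword rules are
    hypothetical stand-ins for clinically validated normalization logic.
    """
    text = raw.strip().lower()
    # Quantitative hCG results arrive as bare numbers in some systems.
    if re.fullmatch(r"\d+(\.\d+)?", text):
        return "positive" if float(text) >= 25 else "negative"
    if "not pregnant" in text or text in {"negative", "neg"}:
        return "negative"
    # Contradictory free text ("pregnant, but not really") needs human review.
    if "pregnant" in text and "not" in text:
        return "needs_review"
    if "pregnant" in text or text in {"positive", "pos"}:
        return "positive"
    return "unknown"

print(normalize_pregnancy_result("150"))                      # positive
print(normalize_pregnancy_result("Negative"))                 # negative
print(normalize_pregnancy_result("pregnant, but not really")) # needs_review
```

The point is not the specific rules but that some layer must resolve this variability before downstream systems or analysts can trust the data.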
Second, consider the needs of your analysts. To effectively answer questions ranging from clinical inquiries about the needs of disengaged populations to medical economics concerns about contract performance, analysts need vast amounts of normalized data about patient populations. Any interoperability strategy must enable bulk transport of thousands or even millions of records without overwhelming your systems.
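FHIR's Bulk Data export, for example, delivers resources as newline-delimited JSON (NDJSON), which can be processed one record at a time rather than loaded wholesale. A minimal sketch of such streaming consumption, using an invented two-record sample in place of a real export file:

```python
import io
import json

def iter_ndjson_resources(stream):
    """Lazily parse an NDJSON stream (the format FHIR Bulk Data exports use),
    yielding one resource at a time so millions of records never need to be
    held in memory at once."""
    for line in stream:
        line = line.strip()
        if line:
            yield json.loads(line)

# Hypothetical two-line extract from a bulk Patient export.
sample = io.StringIO(
    '{"resourceType": "Patient", "id": "p1"}\n'
    '{"resourceType": "Patient", "id": "p2"}\n'
)
ids = [resource["id"] for resource in iter_ndjson_resources(sample)]
print(ids)  # ['p1', 'p2']
```

Because the generator yields records as it reads them, memory use stays flat no matter how large the export grows—the property that makes population-scale analytics feasible.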
Third, ensure that your downstream applications have a 360-degree view of your patient populations. Rather than populating downstream applications with piecemeal data from various source systems, consider feeding those source systems into a centralized data asset that can support all your organization’s use cases. If you first combine source system data into a curated and enhanced 360-degree medical record, you can then use APIs like FHIR to push that highly useful longitudinal view of the patient to your downstream applications.
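The merge step at the heart of that approach can be sketched as follows. This is a deliberately simplified model with invented feed names; real implementations also require patient identity matching, deduplication, and terminology normalization before records can be safely combined.

```python
from collections import defaultdict

def build_longitudinal_record(feeds):
    """Combine per-source event feeds into one chronological record per patient.

    Toy sketch: assumes patient IDs are already matched across sources,
    which in reality requires a dedicated identity-resolution step.
    """
    records = defaultdict(list)
    for source, events in feeds.items():
        for event in events:
            # Tag each event with its source so provenance survives the merge.
            records[event["patient_id"]].append({**event, "source": source})
    # Order each patient's history chronologically for downstream apps.
    for history in records.values():
        history.sort(key=lambda e: e["date"])
    return dict(records)

# Hypothetical feeds from two source systems describing the same patient.
feeds = {
    "ehr": [{"patient_id": "p1", "date": "2021-03-01", "type": "office_visit"}],
    "claims": [{"patient_id": "p1", "date": "2021-02-10", "type": "er_visit"}],
}
record = build_longitudinal_record(feeds)
```

Once a curated record like this exists, a FHIR API can serve the combined longitudinal view to every downstream application instead of each app stitching sources together on its own.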
What HIM Professionals Can Do to Support Interoperability for Populations
HIM professionals have an important role to play in pursuing greater interoperability for their organizations, particularly in helping develop an approach to interoperability that supports the management of populations as well as care of and communication with individual patients.
First, HIM professionals should take time to understand all the use cases within the business that call for interoperable systems and plan for the connectivity to support them. This might mean having substantive conversations with analytics teams, medical economists, medical directors, and value-based care performance leaders to understand the data they need to support broad-based population health analysis and initiatives.
Armed with this information, HIM professionals need to work with key decision-makers in their organizations to determine an appropriate and effective interoperability strategy. This strategy should build consensus around which APIs to use and determine whether the organization can meet the technical requirements needed to support each one—but it should also address important issues of data curation and completeness. At every stage, HIM professionals should evaluate decisions to make sure they support interoperability needs at the population level.
While the new federal interoperability rules may enhance data-sharing among healthcare organizations, and with the patients themselves, interoperability is only part of the equation. Shared data that is incomplete or of insufficient quality will limit the quality of the information the patient is accessing or the resulting care they receive, and it will not help organizations address bigger issues of variability in the cost and quality of care. To do that, payers and providers need a consistent and reliable view of their patient populations, powered by an interoperability strategy built for population health.
Michael Gleeson is the chief innovation and strategy officer for population health management technology vendor Arcadia.