The Health Insurance Portability and Accountability Act (HIPAA) was signed into law in 1996. In 2003, the first of its 11 parts, the Privacy Rule, was implemented.
Since then, several new provisions have become part of HIPAA. But several types of healthcare environments, including those related to new technologies, still do not fall under HIPAA, which raises the question of whether new updates are needed. But first, let’s review.
Six months after the Privacy Rule was introduced in 2003, the Transactions and Code Sets Rule was introduced. The next year, 2004, saw the Standard Unique Identifier for Employers Rule implemented. This was followed in 2005 by the Security Rule. Five months later, the Claims Attachment Standards Rule was introduced. With all of these rules in place, the Enforcement Rule was introduced in 2006 to establish how the government would enforce the parts of HIPAA. A year later, in 2007, the Standard Unique Healthcare Provider Identifier Rule, which established the National Provider Identifier (NPI), was introduced.
There was a small break in new HIPAA rules before 2009, when the Health Information Technology for Economic and Clinical Health (HITECH) Act was passed, increasing fines for HIPAA violations. Along with HITECH, in 2009, the Breach Notification Rule mandated that, depending on the size of the breach, various groups be notified. Finally, in 2013, 10 years after the Privacy Rule took effect, the Omnibus Rule was passed, which strengthened the Privacy Rule and incorporated the Genetic Information Nondiscrimination Act (GINA). As genetic testing became more mainstream, GINA was implemented to stop insurers from using genetic information to price insurance policies.
A Notice of Proposed Rulemaking was issued in 2020 with impending changes for this year. Even with those proposed modifications, HIPAA itself has changed very little, while the healthcare and information technology fields have changed dramatically. The following healthcare and technological situations merit HIPAA changes to ensure that the law keeps up as the environment around it evolves.
Cash-Only or Free Facilities
As of now, only organizations that conduct one of the 11 HIPAA transactions, as stated in the Transactions and Code Sets Standards, must be compliant with HIPAA. The most prominent of these 11 transactions is electronically billing for services; thus, if a facility takes cash only or is free, it is not under HIPAA despite collecting protected health information (PHI). For example, free healthcare clinics that provide physical exams, administer vaccines, and give well-baby check-ups collect PHI. Yet, because they do not have insurer transactions, they are not under the stringent federal HIPAA law (although they are under state confidentiality laws). With so much health information collected, the threat of a data breach is real. Bringing that health information under HIPAA would help ensure greater privacy and security.
Telemedicine
HIPAA must include greater security protections for live patient telemedicine visits, which have increased during the pandemic. Telemedicine visits generate more PHI than either paper or electronic medical records of in-person visits, such as the patient’s facial image and the full dialogue between the patient and provider; thus, the protections must follow suit. The use and protection of facial recognition data also involve the issue of the patient’s rights.1 How facial recognition data might be used and who has access to it are two of the questions currently under debate. These are ethical questions, and ethics are at the core of HIPAA.
Social Media
Unregulated platforms, from TikTok to Instagram to Facebook, can contain PHI, as patients use them to convey their medical information. Facebook data breaches have included health data.2 Providers also use these platforms every day but must refrain from formally using them with patients, as the platforms are not under the HIPAA Security Rule. Updating the HIPAA Security Rule and social media platforms’ security features could give patients and providers another way to communicate that is secure and HIPAA-compliant. In the meantime, because social media can be a method of contact with patients, healthcare personnel should be trained on appropriate and HIPAA-protected uses of it.
Wearable Devices
Wearable devices that collect PHI, such as Fitbits and smart watches, are not under HIPAA, because the wearable belongs to the wearer, in this case the patient. But where do those data go? Who sees them? How are they transferred? Healthcare personnel working with patients who use these devices need to put controls on how data are collected and shared so as not to compromise the privacy and security of the patient data. HIPAA protects data held by healthcare providers, health plans, and clearinghouses, but companies manufacturing wearables are not included in this group. Because of this, data from wearables are not under the Security Rule and can be compromised and used in ways the patient may not even realize.
Artificial Intelligence
Artificial intelligence (AI) in healthcare is not the future but the present, and HIPAA must be updated to match. Are AI computer programs that analyze patient data considered business associates under HIPAA? Not as HIPAA is presently written. In 1996, AI uses in healthcare were far more limited than they are now. The World Economic Forum reported that 97 million jobs will be available due to AI by 2025,3 yet the workforce is not trained, and the supply does not meet the demand. In addition, the organizational culture needs to support the AI movement. Leadership has to set short- and long-term plans to incorporate AI into operations and have a workforce ready to meet demand.
The Need to Evolve
As HIPAA enters its next 25 years, new initiatives supported by leadership are needed to protect PHI from all sources, including the latest information technology. This will help the healthcare field to continue to be inventive and innovative in delivering high-quality care.
1. Council of Europe. “Ensure that facial recognition does not harm fundamental rights.” January 28, 2021.
2. Davis, Jessica. “Facebook Accused of Exposing User Health Data in Complaint to FTC.” February 20, 2019.
3. Ascott, Emma. “AI Will Create 97 Million Jobs, But Workers Don’t Have the Skills Required (Yet).” November 19, 2021. https://allwork.space/2021/11/ai-will-create-97-million-jobs-but-workers-dont-have-the-skills-required-yet/
Joan M. Kiel (firstname.lastname@example.org) is the chair of University HIPAA Compliance and a professor in the Health Administration and Public Health department at Duquesne University.