Hospital Employees Fired for Improperly Viewing Smollett Medical Records
After actor Jussie Smollett was treated at a Chicago, IL hospital in January, dozens of hospital employees were terminated for improperly viewing his medical records, according to local news reports.
In January, Smollett presented at Northwestern’s emergency department saying that he’d been attacked by two men. Investigators later alleged that Smollett paid two individuals to stage the attack. As is sometimes the case in high-profile celebrity hospital stays, healthcare workers were apparently unable to resist the temptation to peek at a patient’s records.
Last week a nurse from Northwestern Memorial Hospital told the local CBS news station that she was one of several healthcare workers fired for allegedly viewing Smollett’s records. The nurse claims, however, that she did not intentionally seek out Smollett’s record. Rather, she says she simply scrolled by his name when searching for another patient’s name, and she suspects this was the case for other terminated employees.
But how plausible is the nurse’s claim? Some health privacy experts aren’t buying her explanation that Smollett’s records were viewed accidentally or that workers merely saw his name while scrolling lists of patients. After all, improperly viewing any patient’s records—not just those of celebrities and high-profile patients—is grounds for termination at most HIPAA-covered entities.
Kelly McLendon, RHIA, CHPS, managing director of CompliancePro Solutions, says the hospital’s electronic health record (EHR) audit log would reveal the timing of when and how long certain pages or cases were viewed. This, combined with a description of what was viewed and how the patient’s name was searched for, could explain what actually happened.
“Some users might have the ability to search for lists of patients, and his name could have been on the list. If his name was scrolled past, PHI might have popped up that would probably trigger an audit log entry too. Say someone saw the patients on a list for four seconds each, and his was reviewed for five seconds. Then there probably would be no issue. But say his review was 30 seconds and all others five seconds—then an issue may have occurred. But again, the audit logs—which, clearly, they are reviewing—probably tell which way it happened,” McLendon says.
He adds that all audit logs are different and that there are two likely use cases present in Northwestern’s audit log. The first, and most likely, is that a logged-on user searched for Smollett’s name and took a peek at what came up, probably within the EHR or perhaps an emergency department app. The user would need permission to do so, but many sites allow any clinical workforce member to view any patient record. Although Smollett’s name may eventually have been flagged and access restricted to his care team, those controls may not have been in place early on. Once the search was done and the e-chart opened, the access was logged—and that can be grounds for termination. This scenario is by far the most common in these kinds of incidents, according to McLendon.
The second scenario is that someone with permission to create a list of patients was perusing that list. Merely scrolling the list may also have generated audit log entries, assuming the system logs that kind of activity—which is not a given, but is technically possible.
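McLendon's dwell-time reasoning—a record viewed for roughly as long as every other record on a list is probably an accidental scroll-past, while one viewed far longer suggests deliberate snooping—lends itself to a simple heuristic. The sketch below illustrates that idea only; the `AuditEntry` fields, the `flag_anomalous_views` function, and the 3x-of-median threshold are all hypothetical assumptions, not a description of Northwestern's actual EHR audit log or any real vendor's format.

```python
from dataclasses import dataclass
from statistics import median

@dataclass
class AuditEntry:
    """One hypothetical audit-log row: who viewed which record, and for how long."""
    user: str
    patient: str
    dwell_seconds: float

def flag_anomalous_views(entries, patient_of_interest, ratio=3.0):
    """Flag users whose dwell time on one patient's record far exceeds
    their typical dwell time on other records.

    Mirrors McLendon's example: 5 seconds when everything else got ~4 is
    unremarkable; 30 seconds when everything else got 5 stands out. A view
    is flagged when it is at least `ratio` times the user's median dwell
    on all other patients (the ratio is an illustrative choice, not a
    regulatory standard). Returns (user, dwell, typical_dwell) tuples.
    """
    by_user = {}
    for e in entries:
        by_user.setdefault(e.user, []).append(e)

    flagged = []
    for user, views in by_user.items():
        baseline = [e.dwell_seconds for e in views
                    if e.patient != patient_of_interest]
        targets = [e for e in views if e.patient == patient_of_interest]
        if not baseline or not targets:
            continue  # no basis for comparison, or record never viewed
        typical = median(baseline)
        for e in targets:
            if e.dwell_seconds >= ratio * typical:
                flagged.append((user, e.dwell_seconds, typical))
    return flagged
```

Run against McLendon's two hypotheticals, a user who lingered on the high-profile record for 30 seconds against a 5-second baseline would be flagged, while one who scrolled past it in 5 seconds against a 4-second baseline would not. A real investigation would of course weigh far more than dwell time—search terms, screens opened, and role-based permissions, as the article notes.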
Breaches such as this are common and illustrate that the biggest threat to security often comes from inside an organization. “Insider” threats—security risks posed by people who already have legitimate access to a provider’s systems—are a recurring problem, according to the Office for Civil Rights.