Systemic problems from the opioid crisis to mass shootings have been linked to the overarching problem of underfunded and overstretched mental health services, a gap that Silicon Valley is seeking to address with mobile health apps. At the same time, legislators are introducing bills aimed at improving privacy protections for unregulated health apps.
In Washington, Senators Amy Klobuchar (D-MN) and Lisa Murkowski (R-AK) have proposed legislation that would require federal health officials to issue regulations for health tracking apps and home DNA testing kits that currently are not regulated by HIPAA, but which carry significant privacy concerns.
“New technologies have made it easier for people to monitor their own health, but health tracking apps and home DNA testing kits have also given companies access to personal, private data with limited oversight,” Klobuchar said in a statement. “This legislation will protect consumers’ personal health data by requiring that regulations be issued by the federal agencies that have the expertise to keep up with advances in technology.”
This development poses challenges in an environment where states such as California are looking to mobile technologies to reduce stress on the state’s mental health resources. The state has partnered with two companies, the tech firm Mindstrong and the mental health network 7 Cups, to create apps that help detect when individuals already seeking mental health treatment start displaying patterns of smartphone use associated with mental health crises.
According to the New York Times, Mindstrong’s algorithms “can establish a person’s normal, or baseline, activity across a number of measures, including how frequently the phone is used and how quickly the person types…If several measures begin to stray wildly from average, Mindstrong’s app triggers a message to the user. It takes the company about 24 hours to register a disrupted routine. The app also summarizes usage in graph form, so users can view trends over days or weeks.”
Officials see this as creating a virtual “fire alarm” that would help vulnerable people seek treatment and trigger resources designed to help them. But the program’s launch has been delayed by, among other things, privacy concerns. For the programs to succeed, the app makers need access to patient data, and local officials have resisted providing medical records to them.
“If we’re excited about the potential of data, we should be equally worried about the risks, and those seem to be evolving faster than the scientific benefit,” Dr. John Torous, director of the division of digital psychiatry at the Beth Israel Deaconess Medical Center in Boston, told the New York Times. “There may be guarantees the companies make about not sharing data, but if the company is sold to another company, that data goes with it.
“A lot of apps have that clause buried in 13 pages in mouse print.”
Mary Butler is associate editor at Journal of AHIMA.