A Delicate Balance: Proposed Regulations May Upset the Tension Between Accessibility and Privacy of Health Information

Promises and Perils of Emerging Health Innovations Blog Symposium

We are pleased to present this symposium featuring commentary from participants in the Center for Health Policy and Law’s annual conference, Promises and Perils of Emerging Health Innovations, held on April 11-12, 2019 at Northeastern University School of Law. Throughout the two-day conference, speakers and attendees discussed how innovations, including artificial intelligence, robotics, mobile technology, gene therapies, pharmaceuticals, big data analytics, tele- and virtual health care delivery, and new models of delivery, such as accountable care organizations (ACOs), retail clinics, and medical-legal partnerships (MLPs), have entered and changed the healthcare market. More dramatic innovations and market disruptions are likely in the years to come. These new technologies and market disruptions offer immense promise to advance health care quality and efficiency, and improve provider and patient engagement. Success will depend, however, on careful consideration of potential perils and well-planned interventions to ensure new methods ultimately further, rather than diminish, the health of patients, especially those who are the most vulnerable.

In his piece for the Promises and Perils of Emerging Health Innovations blog symposium, Oliver Kim emphasizes the important role trust plays in the provider-patient relationship. Kim unpacks the challenges that come with introducing and incorporating new health technology, and cautions that bringing third parties into that relationship risks eroding trust.

A Delicate Balance: Proposed Regulations May Upset the Tension Between Accessibility and Privacy of Health Information

by Oliver Kim

Northeastern University School of Law’s “Promises and Perils” conference allowed my coauthor and me to continue exploring the legal, policy, and ethical issues in the development and use of “disruptive technologies” in health care. See Ne. Univ. Sch. L., Promises and Perils of Emerging Health Innovations (last visited Oct. 4, 2019); Oliver J. Kim, The Devil Is in the Data, Balkinization (Nov. 3, 2018, 11:00 AM). The bedrock of the provider-patient relationship is trust, and that same level of trust must carry over to disruptive technologies such as digital health if they are to be welcomed by patients and consumers.

Consultant and health technology expert Susannah Fox argues that a “trust gap” has emerged because the narrative around digital health has eroded people’s trust, fed by “a steady drip-drip-drip of articles documenting how health apps are sharing data with third parties.” Susannah Fox, Trust Gap: Health Apps and Data Sharing (Apr. 29, 2019). Moreover, observers have raised concerns about how digital technologies affect women, people of color, and those of limited means in areas such as privacy, security, and criminal justice. See Emily Chang, What Women Know About the Internet, N.Y. Times (Apr. 10, 2019); Cat Zakrzewski, The Technology 202: Advocate Urges Congress to Protect Digital Rights of People of Color as it Crafts Privacy Bill, Wash. Post (Feb. 6, 2019); Mary Madden, The Devastating Consequences of Being Poor in the Digital Age, N.Y. Times (Apr. 25, 2019). For example, many questions about privacy arose when police were able to use a private company’s DNA ancestry tool to identify the Golden State Killer through partial matches from relatives’ genetic data. Avi Selk, The Ingenious and ‘Dystopian’ DNA Technique Police Used to Hunt the ‘Golden State Killer’ Suspect, Wash. Post (Apr. 28, 2018); Clare Wilson, Serial Killer Suspect Identified Using DNA Family Tree Website, New Scientist (Apr. 27, 2018). Some law enforcement agencies are building up their own DNA databases, and while the DNA may be taken consensually, individuals may not realize their DNA could be retained or used for other purposes. Jay Stanley, The Police Want Your DNA to Prove You’re Innocent. Do You Give It to Them?, ACLU (Sept. 16, 2016). As one physician noted after police swabbed his son, “My concern… is that it’s not just Adam’s DNA…. It’s my DNA, it’s my wife’s DNA, and our parents. Not to sound bad, but you just get nervous.” Lauren Kirchner, DNA Dragnet: In Some Cities, Police Go from Stop-and-Frisk to Stop-and-Spit, ProPublica (Sept. 12, 2016).

While improved healthcare interoperability is a priority for stakeholders, achieving it continues to be a vexing problem. Kate Monica, Top 5 Challenges to Achieving Healthcare Interoperability, EHR Intelligence (Aug. 14, 2017). After extending the comment period, the Centers for Medicare and Medicaid Services (CMS) and the Office of the National Coordinator for Health Information Technology (ONC) recently closed comments on the long-awaited proposed interoperability regulations directed by the 21st Century Cures Act. Press Release, U.S. Dep’t of Health & Human Serv., HHS Extends Comment Period for Proposed Rules to Improve the Interoperability of Electronic Health Information (Apr. 19, 2019); Elise S. Anthony & Michael Lipinski, 21st Century Cures Act: Interoperability, Information Blocking, and the ONC Health IT Certification Program Proposed Rule, The Office of the Nat’l Coordinator for Health Info. Tech. (last visited Oct. 8, 2019). These proposed regulations would require certain payers (Medicare Advantage private plans, Medicaid and Children’s Health Insurance Program managed care organizations, Medicaid state agencies, and qualified health plans in the Federally Facilitated Exchanges) to create open APIs (application programming interfaces) that patients could use, through a third-party app, to access and compile their health data. Patient Protection, Interoperability and Patient Access for Medicare and Medicaid Programs, 84 Fed. Reg. 7610 (proposed Mar. 4, 2019) (to be codified at 45 C.F.R. pt. 156); 21st Century Cures Act: Interoperability, Information Blocking, and the ONC Health IT Certification Program, 84 Fed. Reg. 7424 (proposed Mar. 4, 2019) (to be codified at 45 C.F.R. pts. 170 & 171).
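To make the mechanism concrete: the proposed rules contemplate standards-based APIs (built on the HL7 FHIR standard) that a patient-authorized app would query to pull the patient’s data from a payer. The sketch below is purely illustrative; the payer endpoint, access token, and patient ID are hypothetical placeholders, and a real app would first complete an OAuth 2.0 authorization flow in which the patient grants consent.

import requests

# Hypothetical FHIR R4 endpoint for a payer's patient-access API, and an
# OAuth 2.0 token the patient obtained by authorizing this third-party app.
FHIR_BASE_URL = "https://fhir.example-payer.com/r4"  # placeholder
ACCESS_TOKEN = "patient-authorized-token"            # placeholder

def fetch_claims(patient_id: str) -> list:
    """Fetch the patient's claims data as FHIR ExplanationOfBenefit resources."""
    response = requests.get(
        f"{FHIR_BASE_URL}/ExplanationOfBenefit",
        params={"patient": patient_id},
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Accept": "application/fhir+json",
        },
        timeout=30,
    )
    response.raise_for_status()
    bundle = response.json()  # a FHIR "searchset" Bundle
    return [entry["resource"] for entry in bundle.get("entry", [])]

if __name__ == "__main__":
    for claim in fetch_claims("example-patient-id"):
        print(claim.get("id"), claim.get("type", {}).get("text"))

Once the data lands in the app, however, what happens next is largely governed by the app’s terms and conditions rather than by HIPAA, which is precisely the worry discussed below.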

While the use of APIs and third-party apps may make health data more accessible and help narrow the digital divide in how different racial and ethnic groups access the Internet, there is concern about third-party apps because of their potential to harvest consumers’ data. Eva Chang et al., Racial/Ethnic Variation in Devices Used to Access Patient Portals, 24(1) Am. J. of Managed Care e7 (2018); Rebecca Pifer, HHS Officials Defend Interoperability Rules to Senate Critics, Health Care Dive (May 7, 2019). Indeed, one of the reasons cited for extending the comment period was confusion over whether providers would be liable under HIPAA’s privacy law for how patients use their health data (a law that, ironically, does not contain the word “privacy” in its full title, the Health Insurance Portability and Accountability Act of 1996). Jessica K. Cohen, HHS Extends Comment Period for Interoperability Rules, Modern Healthcare (Apr. 19, 2019); Health Insurance Portability and Accountability Act of 1996, Pub. L. No. 104-191, 110 Stat. 1936 (1996).

HIPAA protections do not necessarily apply to a third-party app simply because the app has received health information from the consumer. See Health App Developers, What Are Your Questions About HIPAA? (last visited Oct. 8, 2019). In some cases, it would be no different than if a patient handed a paper file to a stranger who promised to take care of the information. See Focal Point Insights, When Does HIPAA Apply to Health Apps?, Focal Point Blog (Oct. 3, 2018). In this instance, the consumer has voluntarily taken their health data outside HIPAA’s world of covered entities. Instead, it would be up to the Federal Trade Commission (FTC) to take enforcement action against app developers that violate their own terms and conditions with consumers. See Federal Trade Commission, Privacy and Security Enforcement (last visited Oct. 8, 2019); G. S. Hans, Privacy Policies, Terms of Service, and FTC Enforcement: Broadening Unfairness Regulation for a New Era, 19 Mich. Telecomm. & Tech. L. Rev. 163 (2012).

As argued earlier, see Kim, supra, and as discussed in presentations, trust is a key component of ensuring that the digital health system will work, as well as of building the data foundation necessary for new advances in healthcare, such as artificial intelligence. See Ne. Univ. Sch. L., supra; Ariz. St. Univ. C. L., Governance of Emerging Technologies & Science (GETS) (last visited Oct. 8, 2019). Many communities of color share concerns about the use of their health data due to historical inequities and unjust treatment by the medical system; yet their data is needed to ensure that there is no digital divide in our healthcare databases. See J. Corey Williams, Black Americans Don’t Trust Our Healthcare System – Here’s Why, The Hill (Aug. 24, 2017); Research!America, New National Public Opinion Poll Shows Majority of Americans Would Participate in Clinical Trials if Recommended by Their Doctor, Research!America Polls (June 12, 2013); Graham MacDonald & Ajjit Narayanan, We Need Better Tools to Measure Bias in Data that Drive Decisionmaking, Urban Institute (Mar. 5, 2019). The proposed regulations are unlikely to close that trust gap, particularly when consumer groups have raised concerns about certain apps, wearables, and their relationships with marketers and insurers. See Drew Harwell, Is Your Pregnancy App Sharing Your Intimate Data with Your Boss?, Wash. Post (Apr. 10, 2019); Allison V. Smith, With Fitness Trackers in the Workplace, Bosses Can Monitor Your Every Step – and Possibly More, Wash. Post (Feb. 16, 2019); Kaitlyn Tiffany, Period-Tracking Apps Are Not for Women, Vox (updated Nov. 16, 2018); Smartphone Contraception: Policy Issues, National Women’s Health Network (last updated Oct. 2018).

Ideally, everyone would be an informed consumer, but the truth is that most of us do not read, or do not understand, the terms and conditions that come with downloading an app. Caroline Cakebread, You’re Not Alone, No One Reads Terms of Service Agreements, Business Insider (Nov. 15, 2017). And the current Administration has not signaled a willingness to intervene on behalf of consumers. For instance, the new head of the Consumer Financial Protection Bureau said the agency will help consumers “to help themselves [to] protect their own interests” rather than focus on enforcement. David Lazarus, Column: CFPB Head, Charged with Protecting Consumers, Says People Need ‘to Help Themselves’, L.A. Times (Apr. 19, 2019). Similarly, ONC director Don Rucker said it is up to individual patients to decide what types of third-party apps to use. See Cohen, supra. Let the downloader beware!

While CMS and ONC do not have regulatory authority over third-party apps, they could allow or require APIs to restrict third-party apps’ access based on whether those apps agree to limit their use of patients’ health data. As one consumer group has argued, that would give patients some real measure of choice in their notice and consent. And to build trust in this digital revolution in healthcare, particularly among women and people of color, the government owes patients and consumers some sense of safety and security. See Michelle Richardson, Notice and Choice Are No Longer a Choice, CDT Blog (Mar. 1, 2019).
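As a purely illustrative sketch of that suggestion (nothing in the proposed rules requires it, and the app registry and attestation field below are invented for the example), a payer’s API could check whether a registered app has attested to limiting secondary uses of patient data before releasing anything:

# Hypothetical registry of third-party apps and their data-use attestations.
APP_REGISTRY = {
    "app-001": {"name": "CarePlanner", "attests_to_data_use_limits": True},
    "app-002": {"name": "AdTargeter", "attests_to_data_use_limits": False},
}

def may_release_data(app_id: str, patient_consented: bool) -> bool:
    """Release data only when the patient has consented and the requesting
    app has attested that it will limit use of the patient's health data."""
    app = APP_REGISTRY.get(app_id)
    return bool(app and patient_consented and app["attests_to_data_use_limits"])

print(may_release_data("app-001", patient_consented=True))  # True
print(may_release_data("app-002", patient_consented=True))  # False

Under such a gate, the patient’s consent alone would not be enough; the app would also have to stand behind a data-use promise, which is one way to give notice and consent real teeth.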

Bio: Oliver Kim is an adjunct law professor at University of Pittsburgh School of Law and a principal with Mousetrap Consulting in Washington, D.C.

Handle: @mousetrapdc