ARGUMENT: TRUST & TRACING APPS
How Our Outdated Privacy Laws Doomed Contact-Tracing Apps

Published 13 January 2022

Jessica Rich, a privacy lawyer, says that one of the lessons she has learned from the pandemic involves privacy and the failure of contact-tracing apps.

Writing for Brookings, Rich notes that last spring, when the disease first started its rapid spread, these apps were heralded as a promising way to control it by tracking diagnoses and exposure through self-reporting and location tracking. At that time, Apple and Google announced a joint effort to develop technology that government health departments could use to build apps for their communities, “with user privacy and security central to the design.”

Rich writes that while these apps have had mixed success worldwide, “they’ve been a huge failure in the United States. Indeed, despite early hopes and multiple efforts to implement these apps in various states and localities, Americans have largely rejected them, and they’ve played a minimal role in controlling the disease.”

She adds:

A key reason for this failure is that people don’t trust the tech companies or the government to collect, use, and store their personal data, especially when that data involves their health and precise whereabouts. Although Apple and Google pledged to build privacy measures into the apps’ design—including opt-in choice, anonymity, use limitations, and storage of data only on a user’s device—Americans just weren’t persuaded. For example, a Washington Post/University of Maryland survey conducted soon after the app announcement found that 50% of smartphone users wouldn’t use a contact-tracing app even if it promised to rely on anonymous tracking and reporting; 56% wouldn’t trust the big tech companies to keep the data anonymous; and 43% wouldn’t even trust public health agencies and universities to do so. By June, Americans’ mistrust had increased, with a new survey showing that 71% of respondents wouldn’t use contact tracing apps, with privacy cited as the leading reason.

She notes:

Americans had good reason to be wary of data collection by these apps. In recent years, they’ve been victimized again and again by data breaches and other privacy abuses (including by the big tech companies) too numerous to mention. In many instances, the privacy laws in this country have failed to protect them from these abuses, whether because the abuses fell outside the limited scope of these laws, or because the laws imposed insufficient penalties or other remedies. The same dangers loomed with respect to contact-tracing apps. Indeed, as readers of this blog are likely aware, the U.S. has no baseline data protection law that would protect the sensitive data obtained through these apps.

After discussing the inadequacy of U.S. privacy laws, Rich argues that clear federal standards governing data use should be viewed not just as a restraint but also as a way to enable responsible uses of data.

She concludes:

Taken together, all of these lessons lead us back to the same conclusion that was the topic of my earlier blog post—that we need a baseline federal privacy law to establish clear and enforceable privacy rules across the entire marketplace, one that protects our personal information in good times and in times of crisis.