Contact-Tracing Apps For COVID-19 And Why They Are A Threat To Our Privacy

Coronavirus contact-tracing apps harm privacy. Regardless of whether they protect the data they collect, they require users to walk around with their smartphones on all the time, which is a privacy risk in itself. More disturbingly, they normalize the idea of our behavior and actions being directly managed en masse by apps. Both set dangerous precedents for individual privacy and liberty, far in excess of any risk that our health data might be leaked.

Admittedly, Google and Apple have designed their exposure notification API – which they released to developers last week – with privacy and security in mind. But even though their technical specifications describe a decentralized system where data can be stored only on individual devices, any contact-tracing app based around these specifications would inevitably reduce your privacy. It certainly wouldn’t increase it.
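To make that decentralized design concrete, here is a deliberately simplified Python sketch of how such a scheme can work. It is not Google and Apple’s actual implementation – the real API derives rotating identifiers from Temporary Exposure Keys using AES and HKDF – so the function names and the SHA-256 derivation below are illustrative stand-ins, not the published specification.

```python
import os
import hashlib

ROTATION_SLOTS_PER_DAY = 96  # real identifiers rotate roughly every 10-15 minutes


def daily_key() -> bytes:
    """A random per-day key that never leaves the device unless the user
    consents to report a positive test (a stand-in for a Temporary
    Exposure Key)."""
    return os.urandom(16)


def proximity_id(day_key: bytes, slot: int) -> bytes:
    """Derive a short-lived, unlinkable identifier to broadcast over
    Bluetooth LE. (The real API uses AES/HKDF; SHA-256 here is purely
    illustrative.)"""
    return hashlib.sha256(day_key + slot.to_bytes(2, "big")).digest()[:16]


# Each phone stores only the identifiers it heard nearby -- no server involved.
heard_nearby = set()


def on_ble_advertisement(rpi: bytes) -> None:
    """Record an identifier broadcast by a nearby device."""
    heard_nearby.add(rpi)


def check_exposure(diagnosis_keys: list) -> bool:
    """After an infected user consents, their daily keys are published.
    Every device re-derives the identifiers locally and checks for a
    match; raw contact data never leaves the phone."""
    for key in diagnosis_keys:
        for slot in range(ROTATION_SLOTS_PER_DAY):
            if proximity_id(key, slot) in heard_nearby:
                return True
    return False
```

The design choice worth noticing is that the matching happens on the handset: the server only ever sees the daily keys of users who report a positive test, never a log of who met whom.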

On the one hand, cybersecurity researchers have already argued that suitably determined bad actors could use the API to correlate infected people with other personal information. On the other, the Google-Apple API and any app built on it carry two much more general and dangerous privacy risks.

Firstly, any contact-tracing app requires you to keep your smartphone on constantly throughout the day, whether you’re going for a walk, to the supermarket, or wherever else. Straight away, this is a massive privacy loss.

As shown by numerous studies and investigations, smartphones and many of the apps on them track your location, on top of recording – and sharing – whatever data you enter into them. For example, a 2018 investigation from The New York Times identified at least 75 companies in the United States that receive ostensibly anonymous – yet precise and potentially identifying – location data from around 200 million mobile devices.

Likewise, other studies have found that around two-thirds of all smartphone apps share a variety of data with third parties. One study found that Android phones record location data even when they’re set to Airplane Mode. And a Washington Post study from last year discovered around 5,400 (mostly app-based) data trackers on an iPhone, all of which were sending data back to third parties.

In other words, smartphones are very bad for your privacy. They and the apps on them are continuously transmitting location and other data to scores of companies. What’s more, all of these companies have an interest in using that data to later influence and indirectly control your behavior, usually by marketing ‘relevant’ products or services.

All too often, commentators on privacy focus narrowly on the risk of your data somehow leaking to other people. Yes, most people may not want their neighbor to know that they’re infected with the coronavirus, for instance, but few stop to ask why this is important. Is it because privacy is inherently valuable as an end in itself?

Arguably, but privacy actually gains most of its importance and value because it protects people from interference and intervention. You may want to keep your fondness for, say, ballet dancing private from your neighbors, because of the risk that they might mock your pastime and make you feel ashamed about wanting to be a ballet dancer. You worry that they will interfere – either directly or indirectly – with your ability to develop as a person according to your own awareness and conception of your best interests.

Exactly the same thing goes for privacy in the context of smartphones and digital technology. It’s not enough to avoid sharing your data with the ‘wrong’ people (as opposed to scores or hundreds of ‘legitimate’ third parties). To have true privacy, you also need to avoid interference and intervention. And in encouraging people to have their smartphones with them all the time, coronavirus contact-tracing apps abjectly fail this test. They will encourage more smartphone use, which will result in more location-data tracking and more personalized ads. Such ads try to redirect your behavior away from how you may otherwise have directed it yourself. Put simply: less privacy.

This links to the second major privacy problem of coronavirus contact-tracing apps: while we’re used to ads attempting to prod our consumer behavior, contact-tracing apps will normalize the concept of apps themselves directing and managing, at scale, how millions of people live and behave. All coronavirus contact-tracing apps aim to notify people who may have come into proximity with someone who claims to have been infected with the coronavirus. Once notified, such people are advised by the app to self-isolate and stay at home.

This is a massive problem for anyone concerned about the future of privacy and personal freedoms in the Digital Age. It would be one thing if any contact-tracing app could guarantee that a user had definitely been infected with the coronavirus. But there’s a very strong likelihood that such apps will also send notifications to lots of people who haven’t been infected. As researchers from the University of Washington wrote for the Brookings Institution last month: “False positives (reports of exposure when none existed) can arise easily … Studies suggest that people have on average about a dozen close contacts a day – incidents involving direct touch or a one-on-one conversation – yet even in the absence of social distancing measures the average infected person transmits to only 2 or 3 other people throughout the entire course of the disease.”
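A rough back-of-the-envelope calculation shows how lopsided those numbers are. The contact and transmission figures below come from the passage just quoted; the five-day flagged window is an assumption added here purely for illustration.

```python
# Back-of-the-envelope estimate using the figures quoted above.
close_contacts_per_day = 12    # average close contacts per person per day (quoted)
infectious_window_days = 5     # window the app flags -- an assumed figure
true_transmissions = 2.5       # average onward infections per case (quoted as 2-3)

notifications = close_contacts_per_day * infectious_window_days
false_alarms = notifications - true_transmissions

print(f"Notifications sent per confirmed case: {notifications}")
print(f"Of which actual infections: {true_transmissions}")
print(f"Share of notifications that are false alarms: "
      f"{false_alarms / notifications:.0%}")
# -> 60 notifications, of which roughly 96% would be false alarms
```

Under these (admittedly rough) assumptions, the overwhelming majority of people told to self-isolate by such an app would never have been infected at all.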

Coronavirus contact-tracing apps will end up requiring thousands (if not millions) of people to quarantine themselves at home unnecessarily. So, in most cases, rather than preventing coronavirus infections from spreading, the only thing such apps will achieve is desensitizing the general public to giving up another chunk of their privacy and personal freedom. Users will get used to the idea of an app telling them when to stay at home and when to go out. Basically, they’ll become more habituated to delegating judgment over how they should behave to apps and digital technology. In the process, they’ll suffer from the kind of outside interference with their behavior that privacy is meant to defend against.

And once coronavirus contact-tracing apps have created this precedent, other similar apps could potentially follow, capitalizing on our reduced sensitivity to privacy-violating technology. Post COVID-19, we could end up seeing apps and devices that more directly instruct us when to eat, when to go shopping, where to go shopping, when to exercise, when to sleep, and so on.

Of course, such a scenario is still (relatively) remote. However, we now have people willing to use contact-tracing apps under the premise that such apps will keep them safe, so it’s hardly a stretch to imagine people using other behavior-controlling apps under the premise that they’ll keep them healthy, fit, employable, lovable, attractive or whatever else. So once again, the coronavirus pandemic has infected us in more ways than one.

Originally posted on forbes.com by Simon Chandler.