Design and privacy: COVID Alert op-ed by SRI research leads in the Toronto Star
Ontario has just released a new app called COVID Alert that notifies users of their potential exposure to the coronavirus.
It’s a useful tool in the fight against the pandemic. But it’s also a concerning indicator of the increasingly decisive role “Big Tech” plays in policy choices.
First, how does the app work?
The app is designed on a platform created jointly by Apple and Google, so it has the potential for wide reach. Although some call it a “contact tracing” app, it is better described as an “exposure notification” app, sending Bluetooth signals to other phones that it encounters, creating a sort of anonymized digital “handshake.”
If someone tests positive for COVID-19, they can choose to notify the users of other phones that “shook hands” with theirs, so those users can voluntarily get tested or self-isolate.
The app’s design adopts what is called a “privacy first” approach: it minimizes government access to data since the digital handshakes are anonymized, stored and processed on individual phones. Public health officials do not learn who gets notified by the app or anything else about the infected person’s contacts. This should alleviate deep concerns about state abuses of data.
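The “privacy first” mechanics described above can be illustrated in miniature. The sketch below is a simplified illustration, not the actual Apple/Google framework: the real system derives rotating identifiers cryptographically and measures Bluetooth proximity, while here random tokens and an in-memory list stand in for both. The names (`Phone`, `check_exposure`, and so on) are hypothetical.

```python
import secrets

def new_rolling_id() -> bytes:
    """Random rolling identifier (a stand-in for the cryptographically
    derived IDs the real framework rotates every few minutes)."""
    return secrets.token_bytes(16)

class Phone:
    """Toy model of one device participating in exposure notification."""

    def __init__(self):
        self.my_ids = []        # IDs this phone has broadcast
        self.heard_ids = set()  # IDs heard from nearby phones ("handshakes")

    def broadcast(self) -> bytes:
        rid = new_rolling_id()
        self.my_ids.append(rid)
        return rid

    def hear(self, rid: bytes) -> None:
        self.heard_ids.add(rid)

    def check_exposure(self, published_positive_ids) -> bool:
        # Matching happens on the device itself: the server only
        # distributes IDs volunteered by consenting positive users,
        # never the contact graph or anyone's identity.
        return any(rid in self.heard_ids for rid in published_positive_ids)

# Two phones meet: each hears the other's anonymized rolling ID.
alice, bob = Phone(), Phone()
bob.hear(alice.broadcast())
alice.hear(bob.broadcast())

# Alice tests positive and consents to publish her broadcast IDs.
server_published = list(alice.my_ids)

# Bob's phone matches locally; public health learns nothing about Bob.
print(bob.check_exposure(server_published))  # True
```

The design trade-off the article describes is visible here: because matching happens only on Bob's phone, no central authority ever holds a list of who was exposed, which is exactly what makes integration with manual contact tracing difficult.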
But this design comes at the cost of fully meeting public health goals. The design makes it difficult to integrate the app with manual contact tracing, which remains essential, and potentially limits evaluation of the app’s actual effectiveness.
What are the wider implications of how this app is designed to work?
Although the Canadian Constitution demands that privacy infringements be justified (and the test for justification is demanding), it does not specifically require this “privacy first” approach. But even if the government wanted to embrace a different kind of design for this app, it’s limited by the choices made by two of the world’s biggest tech companies.
Apple and Google decided that exposure-notification apps using their infrastructure cannot adopt a centralized design or require the provision of personal information (like a phone number). Jurisdictions that have wanted to use Bluetooth technology but adopt a different design have faced technical difficulties or had to backtrack and utilize the Apple/Google platform.
This is the key issue: critical infrastructure is being built by the global private sector, without the governance mechanisms needed to ensure democratic accountability and fundamental human rights. Big Tech companies are themselves the biggest collectors of data on users of critical infrastructure, and yet they increasingly play the role of gatekeeper when other actors, like governments, want to access publicly important information.
States have to tread carefully here, too. Technology enables more effective and pervasive surveillance, especially in a crisis like COVID-19. While these techniques may have some public benefits, they increase the imbalance of power between state and society and risk trampling on basic human rights of association, free speech and movement.
We should be dealing with potential privacy overreach by states through privacy laws, but Canada’s current privacy laws are antiquated and inadequate. People are less likely to trust the state’s use of data if no strong laws prevent its misuse, and they will prefer solutions where that data is not used at all.
To be clear: we will be downloading and using the app. But we also need to understand that the “privacy first” design decision was made for us by Big Tech and, as a result, governments are constrained in their ability to collect useful data. This raises serious questions about democratic oversight of critical global infrastructure.
Key decisions about which aspects of privacy to protect should be made by democratically elected governments, not by shareholder-driven multinational companies unconstrained by traditional accountability mechanisms.
While the COVID Alert app is a valuable tool in the ongoing fight against a global pandemic, we should be mindful of the larger implications it—and tools like it—have for the structure of our society, how we can and should run it, and who controls the reins.
This article was originally published in the Toronto Star on August 6, 2020.