SRI Kitchen Table explores data rights in a world of power imbalances, mass surveillance, and super-powered facial recognition
The first installment of the Schwartz Reisman Institute’s (SRI) Kitchen Tables Series, “What does it mean to have data rights?”, held on February 17, 2022, brought together powerful voices from across the fields of political science, law, journalism, and industry to address questions around data rights and ownership. Participants explored the risks and challenges of a world in which rapidly developing data-gathering technologies have infiltrated all strata of society while policy initiatives have lagged behind.
“We want to keep each session of the Kitchen Tables informal and collaborative,” observed series host and SRI Research Lead Wendy H. Wong as she welcomed attendees, “so we can reflect on how human rights matter in our data-intensive, algorithmically informed world, and discuss what challenges lie ahead for all of us.” A professor of political science at the University of Toronto and Canada Research Chair in Global Governance and Civil Society, Wong researches the impacts of datafication on human rights and international politics. Her co-host for the event was Anna Su, an associate professor at U of T’s Faculty of Law and an SRI Faculty Fellow.
Speakers for the session included Dafna Dror-Shpoliansky, a PhD candidate at the Hebrew University and research fellow at The Federmann Cyber Security Research Center; Kashmir Hill, a reporter at The New York Times who specializes in technology and privacy; Petra Molnar, associate director of the Refugee Law Lab at York University; and Divya Siddarth, an associate political economist and social technologist at Microsoft, who also works with the RadicalxChange Foundation and the Ostrom Workshop.
Hill, whose work includes the reporting that first exposed the rise of Clearview AI—an American facial recognition company whose data-scraping practices and contracts with law enforcement agencies have drawn criticism—has covered privacy and data rights for over a decade and is currently writing a book on facial-recognition technologies.
“Facial recognition has advanced so significantly in the last decade, and that’s thanks to neural networks, which are accelerating what is possible with artificial intelligence,” explained Hill. She believes the sheer power of Clearview AI’s facial-recognition tools could effectively usher in the end of anonymity in public, raising important questions about how we build laws, how we regulate at scale, and to what extent users own the data they generate.
“When we look at the contemporary data sphere, it presents a very different reality than the context in which human rights were initially developed,” noted Dror-Shpoliansky. “To achieve full protection and control over our data, there is an urgent need to reconceptualize the familiar legal frameworks that we have and try to adapt them to this new reality.” Dror-Shpoliansky explores this tension in a 2021 article in the European Journal of International Law, co-authored with Yuval Shany, in which they propose developing three generations of “digital human rights” that extend beyond the current frameworks adopted by international institutions.
Participants noted that a central factor motivating the urgency around data rights is how concerns about data use, privacy, and new technologies have become embedded in broader societal issues. Describing her path to her current work as an advocate focused on the impacts of data on refugees, Molnar observed, “I didn't see myself in the tech and human rights space at all, but then my colleague Lex Gill and I started looking at how technologies are playing out in immigration and refugee processing, and since then, I’ve been working to understand the power dynamics inherent in the development of technologies used at the border.” In 2018, Molnar published a report with U of T’s Citizen Lab on the implications of automated decision-making for Canada’s immigration and refugee systems.
In recent years, Molnar’s work has focused on documenting the effects of AI surveillance tools such as biometric data collection and drones. “When I ask myself whether it's possible to own data in a meaningful way, it's a question about who gets to determine what ownership looks like,” observed Molnar. “Especially with historically marginalized communities, racialized communities, communities on the move, and folks crossing borders, people often don't have the same kind of power or space in which they can exercise the rights that you and I might.”
“Over-reliance on private sectors is a huge piece to this puzzle,” added Molnar, “because not only is it problematic public-private partnerships that we're talking about when it comes to data rights, but also the priorities involved. Why are we developing AI lie detectors to be tested on refugees, and not using AI to root out racist border guards? There are very clear priorities established in the kinds of conversations that we’re having.”
“I think this is really the core question around ownership,” concurred Siddarth. “Not only the issue of whether you can have ownership over data, but who decides what kinds of data, and what kinds of ownership? This is a complicated question because I think, in some ways, you hear proposals to scale back these technologies—we'd like these databases not to exist, we'd like platforms not to have monopolies, and yet they do have scale, and they already exist. So, our solutions must contend with that idea of scale, even if some of those solutions would like the scale to be pared down.” In response to these imbalances, Siddarth has proposed the concept of data cooperatives, an alternative model of ownership in which cooperatives would act as fiduciary intermediaries with companies.
“It’s part of why regulation has to focus on the use of data, and not the collection of data,” added Hill. “You as an individual are not going to be able to control your privacy by not uploading data online, or by not using a genetic service, because if anybody connected to you does that, your data is going to be used.”
All of the session’s participants agreed that questions of data rights extend beyond individual use and relate to broader issues of regulating platforms and new technologies. However, as Wong observed—citing the 2020 controversy surrounding Cadillac Fairview, in which cameras with facial-recognition technology were embedded in kiosks across Canada without customers’ knowledge or consent—many people tend to disregard questions of data privacy, viewing them as irrelevant or as a risk only to suspected criminals.
“I’m a little shocked when I still see people reacting that way,” responded Hill, noting that “the last decade has shown us that everybody has been touched with some kind of privacy invasion, some kind of exposure of data they weren't exposing, and they weren’t expecting… We all have something to lose if we don't give people some kind of power over how their own data is exposed.”
“People are oblivious to how their rights are infringed in the online arena,” remarked Dror-Shpoliansky. “In the past, if you would have walked into a shopping mall and somebody asked you, ‘What's your age? How many steps can you walk in a minute? Do you have kids? Which stores are you intending on visiting?’ it was easier for us to say, ‘I'm not going to share that!’ But when you don't see it, you don't sense it.”
Looking to the future amidst the myriad challenges facing the implementation and regulation of data rights, Siddarth summarized her thoughts as the session came to an end. “Please work on this,” she said, addressing students in the virtual audience. “It’s evident from this conversation that there are many more open questions than answers, and it’s an incredibly rich space to dive into, whatever the background you’re coming from. There is so much progress to be made. There is so much advocacy to be done. Today, there is a greater recognition of the importance of this. There’s more recognition from policymakers, from civil society, from community organizations, even companies, whatever their motivations might be… There is an open space to work on this, and it’s deeply necessary.”