The past, present, and future of digital privacy for youth and children: Part II

 

In the second of two posts (read Part I), Leslie Regan Shade, Monica Jean Henderson, and Katie Mackinnon, researchers on The eQuality Project, explore research on children’s and youth’s experiences of the online space, their unique needs for privacy protections, and how their conceptions of digital tools and the corporations that make them might be better informed through digital literacy education. The eQuality Project explores young people’s experiences of privacy and equality in networked environments, with a particular focus on youth from marginalized communities.

This piece is the seventh in a series of posts on the features, implications, and controversies surrounding privacy law reforms in Canada and around the world in an increasingly digital and data-rich context.

Missed Part I of this article? Read it here.


Digital privacy is important to young people, and it is nuanced: they actively manage their privacy in their communications with peers, teachers, and family across a variety of digital platforms.

Research about young people, online privacy, and consent, including work from the Ottawa-based digital literacy organization MediaSmarts, highlights that youth conceptualize privacy differently than adults do, and that youth tend to ignore platforms’ terms of service and privacy policies because these documents are opaque, complicated, and lengthy.

How do children and youth conceive of privacy?

In their project, Growing Up in a Digital Age, funded by the UK Information Commissioner’s Office, London School of Economics researchers Sonia Livingstone, Mariya Stoilova, and Rishita Nandagiri capture the dynamics of youth privacy through three digital contexts: (i) interpersonal privacy (social environment, sharing practices, modes of self‐expression, participation, and social capital); (ii) institutional privacy (data collection by governments, schools, and aligned third party organizations); and (iii) commercial privacy (datafication by corporate platforms whose business model is reliant on behavioural marketing and data‐mining algorithms to collect and transmit personal information).

The authors emphasize that children most often conceive of data interpersonally and in terms of data willingly given, such as their age and name. Children clearly care about privacy, but they lack the institutional and commercial knowledge of platform architectures, data flows, and transactions needed to fully comprehend the implications of data traces and inferred data. Instead, they “often understand ‘privacy’ as being able to keep their online activities to themselves without others finding out” (p. 17), much as one might conceive of a ‘private’ diary. Children also tend to equate privacy with “e-safety”: the idea of staying private so that strangers cannot find out where you are or what your name is.

How do children and youth experience privacy in digital environments?


These privacy distinctions were highlighted in research conducted by MediaSmarts and The eQuality Project (eQ), a seven-year Social Sciences and Humanities Research Council (SSHRC) partnership grant whose goals are to explore young people’s experiences of privacy and equality in networked environments, with a particular focus on youth from marginalized communities.

Decision‐Making and Privacy: How Youth Make Choices About Reputational Data and Privacy Online examined, in the context of photo-sharing platforms, the types of personal information young people (ages 13-16) would willingly disclose online, their strategies for reputation and privacy management, and their knowledge of fair information practices governing their personal information. These platforms were perceived by young people as tools to manage, facilitate, and curate their online image and social interactions for different audiences, whether friends, family, or future viewers. Youth were thus mostly concerned with their interpersonal privacy in order to connect with friends, document shared memories, and seek peer approval. Photo-sharing platforms were not perceived to be “corporate entities,” as few youth “had a clear idea of what the corporations that owned the platforms they use did with their photos” (Johnson et al. 2017, p. 4). Privacy policies and terms of service on the platforms were perceived as too long and difficult to understand, so most young people felt that they were unable to give meaningful consent. Feeling disempowered in negotiating their terms of engagement with the platforms, youth were unaware of privacy legislation (such as fair information principles and the Personal Information Protection and Electronic Documents Act—PIPEDA) and thus did not assert a right to privacy.

In This is What Diversity Looks Like: What Young People Need to Enjoy Privacy and Equality in Networked Spaces, The eQuality Project (Jane Bailey, Jacquelyn Burkell, Priscilla Regan, Leslie Shade, and Valerie Steeves) explored youth experiences of online privacy and equality, including the constraints and affordances they experience in networked activities and platforms. Focus groups included LGBTQ, Indigenous, racialized, and general population youth (ages 13-17) from diverse geographic locations in Canada.

We asked our participants whether privacy played a key role in their use of social media. Consistent with other research findings on digital youth and privacy, our participants did not routinely read privacy policies or terms of service, which they described as “convoluted” and “sneaky.” One participant related that platform companies “know that nobody reads the terms.”


Indeed, our participants regarded the power of platform companies with cynicism and resignation. Said one participant: “They can sometimes sneak things in there that just flips everything on its head. Like Facebook and that big privacy leak.” Regarding how behavioural marketing targets them, one participant said that they felt “violated… but it’s the price you have to pay.” Another commented on the surveillance capitalism properties baked into the platform: “It’s really difficult to find an actual, like, good-natured website that’s trying to do stuff. They want to make money and so, they’re going to pretty much… get into every little bit of your life that they can to generate the most profit. They’re going to just… just… they, like, watch you.”

For young people, then, the tenacious datafication of their communicative practices raises concerns about whether they can maintain control of their digital identity and privacy over their life cycle.

Data’s lingering afterlives: erasure, forgetting, persistence, and archives 

The right to be forgotten and the right to erasure, two privacy protections included in the General Data Protection Regulation (GDPR) and recommended by the Standing Committee on Access to Information, Privacy and Ethics (ETHI), are relevant to debates about historical web data created by young people. eQuality Research Assistant and University of Toronto Faculty of Information PhD candidate Katie Mackinnon’s doctoral project Early Internet Memories explores personal and public archives that contain web materials made by young people between 1994 and 2005, and engages with the many ways that the afterlives of data are simultaneously weaponized, desired, destroyed, and kept.

Digital archives are repositories of digital communication generated by human activity and vary greatly in terms of size, structure, purpose, and scope. They exist institutionally, commercially, publicly, and privately, and serve to preserve materials that might otherwise be inaccessible—often deleted, hidden, or removed from the searchable internet. Large web archive collections, like the Internet Archive, contain upwards of 453 billion web pages, many of which were created and maintained by marginalized and vulnerable groups, including young people, and contain a range of personal or sensitive material.

The right to erasure aims to prevent what Alexander Tsesis describes as “the indefinite storage and trade in electronic data, placing limits on the duration and purpose” of the data. It also provides that individuals may request the data be deleted should it become irrelevant or inaccurate, or cause harm that is not outweighed by a public benefit in retaining it. Under Bill C-11, the right to erasure would require organizations to dispose of personal information, but the bill does not require search engines to de-index, nor does it require archives of born-digital material to adhere to any privacy and ethical guidelines around youth data, which—as we move into the third decade of digital production and expansion—will be increasingly important.

Digital and algorithmic literacy

Stable funding for digital literacy programming for young people in schools and after-school programs can strengthen their digital skills, allowing them to gain knowledge about the dynamic nature of datafication. Digital privacy policy literacy, encompassing three elements—policy processes, the political economy of platforms, and infrastructures—is a useful framework for unpacking the different elements of digital privacy. Leslie Shade and Sharly Chan applied this framework to the photo-sharing platforms study described above.

Policy processes provide knowledge of privacy legislation at national and global levels, and governance bodies such as privacy and data commissions. Here youth can become aware of their privacy rights.

Political economy of platforms allows youth to understand the ownership of the commercial sites and applications they use and their business models, which are reliant on immersive advertising and behavioural marketing.

Through infrastructures youth can better understand how their sociality is mediated by platform design, and how privacy management is determined by complicated and obscure privacy policies and terms of service that can inhibit meaningful consent.

Monica Jean Henderson, eQ Research Assistant and U of T Faculty of Information doctoral student, has been researching how new digital literacy frameworks, particularly data and algorithmic literacies, are developing in response to exponential datafication and what Mark Andrejevic calls “big data divides.”

As Henderson’s research reveals, even as researchers attempt to model literacies that empower citizens to engage critically in online environments, data and algorithmic literacies are often deployed within instrumentalist policy frameworks. These approaches—often in the name of “empowerment”—serve to assimilate citizens into the big data economy rather than foster critical engagement. In response, alternative frameworks such as Kelley Marie Cotter’s “critical algorithmic literacy” and Catherine D’Ignazio’s notion of “creative data literacy” are being advanced as methods for using algorithms and data to “speak back” to power (from Catherine D’Ignazio and Lauren Klein’s intersectional feminist principles for data feminism).

Researchers at the Data-Pop Alliance have even suggested a shift away from language such as “data literacy” and towards “literacy in the age of data” to recentre critical consciousness and data inclusion, rather than data skills geared solely toward employment and innovation.

However we may choose to name them, algorithmic and data literacies will continue to represent key issues in youth media education, as “algorithmic knowledge gaps” are shown to represent possible new sources of social and digital inequalities.

The challenge is that algorithms (and the data they collect and adapt to) are what Dogruel et al. describe as “experience technologies,” meaning that algorithmic literacy is typically built through exposure to and use of algorithmic platforms. This means that children and youth might have to be exposed to algorithmic surveillance and targeted advertising before they can develop critical algorithmic literacy. Further, recent research into perceptions and understandings of algorithms tends to focus on adults, often those with postsecondary education (Cotter & Reisdorf, 2020; Hargittai et al., 2020; Dogruel et al., 2020; Sander, 2020).

These methodological challenges and research gaps pose important policy questions for how to develop youth algorithmic literacy in privacy-protected environments.

In 2020, we (Shade, Henderson, and Mackinnon) addressed this gap with an eQuality project in conjunction with the Scholars-in-Residence program at U of T’s Jackman Humanities Institute. A reading list encouraged the five undergraduate student-researchers to explore algorithmic literacy in their and their peers’ lives. The Scholars then designed an algorithmic literacy toolkit, The Algorithmic You, comprising educational factoid graphics that could be spread across social media and disrupt their peers’ typically frictionless experience of algorithmic platforms.

This proved an effective way to inject algorithmic literacy into young people’s experience of, and exposure to, the platforms themselves. Most importantly, this methodology rejected the typical protectionist approach to youth privacy and instead used participatory methods to generate youth-centred perspectives and topics that spoke to our participants and their peer followers.

Editor’s note: Bill C-11 failed to pass when Canada’s federal parliament was dissolved in August 2021 to hold a federal election. In June 2022, many elements of C-11 were retabled in Bill C-27. Read our coverage of C-27.


About the authors


Leslie Regan Shade is a professor at the University of Toronto’s Faculty of Information and a faculty affiliate at the Schwartz Reisman Institute for Technology and Society. Her research examines the social and policy aspects of information and communication technologies (ICTs), with particular concerns towards issues of gender, youth, and political economy. Her work includes promoting public interest and popular discourse on ICT policy and an ongoing commitment to building participatory scholar-activist networks.

Monica Jean Henderson is a PhD student at the University of Toronto’s Faculty of Information exploring the relationships between democracy, citizenship, and technology at the intersection of digital policy and digital literacy. Her research projects also include topics such as algorithmic literacy, feminist game studies, and feminist information theory.

Katherine (Katie) Mackinnon is a PhD candidate at the University of Toronto’s Faculty of Information. Her research focuses on the history and social, infrastructural, and policy issues of the early web, including early internet use by young people. She also studies public policy and social issues around the internet, youth action, and ethical approaches to web archives.

Acknowledgements: The authors wish to thank the Social Sciences and Humanities Research Council (SSHRC) for their support of The eQuality Project.

