The past, present, and future of digital privacy for youth and children: Part I

 
In the first of two posts (read Part II here), Leslie Regan Shade, Monica Jean Henderson, and Katie Mackinnon, researchers on The eQuality Project, introduce the implications of Canada’s proposed Bill C-11 for youth and children’s digital privacy, and reflect on the differences between the current legislation under review, past privacy protection policies in Canada, and concurrent efforts underway across the world to address the complexities of a digital era—especially the challenge of balancing the risks and opportunities of the online space for minors. The eQuality Project explores young people’s experiences of privacy and equality in networked environments, with a particular focus on youth from marginalized communities. This piece is the sixth in a series of posts on the features, implications, and controversies surrounding privacy law reforms in Canada and around the world in an increasingly digital and data-rich context.



In November 2020, the Government of Canada tabled Bill C-11, the Digital Charter Implementation Act, for first reading: long-awaited legislation designed to modernize the Personal Information Protection and Electronic Documents Act (PIPEDA), Canada’s private sector privacy law, and to operationalize the Digital Charter.

Bill C-11 introduces the Consumer Privacy Protection Act (CPPA) to replace Part 1 of PIPEDA, retitling what remains of PIPEDA as the Electronic Documents Act. The CPPA aims to protect personal information and gives the Office of the Privacy Commissioner of Canada (OPC) new enforcement powers. Part 2 of Bill C-11 enacts the Personal Information and Data Protection Tribunal Act, establishing a new administrative tribunal empowered to hear appeals of decisions made by the Privacy Commissioner under the CPPA and to impose penalties.

As Teresa Scassa, Canada Research Chair in Information Law and Policy at the University of Ottawa’s Faculty of Law, notes, PIPEDA’s consent-based data protection regime sets out criteria for valid consent in Section 6.1. If children or young people are the targeted audience or users of a website or app, then, “consent might have to be specifically tailored to their level of understanding.”

Does Bill C-11 do enough to protect the vulnerable?

Section 15 of Bill C-11 deals with consent; specifically, subsection 15(1) states that “an organization must obtain an individual’s valid consent for the collection, use or disclosure of the individual’s personal information.”

However, as Scassa also notes, because Bill C-11 does not directly address children and young people, changes to the nature of valid consent “will make the law even less responsive to their data protection needs.”

In a recent critique of Bill C-11 and call for major revisions of the proposed legislation, Jim Balsillie, former chairman and co-CEO of Research In Motion (BlackBerry) and co-founder of the Centre for International Governance Innovation and the Centre for Digital Rights, also remarks that the “gutting” of the valid consent requirement is problematic: “The most troubling of all loopholes in Bill C-11 is that it strips the privacy rights of children and other vulnerable groups by gutting of the requirement for valid consent that is contained within our current privacy law. Because digital environments are inescapable for kids and because data intermediaries are tracking them without their consent, privacy regulations in the U.S. and Europe both encode explicit protections for minors with enhanced penalties for violators.”

Ryan Calo, professor at the University of Washington School of Law, highlights the power of algorithms to identify and intrude even without “identified” information. AI’s capacity for pattern recognition can infer extremely intimate information about consumers from their inputs, even when the data they share seems innocuous. Calo suspects that consumers will have little to no grasp of the consequences of sharing their information, while citizens will have no agency to resist surveillance. Without valid consent in Bill C-11, then, Canadians—especially children—will be left deeply vulnerable to pervasive (and even complete) government surveillance.

Jane Bailey, Jacquelyn Burkell, and Valerie Steeves, eQuality Project researchers, also argue that Bill C-11 “falls short of addressing the needs, aspirations, and realities of young people in a number of ways.” They note that in treating privacy as control over data, instead of as a human right or social value, the CPPA fails to consider how surveillance and algorithmic sorting shape young people’s digital experiences. Further, because algorithmic sorting weakens the capacity for informed consent, “a special rule for young people should be implemented—preferably one that prohibits collection of data from young people in certain circumstances or at least requires explicit prior consent for collection.” And lastly, Bailey, Burkell, and Steeves recommend that young people themselves be directly engaged in further deliberations when Bill C-11 moves into committee hearings, in keeping with Canada’s obligation under the Convention on the Rights of the Child to ensure that young people “have a say in decisions affecting them.”


Earlier guidance on children’s privacy

While Bill C-11 pays scant attention to the digital privacy of children, earlier guidance from the Office of the Privacy Commissioner did call attention to the importance of children’s privacy. In 2018, the OPC’s revised Guidelines for Obtaining Meaningful Consent included a specific section on consent and children, stating that special consideration should be extended to children given that it is “unrealistic to expect children to fully appreciate the complexities and potential risks of sharing their personal information.” As a result, in all but exceptional circumstances, parental or guardian consent is required for the collection, use, and disclosure of personal information from children under the age of 13. Even for minors above this threshold, the guidelines explicitly state that consent can be considered meaningful and valid only if their level of maturity has been taken into account in designing the consent processes.

Also in 2018, the Standing Committee on Access to Information, Privacy and Ethics (ETHI) acknowledged the need for special consideration of youth in its review of PIPEDA, Towards Privacy by Design. Among its 19 recommendations to the Government of Canada, the ETHI committee proposed adding privacy-by-design as a principle, implementing a right to erasure and a right to de-indexing (emphasizing the implications for young people), and introducing specific rules surrounding the consent of minors, with opt-in consent as a default. Recommendation 9 calls on the Government of Canada to “consider implementing specific rules of consent for minors, as well as regulations governing the collection, use and disclosure of minors’ personal information”—which could involve introducing a minimum age for consent, similar to the United States’ Children’s Online Privacy Protection Act (COPPA) and the EU’s General Data Protection Regulation (GDPR).

Balancing the risks and opportunities of the digital world

These regulatory efforts have had to navigate the dilemma of balancing the risks and opportunities the digital world affords children, while also taking into account children’s levels of maturity.

Parental consent is key to the earliest privacy legislation for children: COPPA, enacted in 1998 and intended to curb commercial marketing to children. Under COPPA, website operators are required to publish privacy notices and obtain verifiable parental consent for children under the age of 13. The required consent mechanism is assessed on a risk-based approach, on a gradient according to the intended purposes. For instance, services that use children’s data only for internal purposes may employ a lighter consent mechanism, such as sending an email to the parent and taking an additional confirming step after receiving the parent’s response (a method known as “email plus”). Services are deemed highest-risk if they disclose personal data to third parties, use behavioural advertising, or enable children to publicly post information; these must comply with more stringent mechanisms. A weakness of COPPA is that it applies only to information collected from children; if the information is merely about them—for instance, social media posts containing children’s photos—COPPA cannot address these privacy concerns.

Data protection initiatives specific to children

Child-specific rules on the use of children’s personal data have also been adopted by data protection regimes, notably the European GDPR, which took effect in 2018.

Under the GDPR, children enjoy the same rights over their personal data as adults. The GDPR imposes various obligations on data controllers and processors, requiring measures such as privacy-by-design, privacy-by-default, and other governance mechanisms when handling information related to data subjects (identifiable persons).

The GDPR also provides specific protection with regard to children’s personal data, set out in Recital 38:

Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child.

The GDPR also affords additional rights and safeguards to children: child-appropriate information (Article 12), protection against marketing and profiling (Article 6(1)(f)), especially regarding automated decision-making (Recital 71), a stricter right to erasure (Article 17(1)(f)), and prior parental consent (Article 8, conditions applicable to a child’s consent in relation to information society services), under which the processing of the personal data of children under the age of 16 is lawful “only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child,” although member states may set a lower age threshold, provided it is not below 13 years.

And most recently, more than 30 years after its adoption, the Convention on the Rights of the Child (CRC), an international treaty that recognizes that all children have universal human rights and delineates particular rights of children encompassing the principles of protection, provision, and participation (the “3Ps”), has been supplemented by a General Comment on children and the digital environment. General Comment No. 25 (2021) on children’s rights in relation to the digital environment accounts for a range of children’s digital rights, including access to information; freedom of expression and thought; the right to education and digital literacy; the right to culture, leisure, and play; protection of privacy, identity, and data processing; protection from violence, sexual exploitation, and other harm; and health and wellbeing.

Section E of General Comment 25, on the right to privacy, recognizes the significant importance of privacy for children’s “agency, dignity and safety and for the exercise of their rights” (para. 67, p. 11), and identifies threats to children’s online privacy from datafication (data collection and processing) by public and private organizations and businesses, identity theft, “sharenting” (the practice of parents publicizing content about their children online), and peer and social surveillance. General Comment 25 notes that routine practices can amplify threats to children’s right to privacy, including the increasing use of surveillance tactics such as behavioural marketing, automated data processing, mandatory identity verification, and the use of sensitive biometric data (para. 68, p. 11).

In keeping with the best interests of the child, data collection should adhere to the principles of minimization and proportionality (para. 69, p. 12). The Comment highlights the need for appropriate legislation governing organizations and environments that collect children’s data, recommending that privacy-by-design be built into children’s products and services (para. 70, p. 12). Consent needs to be “informed and freely given by the child” or their parent or guardian prior to data processing (para. 71, p. 12), and children, parents, or guardians should be able to easily access stored data, rectify erroneous information, or delete unlawfully collected information. Children also have the right to withdraw their consent where there are no legitimate grounds for collecting and processing their information, and terms of service for children, parents, and guardians should be designed and communicated in child-friendly language and in accessible formats (para. 72, p. 12).

General Comment 25 notes that children’s digital platforms and services, whether for entertainment, education, or care settings, are steeped in surveillance capitalism, and that any digital surveillance of children should “respect the child’s right to privacy and should not be conducted routinely, indiscriminately or without the child’s knowledge or, in the case of very young children, that of their parent or caregiver” (para. 75, p. 13). Indeed, the use of monitoring and tracking services and devices by parents and caregivers “should be proportionate and in accordance with the child’s evolving capacities” (para. 76, p. 13).

Having examined the landscape of past, current, and future legislation that aims to protect, though does not always fully succeed in protecting, the digital privacy of children and youth while facilitating their access to the benefits of the online space, we next turn to why digital privacy is important for children and youth, and how their experience of the online space—its threats, harms, benefits, and potential—has shaped their conceptions of privacy and their unique needs for both access and appropriate protections.

READ The past, present, and future of digital privacy for youth and children: Part II.

Editor’s note: Bill C-11 failed to pass when Canada’s federal parliament was dissolved in August 2021 to hold a federal election. In June 2022, many elements of C-11 were retabled in Bill C-27. Read our coverage of C-27.


About the authors


Leslie Regan Shade is a professor at the University of Toronto’s Faculty of Information and a faculty affiliate at the Schwartz Reisman Institute for Technology and Society. Her research examines the social and policy aspects of information and communication technologies (ICTs), with particular attention to issues of gender, youth, and political economy. Her work includes promoting public interest and popular discourse on ICT policy and an ongoing commitment to building participatory scholar-activist networks.

Monica Jean Henderson is a PhD student at the University of Toronto’s Faculty of Information exploring the relationships between democracy, citizenship, and technology at the intersection of digital policy and digital literacy. Her research projects also include topics such as algorithmic literacy, feminist game studies, and feminist information theory.

Katherine (Katie) Mackinnon is a PhD candidate at the University of Toronto’s Faculty of Information. Her research focuses on the history of the early web and its social, infrastructural, and policy dimensions, including early internet use by young people. She also studies public policy and social issues around the internet, youth action, and ethical approaches to web archives.

Acknowledgements: The authors thank the Social Sciences and Humanities Research Council (SSHRC) for their support of The eQuality Project. Earlier research was also conducted by Amandeep Singh.

