How will Bill C-27 impact youth privacy?

 

Does Canada’s newly proposed Consumer Privacy Protection Act sufficiently promote and protect the digital identities of youth? Guest contributor Michael J. S. Beauvais and SRI Faculty Affiliate Leslie Regan Shade contend that the draft legislation takes a thin view of consent and choice requirements for online services. Photo: Bruce Mars/Unsplash.


Privacy is essential for relationships with others, for agency, and for the development of the self. For young people today, privacy matters more than ever because their social lives and private experiences increasingly involve the processing of personal information by powerful corporations, third parties, and a burgeoning “surveillance advertising” industry. Children are confronted with surveillance, targeted advertising, algorithmic decision-making, and other techniques designed to influence their choices. At the same time, the ability to experiment with new identities in childhood, adolescence, and adulthood may be inhibited in an age marked by “the end of forgetting.” Data protection laws have the potential to be a bulwark against invasive surveillance, especially by commercial entities, and to put citizens on a more level playing field with those who process their information.

With this in mind, we focus on the provisions of Bill C-27, and specifically the draft Consumer Privacy Protection Act (CPPA), that affect youth. The draft legislation contains a number of youth-centric provisions, from data retention to capacity to data sensitivity. The CPPA uses the language of “minors,” a term the legislation does not define and one that is defined differently across federal, provincial, and territorial legislation. That the CPPA addresses data protection issues and youth at all is a welcome development. Indeed, the youth provisions have been touted by Innovation Minister François-Philippe Champagne as the bill’s “biggest legacy.”

The bar here is low—the current federal, private-sector data protection law, the Personal Information Protection and Electronic Documents Act (PIPEDA), does not mention children or youth at all. (We do note that the Office of the Privacy Commissioner of Canada (OPC) has dealt with some youth-related issues, such as requiring parental consent for the processing of personal information of children under 13 years of age.) But do the youth provisions live up to the political hype? We believe that the text of the bill does not do enough to empower young people while protecting their personal information. If we want to leave a meaningful legacy for Canadian children and adolescents, more work is needed.

 
Michael J.S. Beauvais and Leslie Regan Shade

Michael J.S. Beauvais, a doctoral candidate at the University of Toronto’s Faculty of Law, and SRI Faculty Affiliate Leslie Regan Shade, a professor in the Faculty of Information, argue that clearer and more stringent rules around data privacy are needed to sufficiently protect the interests of Canadian youth.

 

International context

The CPPA is a first step toward Canada’s federal, private-sector data protection regime joining those of other jurisdictions in creating special rules for processing young people’s personal information. Children have a right to privacy under international law. With the ever-increasing datafication of children, their privacy is of increased interest to policymakers, legislators, and parents around the globe. Looking south, the United States has the Children’s Online Privacy Protection Act (COPPA) and the associated Federal Trade Commission rule, which regulates online services that process the personal information of children under the age of 13. Across the Atlantic, the European Union’s General Data Protection Regulation (GDPR) is a comprehensive data protection law, covering both public and private entities, which contains interpretive guidance (recitals 38, 58, 65, 71, 75), a specific regime for children’s consent to information society services such as social media (article 8), and other child-centric provisions (articles 12, 40, and 57).

All individuals benefit from the CPPA’s provisions, and youth benefit from certain special provisions that take precedence over those for adults. From this perspective, the CPPA is more closely aligned with the GDPR than with COPPA, and it fits more generally with Canada’s desire to maintain its adequacy status under the GDPR for easy transfers of personal information from the European Economic Area to Canada.

Children’s information as “sensitive” information

In our opinion, the most sweeping change in the proposed legislation is the provision deeming the personal information of minors to be “sensitive” (s. 2(2)). While the OPC has considered children’s data to be sensitive for years now, expressly including this approach in the legislation further buttresses the protections from which children benefit. Deeming minors’ information to be sensitive affects several aspects of how organizations can handle young people’s data; the general idea is that processing young people’s data requires additional care.

A child has a more robust claim than an adult to have their personal information disposed of. (But, given the paltriness of the “right to erasure” in the CPPA, this is not saying much.) When a minor requests that their information be deleted, an organization cannot deny the request on the basis that retaining the information is necessary to provide a product or service to the individual (s. 55(2)(d)), nor on the basis of its information retention policies (s. 55(2)(f)). We note that the sensitivity of information must also be taken into account when an organization establishes its retention policies (s. 53(2)). The sensitivity of information moreover informs the level of protection that security safeguards must provide (s. 57(1)). Similarly, when determining the adequacy of technical and organizational safeguards for de-identifying personal information, the sensitivity of the information is meant to inform the analysis (s. 74). Hence, young people’s data protection interests weigh more heavily than those of adults.

Clearer and more stringent rules needed

As with its predecessor PIPEDA, the CPPA allows an individual’s consent to be implied where a collection, use, or disclosure fits within their reasonable expectations, taking into account the sensitivity of the information (s. 15(5)-(6)). Likewise, the appropriate purposes test requires that the sensitivity of information be considered (s. 12(2)(a)). However, now that youth are considered in the draft legislation, the implications of these provisions are unclear. First, it is not specified whether the reasonable expectations at issue are meant to be those of the minor, nor how such expectations are to be determined. To this end, we note that children’s privacy expectations can differ from those of adults. Second, the sensitivity of the information is also relevant for this test, which generally suggests that the circumstances in which consent is not required because of the individual’s reasonable expectations should be considerably reduced in scope. The provisions would benefit from greater clarity.

We further note a central ambiguity in the CPPA: what counts as a minor’s information? The obvious ambiguity is that the draft legislation does not define a minor. A further ambiguity arises when we consider personal information over time. If an organization holds information about an individual that was collected when they were 12 years old, but that individual is now 25 years old, is that information still considered to be that of a minor? If not, then the aforementioned protections disappear upon reaching the age of majority, even if the information at issue relates to a minor’s life. This could result in data sets about formerly minor individuals that receive fewer safeguards, even though the data were collected under circumstances in which those individuals had few opportunities to decide matters for themselves.

The phenomenon of “sharenting,” whereby parents and other caregivers share many aspects of their children’s lives—often without the latter’s consent—is one of many instances where the digital footprints of today’s children are outside of their control. Schools’ increased reliance on data-intensive EdTech platforms and services, which the COVID-19 pandemic only exacerbated, is another example of the power and decisional imbalances today’s youth face. Clarifying that the information of a minor also includes information collected when an adult was a minor would help correct for these imbalances.

While the CPPA’s draft legislation aims to create flexibility for organizations, especially where the information at issue and the processing context are highly variable, both clearer and more stringent rules for youth data are needed. For example, when it comes to mandatory reporting of security breaches to the Commissioner, the sensitivity of information must be taken into consideration in determining whether “the breach creates a real risk of significant harm to an individual” (s. 58(8)(a)). Putting aside whether such a high threshold is acceptable, we believe that breaches of safeguards of services that target youth should always be reported to the Commissioner.

“In not defining the preconditions for exercising digital agency, Bill C-27 creates the risk that organizations will end up defining for themselves the boundaries of children’s agency.”

 

Image: Tamarcus Brown/Unsplash.

 

Capacity and age of majority

The CPPA also furnishes minors with the legal ability to exercise their rights and recourses under the Act (s. 4). This presumably includes powers (e.g., the ability to consent to information collection, use, or disclosure). To do so, the minor must be willing to exercise the rights and recourses on their own and be “capable of doing so” (s. 4(a)). We welcome the emphasis on the need for a minor to be willing, as the experience and readiness of youth to exercise such rights may vary widely. This approach also coheres with the Convention on the Rights of the Child’s right to be heard, which gives a minor the choice not to participate in matters affecting them. Indeed, young people consulted for the United Nations Committee on the Rights of the Child’s General comment No. 25 (2021) on children’s rights in relation to the digital environment affirmed that digital spaces were key “opportunities for their voices to be heard in matters that affected them” (III, D, 16).

The CPPA should make explicit that in cases of differing opinions between minors and their legally authorized representatives (e.g., parents, tutors) about the exercise of rights and recourses under the Act, the wishes of the competent minor are to prevail. (This is something that Ontario’s health data protection law has already made clear.) What needs further reflection is how we should think about minors’ “capacity” to exercise this form of digital agency. The draft legislation neither defines the elements of capacity nor empowers the government to issue regulations to better define it.

Definitions of capacity within data protection law exist elsewhere. For example, Ontario’s health data protection law defines capacity for consent to the collection, use, or disclosure of personal health information to hinge on an individual’s ability “to understand the information that is relevant to deciding whether to consent to the collection, use or disclosure, as the case may be; and… to appreciate the reasonably foreseeable consequences of giving, not giving, withholding or withdrawing the consent.”

However, transposing this capacity test outside of the health context can be difficult for a number of reasons. First, while health information custodians will likely be able to consult the clinicians who form the child’s circle of care to determine data protection capacity, commercial entities will rarely know the child as well and will therefore be far less capable of determining capacity. Second, the nature and consequences of big data processing techniques such as machine learning escape the understanding of many adults and children alike. Indeed, there is a lack of research on children’s capacity to consent to data processing in online contexts.

In not defining the preconditions for exercising digital agency, Bill C-27 creates the risk that organizations, and not democratically accountable legislatures or the executive, will end up defining for themselves the boundaries of children’s agency. Organizations may indeed overestimate children’s capacity to consent in order to legitimize suspect activities such as behavioural advertising and cross-site tracking.

“We believe the Act should offer stronger protections for youth that are justified not only on the basis of what a reasonable person expects, but also on Canadian values of democratic citizenship.”

Limits of capacity and of consent

The focus on capacity is an important one, but it also has its limits. Capacity is the bedrock of individual autonomy—an exercise of choice is only meaningful (and thus carries normative weight) where it can be understood to be an individual’s “own.” Where data processing is concerned, though, there should be limits on what individuals are capable of consenting to. The OPC already has “No-Go Zones” based on PIPEDA’s requirement that “an organization may collect, use or disclose personal information only for purposes that a reasonable person would consider are appropriate in the circumstances” (s. 5(3)).

The “No-Go Zones” are purposes for which data processing is not permitted in all but exceptional circumstances, because a reasonable person would not consider them appropriate. These include otherwise unlawful data processing and profiling or discrimination that is contrary to human rights law. The CPPA has an analogous provision (s. 12) and so still gives the OPC an anchor for developing the “No-Go Zones.” Yet we believe that the Act should offer stronger protections for youth—as well as for adults—that are justified not only on the basis of what a reasonable person expects, but also on Canadian values of democratic citizenship. Treating children as citizens, and not as mere consumers on platforms, could mean prohibiting the profiling of minors by commercial entities or even banning behavioural advertising.

More consideration is also needed for when youth reach the age of majority. If an organization relies upon consent given by parents to legitimate the ongoing collection, use, disclosure, or retention of personal information, re-consenting at the age of majority should be mandated. (This approach is already used for health research in Canada.) Doing so would allow young people to decide whether they would like their personal information to continue to be processed by organizations and offers the potential for a “blank slate” upon reaching the age of majority. We also note that the pervasiveness of data collection in children’s lives today makes it unfeasible to expect individuals to maintain a list of every entity that holds their data, whereas entities can easily contact children around the time at which they are no longer minors.

The biggest gap in the CPPA draft legislation is in the field of age-appropriate design codes, which are among the most promising avenues for protecting young people’s data. These codes guide the design of the digital environments in which young people interact. The UK’s Age-Appropriate Design Code is a leading example in this area, and California’s Age-Appropriate Design Code Act has just been enacted. While the Office of the Privacy Commissioner appears to have the power to develop a design code (s. 110(1)(b)), it is not under any obligation to do so. (Comparatively, the UK’s Information Commissioner’s Office was required by law to develop its design code.) Instead, the CPPA only requires organizations to provide information about their information practices in plain and age-appropriate language when their services are directed at minors (s. 15(4)). In effect, the draft legislation takes a thin view of what consent and choice in online services require: organizations need only tell young people how their information will be used, instead of designing the services themselves to cater to young people’s interests.



About the authors

Michael J. S. Beauvais is a doctoral candidate at the University of Toronto’s Faculty of Law. His doctoral research on children’s privacy from their parents brings together thinking on privacy, children’s agency, parental authority, and information-communication technologies. He also writes about data protection law and its interaction with health-related research.

Leslie Regan Shade is a professor at the University of Toronto’s Faculty of Information and a faculty affiliate at the Schwartz Reisman Institute for Technology and Society. Her research examines the social and policy aspects of information and communication technologies (ICTs), with particular concern for issues of gender, youth, and political economy. She is a co-investigator on The eQuality Project, a SSHRC partnership grant that explores young people’s experiences of privacy and equality in networked environments.

