Who decides? Consent, meaningful choices, and accountability

Schwartz Reisman Research Lead Lisa Austin, professor of law specializing in technology, including privacy and transparency, comments on aspects of Canada’s newly proposed privacy law reform. This piece is the first in a series of posts on the features, implications, and controversies surrounding privacy law reforms around the world in an increasingly digital and data-rich context.


Lisa Austin

Much of what Bill C-11—the federal government’s proposed new private sector privacy legislation—accomplishes is to replicate the protections of the Personal Information Protection and Electronic Documents Act (PIPEDA) while adding stronger enforcement powers. However, there are also some notable changes to the substance of PIPEDA, including two new types of exceptions to the “knowledge and consent” requirements (an individual’s knowledge and consent are required for the collection, use, or disclosure of their personal information). One of these exceptions is s.18, the new “business activities” section, and another is a set of exceptions for the use or disclosure of personal information that has been “de-identified” (ss.20, 21, 22, and 39).

View the Government of Canada’s proposed Bill C-11, first reading, November 17, 2020.

In this post I will comment on the “business activities” exception and in a subsequent post I will comment on the “de-identified” personal information exceptions. 

In general, I endorse the move to create these kinds of exceptions, and will outline why. However, there are also some deeply problematic aspects to these exceptions as drafted, so I offer here a deep dive into what some of these problems are; some provisions should be removed and others redrafted. Understanding why requires examining not only s.18 itself but also its relationship to other aspects of C-11. The other aspects I will specifically focus on are those that ensure what I am calling here “meaningful choices” (I will focus on the ideas of necessity and proportionality found in ss.12, 13, and 15) and accountability (I will focus on the transparency provisions in s.62 and the penalty provisions in s.93).

A key question regarding how we govern data flows is “who decides?”—who gets to decide what information is collected, how it is used, and how it is shared? The starting point of PIPEDA, and now C-11, is that the individual data subject gets to decide about their own personal information. This is why consent is the default requirement for collection, use and disclosure of personal information, from which exceptions are carved out. Contrast this with public sector privacy laws—like the federal Privacy Act or Ontario’s Freedom of Information and Protection of Privacy Act (FIPPA)—where the answer to “who decides?” is the government (subject to legislative constraints, the adequacy of which is another discussion entirely). 

There are at least two problems with consent in the private sector context. The first is whether it can be truly informed, given the abysmal failure of privacy policies to provide transparency to consumers. Bill C-11 tries to address this through outlining a number of transparency requirements in s.62, including a “plain language” requirement. While this is appealing, the assumption remains that consumers will read documents outlining transparency details. In a data age, we can do so much better than this. My colleague, fellow Schwartz Reisman Research Lead David Lie, and I have written about how the transparency of data flows could be improved—for consumers and regulators—if privacy policies were written in ways optimized for automation.

For example, machine learning techniques can be used to train a model that automatically reads and classifies the information in a privacy policy, and that model can then power a variety of additional tools, from auditing tools that assist regulators to visualization tools that assist consumers. Currently the accuracy of such models is hampered by problems with the policies themselves. One suggestion I have for the federal government, therefore, is to add a provision to the list of transparency requirements in s.62 that says “any other prescribed requirement.” This would provide a pathway for the government to create regulations that could help standardize policies in at least some sectors in order to better facilitate the development of important transparency tools.
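To make the idea concrete, here is a minimal sketch of such a classifier, assuming scikit-learn and a small hand-labelled set of policy clauses. The clauses, labels, and categories below are invented for illustration only; real work would use a large annotated corpus (such as the OPP-115 dataset) and likely more sophisticated models.

```python
# A minimal sketch of a privacy-policy clause classifier.
# Assumes scikit-learn; the training data below is invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled clauses, each tagged with the data practice it describes.
clauses = [
    "We collect your name, email address, and postal address at checkout.",
    "We share your information with advertising partners and analytics providers.",
    "You may request deletion of your account data at any time.",
    "Your payment details are collected in order to process your order.",
    "Third parties may receive aggregated usage data from us.",
    "Contact our privacy officer to access or correct your personal information.",
]
labels = [
    "first-party-collection",
    "third-party-sharing",
    "user-rights",
    "first-party-collection",
    "third-party-sharing",
    "user-rights",
]

# TF-IDF features with a linear classifier: a standard baseline for
# short-text classification before reaching for larger language models.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(clauses, labels)

# Classify unseen policy text; auditing tools for regulators or
# visualization tools for consumers could be built on such predictions.
new_clauses = [
    "We disclose browsing history to our marketing partners.",
    "Users can ask us to erase the data we hold about them.",
]
for clause, predicted in zip(new_clauses, model.predict(new_clauses)):
    print(f"{predicted}: {clause}")
```

The point of standardizing policies, as suggested above, is precisely to make the inputs to models like this more uniform, so that their classifications become accurate enough to support auditing and visualization at scale.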

The second problem with consent is that organizations often place individuals in the position of giving consent on take-it-or-leave-it terms. Consent in this case could be informed, but the organization effectively gets to dictate the terms. For example: “If you want to order this item and have it delivered, then you must agree to have your personal information shared with multiple third parties for a variety of purposes.” The consumer is not given a meaningful choice in relation to data flows.

Meaningful choice will not come about through strengthening consent. It requires a different set of provisions, usually reflected in ideas of necessity and proportionality. Constitutional lawyers and human rights lawyers are familiar with these ideas, but they are also present in various forms of data protection law. Data minimization—limiting the types of data collected to only what is necessary for a specific purpose—is an idea of necessity. Provisions that pay attention to the sensitivity of information or seek to assess the potential impacts of its use utilize the idea of proportionality. 

For example, s.15(5) of C-11 states: “The organization must not, as a condition of the supply of a product or service, require an individual to consent to the collection, use or disclosure of their personal information beyond what is necessary to provide the product or service.” (This is also found in PIPEDA.) There are other provisions in C-11 that also help to ensure meaningful choices—especially those under “Appropriate Purposes” (s.12) and “Limiting Collection, Use and Disclosure” (ss.13 and 14). All of these provisions largely replicate PIPEDA, with some differences. I will discuss these in detail in a moment, but here I just want to emphasize that a consent-based regime needs resources beyond consent in order to protect privacy.

But what about exceptions to consent? How should we think about these?

Imagine if a consumer wanted a business to provide home delivery of an online purchase but did not want to provide that business with their home address. In that case, because the address is necessary, the business can require consent (s.15(5)). Requiring consent is a rather odd idea here, as it is equivalent to saying: if you choose this transaction then you are taken to choose the necessary data flows associated with it. The idea of consenting to the data flows, rather than the transaction, does no additional work and is a fiction. The fact is that neither the consumer nor the business gets to unilaterally decide about those data flows even though the transaction remains voluntary. The idea of “necessary,” as interpreted by regulators and courts, is what is protective here—public norms subject to public standards of justification. (See my articles on consent and PIPEDA and on privacy and power for why consent is not necessarily as central to privacy protection as some think.)

With this in mind, I want to look at the new “business activities” exception to consent in C-11.

S.18(1) allows an organization to collect or use personal information without an individual’s knowledge or consent where the collection or use is for a “business activity” described in s.18(2), where it is reasonably expected, and where it is not for the purpose of “influencing the individual’s behaviour or decisions.”

Three preliminary points. The first is that outlining explicit exceptions to consent rather than requiring organizations to rely upon a highly flexible and frankly fictitious notion of “implied consent” (PIPEDA’s model) is a good thing. Europe’s General Data Protection Regulation (GDPR)—thought of by many people as the gold standard of data protection law—has six grounds for the legitimate processing of data, of which consent is only one. There is no reason to think that strong privacy laws require consent to every data flow. 

The second is that these exceptions need to be wedded to a strong transparency and accountability framework. Here is one of the places where the current drafting is problematic. The transparency provisions in s.62(2) require organizations to provide information on how the organization “applies the exceptions to the requirement to obtain consent” (s.62(2)(b)). But the exception is to both knowledge and consent. So, putting this together with s.18, an organization has to give a general account of the unconsented-to data flows, but this does not need to meet the level of “knowledge,” although the collection or use does have to meet a test for reasonable expectations. 

This is confusing. S.18 needs to drop the language of “knowledge” and make the exception to consent only.

The third is that if C-11 outlines new exceptions to consent then it should get rid of implied consent in the consent provisions. This is what the GDPR does. Instead, C-11 retains implied consent and places the onus on businesses to establish that implied consent is “appropriate.” There is also nothing in s.62’s transparency requirements that requires specific reporting on the use of implied consent, so it is entirely unclear how reliance upon implied consent will even be detected, let alone enforced.

What about some of the specific exceptions?

S.18(2)(a) states that no consent is required for “an activity that is necessary to provide or deliver a product or service that the individual has requested from the organization.” This is reasonable for the same reasons that s.15(5) is reasonable, although it does raise the question of the relationship between these two provisions. Ss.18(2)(c) and (d) provide an exception to consent where an activity is necessary for security or safety purposes. This also seems right, but on a different basis—decisions regarding security and safety should not be made by individual consumers, as they have effects on, and are in the interests of, all consumers.

However, s.18(2)(e) is extremely problematic and should be removed entirely. It allows an exception to consent for “an activity in the course of which obtaining the individual’s consent would be impracticable because the organization does not have a direct relationship with the individual.” Our current complex data ecosystem has so many third parties with no direct relationship with the individual that this opens up an unacceptably gaping sinkhole in the legislation. Even in other contexts where the consumer does not get to decide about data flows, the transaction is either voluntary or has some kind of democratic or judicial authorization. S.18(2)(e) says that companies get to decide whenever it is inconvenient to have individuals decide—regardless of the voluntariness of the transaction and without some form of public authorization. Cambridge Analytica, Clearview AI, analytics companies, and data brokers everywhere—this seems to be for you.

I doubt that was the intention, and it might be that the drafters believe that s.18(1)(a) and (b) will be sufficient to deter abuse of this provision—these are the requirements that a reasonable person would expect the collection or use, and that it not be for the purpose of influencing the individual.

There are three problems with this.

First, and perhaps most damning, is that a violation of s.18 cannot attract any of the hefty new penalties available under C-11. According to s.93, only some violations of legislative obligations can result in the new penalties, and s.18 is entirely excluded. Second, the idea of “reasonable expectations” is notoriously slippery. It could refer to whether individuals actually expect something, which does not deter bad behaviour, since individuals can come to expect organizations to engage in problematic practices. It could mean something more normative, although then the relationship between this and s.12’s “appropriate purposes” provision is unclear (more on s.12 momentarily). Third, organizations can engage in profiling that is not meant to influence the individuals from whom they collected the data but that might be used to influence other people in the future. Section 18(1)(b) does not capture this.

S.18(2)(b) is also problematic. It allows an exception to consent for “an activity that is carried out in the exercise of due diligence to prevent or reduce the organization’s commercial risk.” “Commercial risk” is too broad and vague a category and can encompass a large number of commercial interests. It therefore effectively says: companies can decide on data flows that are in their own commercial interest. In contrast, the GDPR allows for the lawful processing of data where necessary for a legal obligation (Article 6). If it is a legal obligation then it is not the company who decides, but ultimately the public (through the legislature or the courts). S.18(2)(b) should be removed entirely or replaced with something like the GDPR legal obligation provision.

What about the other provisions that help create meaningful choices?

I have already mentioned that it is not necessarily the consent provisions that provide consumers with meaningful privacy choices. In C-11, s.12 (purposes must be appropriate) and s.13 (collection must be limited to what is necessary) are crucially important. These largely replicate PIPEDA, or case law interpreting PIPEDA, with a few significant changes. However, I think that, as drafted, these provisions are problematic. 

Some of those problems stem from what looks like hasty drafting. PIPEDA always had a problematic structure because it chose to adopt the Canadian Standards Association’s Model Code into the legislation as a schedule even though it did not conform to legislative drafting norms. It then added additional provisions into the legislation and then left all of this to the courts to interpret, a task that the Federal Court has indicated requires “flexibility, common sense and pragmatism” rather than the regular legal methods of interpretation. Adopting elements of PIPEDA requires thought if we are not to repeat this mess. Ss.12 and 13 have overlapping considerations pertaining to data minimization and should be redrafted for clarity.

That might seem like a quibble but it is connected to deeper problems. One of those deeper problems is the fact that it is only the violation of s.13 that can lead to a penalty under C-11 and not the violation of s.12. One reason this is problematic is that s.13’s data minimization requirement only applies to the collection of personal information, not its use or disclosure. It is s.12’s “appropriate purposes” provision that covers collection, use, and disclosure, and also includes a consideration of whether there is a “less intrusive means” of meeting the organization’s purpose. But this is a “factor” to consider rather than a requirement, and it is qualified by “at a comparable cost and with comparable benefits.” So, if we put these two provisions together we get the following:

  • An organization is required to limit data collection to what is necessary, under pain of potentially hefty penalties.

  • An organization should consider limiting data use and disclosure to what is necessary, but only if doing so comes at a comparable cost and with comparable benefits; in any event, failing to do so will not be penalized.

Section 12 also includes proportionality factors that apply to the assessment of purposes. For example, an organization’s purposes should be “legitimate business needs,” the “sensitivity” of the information needs to be considered, and the potential loss of privacy should be proportional to the benefits (taking into account data minimization efforts). Having an “appropriate purposes” provision (PIPEDA also has this) provides an important substantive review of data practices and, along with data minimization requirements, can be more protective of consumer privacy than consent provisions. Leaving this out of the new penalty regime is short-sighted.

Now is not the time to water down the elements of privacy laws that provide consumers with meaningful choice. I appreciate that the government is mindful of the needs of small- and medium-sized enterprises and is seeking a balance in this legislation. The worry about regulatory burdens can be better met through developing other tools—like the codes of practice and certification programs that C-11 also provides for (ss.76-80)—that reduce complexity and risk for organizations without compromising consumer privacy. In another post on the exceptions to consent for personal information that has been de-identified, I will discuss these tools in more detail.

Editor’s note: Bill C-11 failed to pass when Canada’s federal parliament was dissolved in August 2021 to hold a federal election. In June 2022, many elements of C-11 were retabled in Bill C-27. Read our coverage of C-27.

