A New Direction for Canadian Privacy: Reviewing the New Bill C-27

It's an exciting time to be in Canadian privacy. Bill C-27, Canada's newly proposed privacy legislation, was introduced last week, and it contains some interesting changes from the original Bill C-11 proposed last year.

The Digital Charter Implementation Act, 2022 features three pieces of legislation: (i) the Consumer Privacy Protection Act (which would replace the current legislation, “PIPEDA”), (ii) the Personal Information and Data Protection Tribunal Act, and (iii) the Artificial Intelligence and Data Act.

In summary, the new or reinforced requirements for organizations would be the following:

  1. Use plain language when collecting consent so that the person “would reasonably be expected to understand” what organizations will do with their Personal Information;

  2. Clearly identify any third parties to whom Personal Information will be disclosed;

  3. Develop a privacy management program (which has to be made available upon request of the Office of the Privacy Commissioner), and outline in their privacy policies whether they (i) use automated decision making about individuals that could have a significant impact on them, and (ii) transfer Personal Information outside Canada or interprovincially in a way that may have foreseeable privacy implications for individuals;

  4. Enable individuals, in certain circumstances, to have their data securely transferred from one organization to another (“data mobility”);

  5. Treat the Personal Information of minors as “sensitive information”, noting that requests for deletion of such information would be subject to fewer exceptions than requests relating to the Personal Information of adults;

  6. Establish retention periods that consider the sensitivity of Personal Information, and inform individuals of these retention periods in their policies;

  7. Collect express consent for activities that are intended to influence behaviour or that would not be expected by a reasonable person;

  8. Erase an individual’s Personal Information upon request in the following situations: (i) when the individual has withdrawn their consent, (ii) when the Personal Information is no longer necessary, or (iii) when the information was collected in contravention of the Act;

  9. Where an automated decision system is used that has a significant impact on individuals, be able to explain, if an individual requests it: (i) the type of information that was used to make the decision, (ii) the source of that information, and (iii) the reasons or factors that led to the decision;

  10. Identify, assess and mitigate the risk of harm and bias in the use of high-impact Artificial Intelligence techniques (which will be further defined by regulations);

  11. Document their rationale for developing AI and report on their compliance with the safeguards the Act sets out; and

  12. Publish on a publicly available website a plain-language description of the AI system.

Although this seems like a long list of requirements, the proposal introduces some new flexibilities for organizations subject to the Act:

  1. There would be new exceptions to the obligation of obtaining consent, for example if organizations can prove that they have “legitimate interests” in the collection, use or disclosure of Personal Information, provided that they identify any potential adverse effects on the individual and take reasonable measures to reduce or mitigate them;

  2. Organizations can process anonymized data because it is expressly out of scope of the legislation (anonymized data being data that has been irreversibly and permanently modified so that no individual can be identified, whether directly or indirectly, by any means);

  3. De-identified information (i.e., information that is no longer directly identifiable but where there is still a risk that the individual could be identified) is defined as Personal Information and is therefore subject to the Act. However, de-identified data is not treated as Personal Information in certain cases, notably with respect to research;

  4. Organizations are not required to provide explanations of automated decision-making when the decision is not likely to have a significant impact on the individual;

  5. If a request for disposal is made, organizations may refuse to comply with it for new reasons, for example if the request is vexatious or made in bad faith, or if the information is scheduled to be disposed of in accordance with the organization’s information retention policy (provided that individuals are informed of the remaining period of time for which the information will be retained).

With respect to enforcement, the following is notable:

  1. The Office of the Privacy Commissioner (“OPC”) would be able to order companies to comply and recommend fines if they fail to do so;

  2. A tribunal with the power to make court orders would then review the recommendations and impose penalties;

  3. The most severe penalty would see non-compliant companies pay 5% of their worldwide revenue or $25 million, whichever is greater;

  4. A private right of action is introduced and would allow individuals to seek financial relief from Federal Court or a provincial superior court for various violations of the CPPA;

  5. An Artificial Intelligence and Data Commissioner would have the power to order independent audits of the activities of companies as they develop technology;

  6. The minister would have the power to register compliance orders with the courts; and

  7. Organizations developing AI could be prosecuted for using illegally obtained data, or where there is intent to cause serious harm or economic loss.

Many notions and principles require further clarification before the Act is finalized. With the House of Commons soon going on summer break, the bill is unlikely to be debated before the fall. We will follow up with more in-depth articles explaining the potential practical impacts for organizations.

Guilda Rostama

Guilda Rostama is a GDPR specialist. A fully qualified French lawyer, Guilda has a PhD in law and holds a Master of Law and Internet Technology from the Paris Sorbonne, an LLB from the University of Sheffield, United Kingdom, and the CIPP/C. Before moving to Canada in 2021, Guilda was a senior legal counsel in the Economic Affairs department at the CNIL (the French data protection authority) for more than four years. During her tenure at the CNIL, she was actively involved in building recommendations and guidelines for organizations implementing the GDPR. She also led the Social Media Expert subgroup within the European Data Protection Board (EDPB).
