Ottawa takes second shot at overhauling Canada's consumer privacy laws

OTTAWA — Innovation Minister François-Philippe Champagne proposed a replacement for Canada’s 22-year-old private-sector privacy rules Thursday, introducing a bill that would let regulators seek the biggest fines in the G7 for companies that abuse consumers’ data, add new digital protections for kids and govern businesses’ use of artificial intelligence.

Talking Point

The Liberal government’s second attempt to overhaul Canada’s decades-old consumer privacy law includes new protections for children and responds to some critics’ concerns over exceptions to consent for businesses and rules for de-identified data. Bill C-27 also establishes a new regulatory system for organizations using artificial intelligence.

Bill C-27 is the Liberal government’s second attempt to update the laws governing the use of personal information and data, after its previous proposal expired with last year’s summer election call. But plenty of what’s in Thursday’s legislation is new. The changes come after the Liberals’ first effort faced criticism from regulators and digital rights groups. Here’s what you need to know about what the Liberals are proposing, and how they’ve tried to address those flashpoints.

What can businesses do with consumers’ data? 

The new Consumer Privacy Protection Act (CPPA) requires businesses to give consumers plain-language explanations about how their data will be used, so they can provide meaningful consent. But businesses won’t have to ask for permission for some processes that the legislation considers routine. 

Senior officials from Innovation, Science and Economic Development Canada (ISED) briefing reporters on the bill cited the example of a retailer sharing a customer’s address and contact information with a fulfillment service handling delivery of an e-commerce order. They also noted that ISED had narrowed the exceptions from the Liberals’ original bill, the former C-11 introduced in November 2020, based on stakeholder feedback. Such briefings are routinely conducted on the condition that participating officials not be named. 

Under the current Personal Information Protection and Electronic Documents Act (PIPEDA), which came into force starting in January 2001, the “predominant basis for collecting, using and disclosing data is individual consent,” said Carole Piovesan, managing partner at Toronto’s INQ Law. “That’s too burdensome on individuals. It’s very difficult to make it meaningful, because you’re asking for consent for everything.”

Under the CPPA, users will also be able to ask companies to transfer personal information they have collected about them to other firms. The portability requirements are identical to those in the former C-11. Those provisions left “enough room for companies to make it operational in a manner that makes sense to them,” according to Chantal Bernier, a former interim privacy commissioner. But she said consumers should only be allowed to transfer information they provide directly, not data that organizations generate by analyzing it. 

Executives at Canadian companies that PwC surveyed in the spring of 2021 about the proposed CPPA rules expected the data transfer and deletion requirements, along with changes to consent, to have the largest impact on their operations, said Jordan Prokopy, who leads the professional services firm’s Canadian privacy practice.

Do you have a right to privacy?

Regulators and influential figures in Canada’s innovation economy have called for the government to recognize privacy as a right—a broader protection than the country’s current rules, which focus on specific types of data and applications. “Technical protections, such as defining what information is required for meaningful consent, are often ineffective as they are regularly overtaken by developments in technology,” federal privacy commissioner Daniel Therrien said at an industry event in March 2020. “However, the values that underpin the right to privacy are unlikely to change significantly over time.”

Thursday’s bill describes “the protection of the privacy interests of individuals with respect to their personal information” as “essential to individual autonomy and dignity and to the full enjoyment of fundamental rights and freedoms in Canada.”

Therrien has cited corporate opposition as one reason the government had not previously heeded his recommendations to adopt a rights-based approach. There’s “a sector in society … that believes that stronger privacy protection is an impediment to innovation [and] economic growth,” he said in December 2021.

“Privacy rights are fundamental,” Philippe Dufresne, the government’s nominee to succeed Therrien, told a Parliamentary committee on Monday.

Who should police privacy?

The privacy commissioner would get the ability to recommend that firms breaking the law be fined as much as $25 million or five per cent of global revenues, penalties ISED said would be the highest in the G7. The commissioner would also get new powers to order firms to change their practices. But the bill also carries over C-11’s plan to create a Personal Information and Data Protection Tribunal, which would have the final say on monetary penalties. 

While Therrien had long called for more enforcement powers, he objected to the creation of the tribunal. Lawmakers risk “depriving consumers of quick and effective remedies,” he said, since companies worried they’ll be found in breach of privacy rules would likely take their case to the tribunal rather than settle. He noted that organizations can already appeal to the court system.

Expanding the commissioner’s powers is “essential,” according to Bernier, now counsel at law firm Dentons. “There is such profitability in the use of personal information that there has to be commensurate costs for [its] misuse.” But she backed Therrien’s concern about the proposed tribunal, saying a two-step enforcement system is “simply unnecessary and may undermine [its] effectiveness.”

Some privacy practitioners take the opposite view. Therrien has said he needs expanded powers to address “what he perceives to be a general lack of compliance,” noted David Fraser, a Halifax-based partner at McInnes Cooper, who has represented clients in investigations by the privacy commissioner’s office. But “he hasn’t used all the tools that he has in his toolbox yet, or he hardly uses them.” For example, the privacy commissioner’s office has rarely sought court orders to enforce its findings.

In a joint investigation concluded earlier this month, federal and provincial privacy commissioners found Tim Hortons’s app violated the law by tracking and recording users’ movements. Testifying at a Parliamentary committee the following day, Therrien cited the case as evidence of the need for the ability to make orders and issue fines. “But you look at what actually happened—Tim Hortons changed its practices,” said Fraser. “I would count that as a win.”

If the commissioner does get new powers, the government should move forward with its proposed tribunal to ensure “due process and fairness,” according to Fraser. “We should not have a situation where the prosecutor is also the judge.” He cited systems in competition and human rights law that are similarly separated.

How do you keep kids safe online?

Thursday’s new bill would let kids, or their parents or guardians, request that organizations tell them about any personal information they’ve gathered on them, explain how their data is being used and delete it. The legislation also treats the data of minors as sensitive, meaning firms have to meet higher security standards to use it. 

“We have all lived through COVID. We have all seen our children … spending more and more time, for example, on digital platforms,” Champagne told reporters Thursday, saying the protections for kids will be “the biggest legacy” of the bill. 

In a December 2021 interview with The Logic, Champagne said his meeting with former Facebook product manager Frances Haugen had influenced the revised legislation. “The issue about algorithms [and] how to protect children is going to be featuring in our reflection,” he said. In October, Haugen told a U.S. Congressional committee that the social media giant’s products harm kids and that Instagram’s algorithm steers them to unhealthy topics. Meta has disputed reporting based on documents she provided to media outlets.

Some policy experts and provincial lawmakers criticized the former Bill C-11 for failing to lay out how businesses should obtain consent for children’s data, and to what uses they could put it. Bernier favours adding clear language to privacy law. “To put in the act, right at the start, the parameters to apply for minors would yield better discipline in that regard [by] industry,” she said. Other privacy laws include such provisions. Quebec’s legislation requires parental consent for users under the age of 14. 

In September 2020, the U.K. Information Commissioner’s Office implemented a new code of practice for online services, requiring firms to ensure products for kids are age-appropriate, to minimize the amount of information they collect and to conduct data protection assessments. Canadian regulators are likely to increase their focus on kids’ privacy, “given the severity of the issues and the degree to which children are spending time online,” Piovesan said. 

The word “child” doesn’t appear in PIPEDA, Fraser noted. “But privacy practitioners know how to deal with children’s information.” Provisions specifying ages for collection and use of data risk “infantilizing a whole bunch of mature young people, because you’re taking away their rights to choice.”

How should we govern artificial intelligence?

Under the new rules, the government would designate some high-impact uses of AI, such as those that could put people’s health and safety at risk or produce biased outcomes. Firms building or using the technology would need to check whether their systems could have such impacts, as well as publicize their use and how the company is mitigating any risks. The government would be able to audit any system it reasonably suspects breaks the law.

The legislation would also establish an AI and data commissioner to advise the government on new standards, and support enforcement in “highly injurious cases,” ISED officials said. Those sanctions could include monetary penalties, which would be laid out in regulations to be published later. 

“I think we would be one of the first in the world to have [an] AI framework,” Champagne told reporters on Thursday. The EU proposed its own legal system for governing high-risk AI applications in April 2021.

It would be a significant regulatory expansion from the previous bill, which only required organizations to disclose and explain their use of automated decision-making systems. Digital rights groups have said the original bill didn’t go far enough. Bernier favours the model Quebec adopted in September 2021, which requires organizations to be transparent about their use of AI, give people access to the information such systems process and let them dispute automated outcomes. Those provisions address “both individual control over one’s personal information and the risk of algorithmic bias,” she said, citing the increasing use of AI in hiring and credit decisions.

Fraser said it’s important to let people challenge automated decisions about them and ensure such systems don’t discriminate on prohibited grounds like race or gender. But requiring companies to explain how an algorithm came to an outcome may not always be realistic. “In a whole lot of cases, it’s a relatively black box,” he said. He wants regulation to be proportional to the AI’s impact, focusing more on uses like loan eligibility than ad targeting.

What’s next

Most of the firms PwC surveyed in spring 2021 had already begun planning for the original legislation, Prokopy said. The lack of certainty around the new rules left “businesses in a bit of an awkward position.”

Over the last two years, several provinces have passed or proposed new privacy laws or amendments, citing the delay of and gaps in the federal bill. Lawyers and business groups have expressed concern about a patchwork of overlapping and incompatible rules across the country, which they claim would increase firms’ compliance costs. 

Champagne tabled the new bill just before Parliament’s summer break. The House of Commons is scheduled to rise next week and not return until mid-September. The Liberals will then have to push it forward alongside their other legislative priorities, including digital regulation like new rules for online streaming services.

This section is powered by The Logic. The Logic is Canada’s preeminent tech and business newsroom. For more news, visit thelogic.co.
