Canada’s Online Harms Bill is Dead (Again): Three Questions to Consider for the Next Round

Mandy Lau / Apr 28, 2025

Canada’s Bill C-63: The Online Harms Act died less than a year after it was tabled in Canada’s House of Commons, writes Mandy Lau.

A Canadian flag flying in Montreal. Shutterstock

Only a year after it was introduced, Canada’s Online Harms Act (Bill C-63) died on the order paper when former Prime Minister Justin Trudeau announced his resignation and prorogued Parliament in January 2025. Canadians have been here before: Bill C-63 marked the second time the Liberal Party pushed for online harms legislation right before the government dissolved. Since then, Mark Carney has taken over as Liberal Party leader and prime minister and called an election that is currently underway.

While online harms are not a current campaign priority, it is still worth considering what can be learned from these failed attempts. Here, I suggest three questions to consider, along with some of the issues raised at the Justice Committee’s pre-study of the bill, as we await election results.

Background

Bill C-63 was a legislative and regulatory framework to reduce seven types of harmful content on social media platforms: content that sexually victimizes a child or revictimizes a survivor, intimate content communicated without consent, content used to bully a child, content that induces a child to harm themselves, content that foments hatred, content that incites violence, and content that incites violent extremism or terrorism. The bill would have required social media operators to follow four duties: the duty to act responsibly, the duty to protect children, the duty to make certain content inaccessible, and the duty to keep records. It would have established a Digital Safety Office, including a Digital Safety Commission and a Digital Safety Ombudsperson. You can read an overview of the bill here.

It is unclear whether there will be a third attempt to pass online safety legislation under the next government. The most pressing election issue Canadians are wrestling with is Canada’s response to the hostile trade war and threats of annexation from United States President Donald Trump. Tech regulation has taken a back seat as all focus has turned to the economy. Of the two main political rivals, the Liberal Party and the Conservative Party, only the Liberals have included children’s protection from online sexual exploitation and extortion in their campaign platform, a much narrower and far less contested element of Bill C-63. Among the other political parties, the Bloc Québécois and the Green Party have vowed to make platform companies responsible for user-generated content.

It’s Groundhog Day: Three questions to consider for the next bill

1. Whose rights and freedoms will be prioritized?

The government’s decision to split the bill was partly a response to the controversy surrounding its online hate speech components. While there is broad consensus on protecting children, the provisions targeting “content that foments hatred,” “content that incites violence,” and “content that incites violent extremism or terrorism” are seen as restrictions on freedom of expression. During the pre-study, witnesses shared concerns that the bill’s duties and penalties would encourage platforms to over-moderate user-generated content and engage in proactive monitoring and mass surveillance. They also stated that adding hate speech as a discriminatory practice under Canada’s Human Rights Act would chill speech and lead to self-censorship, for example in the context of protest.

Moreover, the definition of hatred is said to be too broad and vague, and since the language borrows from the human rights tradition rather than criminal law, it may invite constitutional challenges. Witnesses suggested clarifying the regulatory exemptions for private and encrypted communication and adding a duty to protect freedom of expression. Only one witness raised the concern that not regulating hate speech limits the freedom of expression of those who are the targets of hate speech and the most marginalized. As a new bill moves along, it will be worthwhile to note whose rights and freedoms are prioritized and whose are minimized: which perspectives are valued and shared at news conferences, debates, and studies, and which are actioned in policy. I’ll be taking my cue from legal scholar Mary Anne Franks, who invites us to consider how the concept of free speech works to protect reckless speech that serves heteropatriarchy, capitalism, and white supremacy, and how we might reimagine whose “fearless” speech we ought to protect. For more, check out Justin Hendrix’s interview with Mary Anne Franks here.

2. How would new regulations be resourced?

During the pre-study, the opposition Conservative Party characterized the bill’s proposed Digital Safety Commission as “more bureaucracy” with overreaching powers that would cost taxpayers too much and take too long to be effective. Instead, Conservative members pushed for their own private member’s bill (Bill C-412), which is limited to children’s digital safety and leverages the courts rather than creating a new regulatory body. Some witnesses stressed that the current justice system cannot remedy the unique challenges of online harms, and that the courts are chronically under-resourced. They also emphasized that an effective and systematic legislative approach requires a new regulatory body with appropriate funding.

Any future bill’s efficacy will obviously hinge on the coordination of resources for any new or existing regulatory structures. But what stood out over the past year was the “bureaucracy-bashing,” a common feature of right-wing populist rhetoric. How has this latest round of bureaucracy-bashing undermined the implementation of future regulations and eroded public confidence in regulatory bodies? How would a newly elected government respond to potentially weakened public perceptions? For example, would there be processes to strengthen the independence of future regulatory bodies from political interference or to mitigate the risks of regulatory capture?

3. How would the regulatory scope expand or contract?

The pre-study highlighted an appetite for debating what ought to be covered under an online harms bill. In terms of additions, some witnesses suggested explicit responsibilities for social media companies around algorithmic accountability and transparency, design features that support the safety and well-being of children, privacy and freedom of expression protections for encrypted communication, and protections against proactive monitoring and mass surveillance. Others suggested narrowing the scope of harms to focus on children’s safety. Also up for debate is how to define a regulated service so that the legislation is nimble enough to cover emerging and future technologies.

Of course, lobbying from special interest groups will shape future bills. But I wonder specifically how transnational right-wing political framings would influence their regulatory scope. For example, in a letter to Alphabet CEO Sundar Pichai, US Congressman Jim Jordan (R-OH) framed Canada’s Bill C-63 as an “Orwellian thoughtcrime bill,” part of “foreign censorship efforts” and a new, direct threat to the American constitution. This discourse strengthens right-wing movements in Canada and has implications for Canada’s democracy. Even with Canada’s renewed focus on national identity and sovereignty, the reality of having the US as our largest trading partner is a pressure point for the next government.

What’s next

This brings us back to the current Canadian election, which takes place today. After that, it should become clear whether Canada will take a third shot at advancing online safety legislation. Watch this space.

Authors

Mandy Lau
Mandy Lau is a PhD candidate in Linguistics and Applied Linguistics at York University in Toronto, Canada. She is broadly interested in language policy and language ideology within digital culture. Her dissertation explores the regulation of harmful speech on social media. As a former public-school ...
