Parliamentary Committee Notes: Online Harm
Date: April 2022
Classification: Unclassified
Fully releasable (ATIP)? Yes
Branch / Agency: Canada Centre / Public Safety Canada
Proposed Response:
- Our Government remains committed to taking action against harmful content on social media platforms, including hate speech, terrorist content, and online child sexual exploitation. These types of harmful content can undermine public safety, Canada’s national security, and social cohesion.
- In concert with my colleagues, the Minister of Canadian Heritage and the Minister of Justice, we are working to introduce legislation and associated regulations to reduce the spread of illegal and harmful content, and to promote a safer and more inclusive online environment.
- Ensuring our security and law enforcement agencies are equipped to combat illegal and national security threat activity online is one of our foremost priorities.
- Our work in this area is consistent with the Government’s promise to keep Canadians safe online and aligns with our international commitments under the Christchurch Call to Action.
- Our approach to addressing harmful content and activity online will be balanced and targeted, and recognize the importance of freedom of expression for all Canadians.
Financial Implications:
- NIL
Background:
The Government of Canada Commitments to Combat Online Harms
In their 2021 mandate letters, the Minister of Canadian Heritage and the Minister of Justice were instructed to work with each other to: “develop and introduce legislation as soon as possible to combat serious forms of harmful online content to protect Canadians and hold social media platforms and other online services accountable for the content they host. This legislation should be reflective of the feedback received during the recent consultations.” Public Safety Canada (PS) is supporting the development of the regulatory and legislative framework to address online harms, ensuring law enforcement and national security agencies are properly equipped to combat them.
Online Harms Regulatory and Legislative Framework
Since 2019, the Department of Canadian Heritage (PCH), in conjunction with other Government departments and agencies, has been leading efforts to develop a new statutory framework for social media platforms. PS and Portfolio agencies – particularly the RCMP and CSIS – are directly implicated and continue to be engaged in the ongoing development of this new framework. Given its role in supporting the prevention of online radicalization to violence and its interaction with digital industry, PS’ Canada Centre for Community Engagement and Prevention of Violence (Canada Centre) has played a central role in informing the development of the framework and in coordinating input from within PS and the Portfolio.
The draft framework – now referred to as the “Online Safety Bill” – currently targets five categories of online “harmful content” on social media platforms’ public-facing services: (1) hate speech; (2) terrorist content; (3) content that incites violence; (4) child sexual exploitation content; and (5) the non-consensual sharing of intimate images. The proposed framework sets out a requirement for platforms to take proactive and reasonable measures to make harmful content inaccessible in Canada. The proposal also includes a new Digital Safety Commission to support three bodies that would operationalize the new regime: the Digital Safety Commissioner; the Digital Recourse Council; and the Advisory Board. However, these elements of the framework are subject to change following the recommendations provided by consultations with an expert advisory group.
From July to September 2021, the Government held a public consultation on this proposed framework, which received more than 400 submissions; the “What We Heard” report was made public in February. Stakeholders expressed support for taking action against online harms; however, they identified concerns about the limited opportunity to inform the framework’s development. These concerns were primarily focused on: possible over-censorship and negative implications for privacy rights; perceived vagueness of the requirements on industry; the perception that mandatory reporting to law enforcement would make social media platforms “surveillance” tools of the state; the potential for biased over-reporting of harmful – but not illegal – content; and views that the proposed approach is inconsistent with leading models in likeminded countries. In particular, there were concerns about compelling the removal of violating content within 24 hours of such content being flagged to the platform’s moderators. Stakeholders indicated that a 24-hour content takedown requirement risks over-censorship of legitimate and lawful content.
In response, Canadian Heritage, working with the Department of Justice and PS, has developed a way forward to signal that the Government heard the critical feedback and is open to reconsidering various elements of the proposal. Specifically, the reframed approach shifts away from a content-based regulatory scheme and towards a “systems-based” or “duty-of-care” model premised on platform responsibility and the regulation of procedures for countering harmful content.
The proposed reframing would rely on two main rounds of stakeholder engagement. The first involves consultations with a small group of selected experts to explore how best to reframe the legislative proposal and implement a “systems-based” or “duty-of-care” approach. On March 30, 2022, the Minister of Canadian Heritage launched an expert advisory group mandated to provide advice on a legislative and regulatory framework that best addresses harmful content online. The group is composed of diverse experts and specialists from across Canada who will contribute their knowledge and experience from a variety of fields. The expert advisory group will hold nine workshops to discuss various components of a legislative and regulatory framework for online safety, including a session focused on law enforcement’s role in a regulatory regime. The group will also take part in additional stakeholder engagement, including with digital platforms. Non-attributed summaries of all sessions and discussions are published on the PCH website.
[REDACTED]
Christchurch Call to Action
The Government of Canada is a signatory to the Christchurch Call to Eliminate Terrorist & Violent Extremist Content Online (Christchurch Call to Action). The Call is a commitment by governments – including Canada, Australia, New Zealand, France, the United Kingdom and the United States – and digital industry to coordinate and collaborate in efforts to eliminate terrorist and violent extremist content online. As a signatory, Canada committed to considering the establishment of regulatory or policy measures to prevent the use of online services to disseminate terrorist and violent extremist content, consistent with a free, open and secure internet and international human rights law. Therefore, the development of a regulatory and legislative framework to address online harms meets this commitment under the Christchurch Call to Action.
Contacts:
Prepared by: Hannah Delaney, Policy Analyst, 613-299-3270
Approved by: Jill Wherrett, PACB ADM, 613-939-6435