Session
Organizer 1: Charlotte Altenhöner-Dion, Council of Europe
Organizer 2: Małgorzata Pęk, Council of Europe
Organizer 3: David Reichel, European Union Agency for Fundamental Rights (FRA)
Organizer 4: Martha Stickings, European Union Agency for Fundamental Rights (FRA)
Organizer 5: Jana Gajdosova, European Union Agency for Fundamental Rights (FRA)
The final list of speakers is:
Matthias Kettemann, Leibniz-Institute for Media Research | Hans-Bredow-Institut
Louisa Klingvall, European Commission
Laëtitia Avia, Member of the French National Assembly
Saloua Ghazouani Oueslati, Article 19 Tunisia and the MENA region
Victoire Rio, Myanmar Innovation Lab
Alex Walden, Google
Martha Stickings, European Union Agency for Fundamental Rights (FRA)
David Reichel, European Union Agency for Fundamental Rights (FRA)
Round Table - U-shape - 90 Min
- What solutions to address hate speech online are sustainable, proportionate, comprehensive and bring accountability to all the responsible parties? How can respect for human rights and the rule of law be incorporated in such solutions?
- How can regulatory solutions, such as laws requiring the removal of hate speech online, and non-binding measures, such as guidelines, be balanced and better complement each other? What is the role of initiatives such as the EU Code of conduct on countering illegal hate speech online? How can regulation to protect human rights be effective in the global online environment?
- How can the wide range of stakeholders better work together to address hate speech online? Who is responsible for determining what content should be removed?
- How do we balance the need to remove hate speech with protecting freedom of expression? How should we define the role of automated means in tackling hate speech online?
GOAL 5: Gender Equality
GOAL 9: Industry, Innovation and Infrastructure
GOAL 10: Reduced Inequalities
GOAL 12: Responsible Consumption and Production
Description:
Hate speech pervades the internet. At different times racist, antisemitic, Islamophobic, homophobic or sexist, it is a phenomenon that knows no boundaries. Whether driving polarisation or prompting long-term psychological harm among its victims, the consequences of online hate can be devastating. Striking at the core of human dignity, online hate speech impacts a wide range of human rights, from privacy, data protection and freedom of expression, to effective remedy, non-discrimination, freedom to conduct a business and victims’ rights.
Addressing hate speech online demands a concerted and comprehensive rights-based approach. Through an interactive multi-stakeholder discussion, this session aims to identify some of the key elements of a framework to effectively and efficiently combat hate speech online and ensure that human rights are protected. It will offer an opportunity to reflect on the role of different actors, possible approaches to regulatory solutions, and the place of on- and offline actions to tackle hate speech online.
The roundtable will consist of brief opening interventions by the subject matter experts (approx. 30 mins) to highlight the instruments they have developed and are working with to address hate speech online, followed by a discussion with and between other participants:
- Moderators - Martha Stickings (EU Agency for Fundamental Rights) and Charlotte Altenhöner-Dion (Council of Europe): introduce the subject matter experts, explain the discussion topic and highlight the key human rights issues at stake.
- Matthias Kettemann, Leibniz-Institute for Media Research | Hans-Bredow-Institut: setting out the key components of a clear, rule of law-based framework for detecting illegal online hate speech, with a particular focus on the Council of Europe Recommendation on the roles and responsibilities of internet intermediaries.
- Louisa Klingvall, European Commission: reflecting on the role of voluntary codes of conduct and how different stakeholders (regional organisations, business, civil society) can work together.
- Laëtitia Avia, Member of the French National Assembly: presenting the approach set out in the proposed French legislation to combat online hate speech, for which she is the rapporteur.
- Saloua Ghazouani Oueslati, Article 19 Tunisia and the MENA region: defining the main pitfalls of current regulatory approaches to online hate speech.
- Victoire Rio, Myanmar Innovation Lab: discussing the specific situation in Myanmar and Facebook’s responses to violence-inciting messages spreading across the platform in that country.
- Alex Walden, Google: highlighting the steps major tech companies are taking to tackle hate speech online, and the respective roles and responsibilities of states and internet companies.
To support practical outcomes and substantive policy discussions, subject matter experts will be provided with a set of guiding questions prepared by the organisers. These will ensure that each of the key policy questions is addressed. Discussion during the session will be facilitated by keeping the opening interventions short, leaving the bulk of the session for an exchange of questions and ideas with and between the walk-in participants and speakers. Speakers will be encouraged to respond to each other’s interventions and to those of the audience.
Expected Outcomes:
Discussions are underway at the national, regional and international levels – as well as with and among business and civil society – about how best to tackle the phenomenon of hate speech online.
This session will contribute to ensuring that human rights considerations are hardwired into legal and policy debates by identifying some of the key elements that any regulatory regime or voluntary initiative needs to take into account. Participants will gain insight into existing instruments to address hate speech online and learn about the roles that different actors in the process can play.
At the outset of the session, the moderators will introduce some key questions to the audience, encouraging them to reflect on these during the opening interventions by the subject matter experts and to contribute their ideas and suggestions during the discussions. Throughout the session, the moderators will proactively reach out to walk-in participants, encouraging them not only to ask questions but also to share their own ideas and experiences. Speakers will be briefed clearly on the format and encouraged to put their own questions to each other and to other participants.
Relevance to Theme:
Combating the various forms of hate speech online and minimising its negative individual and societal impact is a task fraught with difficulties. In addition to raising profound human rights concerns, online hate speech raises questions of how to establish effective instruments in a global online environment that crosses jurisdictions and how, practically, to deal with the huge quantity of content constantly uploaded to the internet. Identifying and addressing hate speech online requires collaboration among a wide range of stakeholders across disciplines.
This session highlights the human rights issues at stake and brings together different actors to discuss how they can work together to develop effective tools to respond to the threats posed by hate speech online.
Relevance to Internet Governance:
The challenge of addressing hate speech online showcases the multi-stakeholder nature of internet governance. Determining what constitutes online hate speech is the responsibility of governments, in line with their human rights obligations, and subject to scrutiny and enforcement by the justice system. However, the global nature of the internet and content platforms, combined with the volume of potentially hateful content online, means that internet companies are essential actors. Transforming established human rights norms and principles into actionable rules to protect rights online is emerging as a core challenge for internet governance.
Usage of IGF Tool
Proposed Additional Tools: The co-organisers will actively promote the session on social media, encouraging remote participation and exchanges on the issues raised during the discussion. Remote participants will be able to pose questions to subject matter experts and other participants during the session. A dedicated hashtag will be created, digital promotional materials will be published on the official online platforms of both co-organisers, and both co-organisers will run social media campaigns with a particular focus on Twitter and Facebook.
Report
- What are the key elements of a framework to effectively and efficiently identify and remove hate speech online and ensure that human rights are protected? How can regulation to protect human rights be effective in the global online environment?
- Who is responsible for determining what online content should be removed?
- How can the wide range of stakeholders better work together to combat hate speech online?
Participants agreed on the importance of legal frameworks grounded in fundamental rights for tackling hate speech online. There was broad acknowledgement of the continued challenge of addressing online hate speech while protecting freedom of expression. Participants highlighted that while technical solutions are important, they cannot replace human intervention and oversight.
Some participants suggested that the relevant legal frameworks lack the necessary clarity and specificity, and that existing legal protections are insufficiently enforced. Several highlighted that much of the discussion focuses on legal frameworks and social norms in Europe and North America, and argued for greater attention to and recognition of the situation in the Global South. They tied this to the question of context, and to the difficulties that automated tools designed to identify and remove content have in understanding particular contexts and in detecting problematic videos and images.
- It is the responsibility of states to determine what content is illegal and should be removed.
- There is a need for greater transparency concerning how content is moderated, and how content moderators are trained to identify and remove illegal content.
- Participants highlighted that actions to tackle hate speech online must be complemented by actions to disincentivise and address the perpetuation of hate speech and intolerance in society more broadly.
- Evidence collected by the EU Agency for Fundamental Rights (FRA) captures experiences of online hate speech and harassment.
- The Council of Europe Recommendation on the roles and responsibilities of internet intermediaries calls on states to provide and enforce a human rights and rule of law-based framework which should be complemented by human rights due diligence by companies.
- The proposed French law on combating online hate speech would require illegal hateful content to be removed within 24 hours. Failure to do so would result in a fine of up to 1.2 million EUR.
- The European Commission’s Code of conduct on countering illegal hate speech online is an example of a non-binding initiative to counter online hate speech.
- Article 19 MENA region is raising awareness of the risks for freedom of expression when European models are applied in different contexts without sufficient oversight.
- Google is reviewing its harassment policy to enhance the focus on gender issues.
- There is a need for more awareness-raising among different stakeholders: for users, on their rights and the possibilities for redress; for content moderators; and for judges and legal practitioners, so that they can embed human rights standards in their enforcement work.
- A gendered approach is crucial to address the different ways men and women are victims of hate speech online.
Onsite participation: Around 120 participants, of which half were women.
Online participation: unknown.
Gender and the impact of hate speech online on women were central themes of the discussion. Participants noted that experiences of hate speech online are gendered and that women – including women journalists – are specifically targeted by perpetrators of hate speech online. They highlighted that this requires gendered responses taking into account the different experiences of men and women. In addition, the panel was predominantly female, as were the two co-moderators.