IGF 2023 Open Forum #95 Public-Private Partnerships in Online Content Moderation

Monday, 9th October, 2023 (06:15 UTC) - Monday, 9th October, 2023 (07:15 UTC)
WS 3 – Annex Hall 2


Break-out Group Discussions - 60 Min


As the digitalization of society advances, an increasing number of individuals encounter illegal content with personal ramifications. Threats such as doxing, extortion, revenge porn, and cyberbullying disproportionately affect women, politicians, and journalists, posing challenges to online inclusivity. Effective online content moderation plays a crucial role in ensuring an inclusive digital environment. In the Netherlands, we are encouraging collaboration between the internet industry and governments, as it is becoming increasingly clear that a multistakeholder approach is essential for shaping the future of online content moderation. During this session we will explain our approach and give participants the opportunity to engage in a dialogue game, immersing themselves in the role of one of the stakeholders.

1) How will you facilitate interaction between onsite and online speakers and attendees?
Interaction between onsite and online speakers and attendees will be encouraged by using Mentimeter during the plenary part. Afterwards, attendees will be divided into break-out groups, in which they will participate in a dialogue game representing the diverse stakeholders involved in online content moderation.

2) How will you design the session to ensure the best possible experience for online and onsite participants?
While the plenary part of the session will bring together both online and onsite participants, the break-out phase will enable online participants to join a dedicated online break-out group, while onsite participants play the dialogue game in person.

3) Please note any complementary online tools/platforms you plan to use to increase participation and interaction during the session.
Mentimeter will be used to engage participants. In addition, participants will receive background information and arguments they can use to get into their role as a stakeholder. The different roles will receive diverging background information and should embrace their own priorities as explained to them.


Dutch Ministry of Justice and Security
Eleonora van Hoorn, Dutch Ministry of Justice and Security
Daan Quaijtaal, Dutch Ministry of Justice and Security
Bastiaan Winkel, Dutch Ministry of Justice and Security
Michiel Steltman, Electronic Commerce Platform Nederland (ECP) / Digital Infrastructure Netherlands Foundation (DINL), WEOG
Dorijn Boogaard, Electronic Commerce Platform Nederland, WEOG


Michiel Steltman (Electronic Commerce Platform Nederland (ECP) / Digital Infrastructure Netherlands Foundation (DINL))
Eleonora van Hoorn (Dutch Ministry of Justice and Security)
Daan Quaijtaal (Dutch Ministry of Justice and Security)

Onsite Moderator

Marjolijn Bonthuis (Electronic Commerce Platform Nederland)

Online Moderator

Bastiaan Winkel (Dutch Ministry of Justice and Security)


Dorijn Boogaard (Electronic Commerce Platform Nederland)



Targets: A multistakeholder approach to online content moderation aligns with several relevant Sustainable Development Goals (SDGs), contributing to the overall theme of creating a sustainable and inclusive digital ecosystem. Below, the importance of a multistakeholder approach to online content moderation is explained in relation to SDG 3 (Good Health and Wellbeing), SDG 5 (Gender Equality), SDG 10 (Reduced Inequalities), SDG 16 (Peace, Justice and Strong Institutions) and SDG 17 (Partnerships for the Goals).

SDG 3: Good Health and Wellbeing
First of all, this proposed session links to SDG 3, 'Good Health and Wellbeing', as online illegal content can directly affect people's well-being. In an increasingly digitized world, online illegal content that may harm people's mental health or well-being should be dealt with effectively. Public-private partnerships are crucial in tackling this challenge.

SDG 5: Gender Equality
The session also relates to SDG 5, 'Gender Equality', as online content moderation has a direct impact on promoting gender equality by addressing online harassment, hate speech, and discrimination. A multistakeholder approach ensures that the voices and perspectives of marginalized communities, including women, are considered in content moderation policies, fostering a more inclusive and equal digital space.

SDG 10: Reduced Inequalities
Additionally, a multistakeholder approach to online content moderation enables a broader representation of diverse interests, including those of marginalized communities, ethnic minorities, and underrepresented groups. By taking multiple perspectives into account, online spaces can become safer for all.

SDG 16: Peace, Justice and Strong Institutions
A multistakeholder approach to online content moderation promotes the principles of justice, inclusivity, and accountability, and is therefore linked to SDG 16. By involving diverse stakeholders, including technology companies, civil society organizations, and government entities, in decision-making processes, policies can become more effective. Target 16.10, focusing on access to information, is especially relevant to online content moderation. By involving multiple stakeholders, the balance between moderating harmful or false content and preserving freedom of expression can be discussed, facilitating access to diverse viewpoints and promoting informed decision-making.

SDG 17: Partnerships for the Goals
A multistakeholder approach to online content moderation also relates to the goal of 'Partnerships for the Goals' (SDG 17). Through this approach, we encourage public-private partnerships to build on experience and share knowledge, expertise, technology and financial resources.