Session
Organizer 1: Marlena Wisniak, ECNL
Organizer 2: Javier Pallero, Access Now
Speaker 1: Javier Pallero, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 2: Babette Ngene, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Amalia Toledo, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 4: Daniel Bekele, Government, African Group
Marlena Wisniak, Civil Society, Western European and Others Group (WEOG)
Javier Pallero, Civil Society, Latin American and Caribbean Group (GRULAC)
Marlena Wisniak, Civil Society, Western European and Others Group (WEOG)
Round Table - U-shape - 60 Min
1. How could social media companies better take local context into consideration when moderating content on their platforms?
2. What are some key challenges in adequately, consistently, and fairly considering local context?
3. What risks and opportunities for content governance could arise from novel models of social media platforms (e.g., decentralized platforms) and of rule design and enforcement?
Connection with previous Messages:
5. Gender Equality
10. Reduced Inequalities
Targets: SDG 5: gender equality – women and gender non-binary persons are disproportionately impacted by social media platforms: they are frequent targets of online abuse and, at the same time, disproportionately silenced. Women with intersecting identity characteristics, including racialized women, women from religious minorities, trans women, queer and non-binary persons, disabled women, girls, and those of lower socio-economic status, face even greater risk of harm. These risks are especially acute for women from the Global South. Content moderation and curation that adequately considers the unique social, political, and cultural contexts in which content is shared, often within patriarchal environments, is urgently needed.
SDG 10: reduced inequalities – content moderation and curation, especially when algorithmically driven, can accelerate and exacerbate existing social and economic inequality. Social media platforms thus have a responsibility to moderate and curate content in a way that promotes, rather than harms, the enjoyment of human rights by marginalized and vulnerable users and broader stakeholders. Whether platforms amplify content, reduce its visibility, or block it entirely can significantly influence inequality globally, both between groups within countries and between regions themselves.
Description:
This session will explore the role of context and localization in the content governance of digital platforms, especially algorithmically driven ones. Beyond language and translation challenges, moderating and curating content in a way that truly prevents harm while enabling human rights for all requires assessing individual content in light of the local social, political, and cultural context in which it is shared. Centering local context and lived experiences in product design, policy development, and enforcement is a key component of moderating hate speech, online harassment, incitement to violence, and mis/disinformation. This is especially important, and yet severely lacking, for stakeholders outside the US and Western Europe, especially racialized persons, women and non-binary persons, migrants and refugees, LGBTQI+ persons, children and the elderly, disabled persons, and those of lower socio-economic status, among others.
Indeed, while leading social media platforms are primarily based in the Global North, their impacts are far-reaching. Nowhere is this more striking than in contexts of conflict, such as Palestine, Ethiopia, Ukraine, and Afghanistan, where platforms have historically fallen short of adequately moderating content. Yet today’s conversations around content moderation and curation, both algorithmic and human, are generally confined to US/Canadian and European borders and values. Substantive change in the way content moderation policies are designed, developed, and enforced around the world is urgently needed to prevent the adverse impacts that digital platforms’ monolithic approach has on the Global South. For this endeavor to succeed, however, representatives from the Global South must not only be included in content governance and enforcement, but must drive it. This begins with identifying local problems and meaningfully participating in developing and implementing solutions.
Speakers will highlight challenges ranging from the inadequate allocation of resources across regions and communities to the difficulty of enforcing policies at scale, especially through automation. Through this interactive and participatory session, attendees will collectively explore opportunities for rights-based content governance. This will include critical thinking about the responsibilities and organizational models of platforms to govern content in the context of emerging technologies, from algorithmic content moderation to decentralized social media platforms.
Key outcomes:
1. Exploration of concrete strategies for incorporating local knowledge into content governance, in order to address emerging contextual issues at global and local levels.
2. Strengthened collaboration among the private sector, academia, and civil society, focused on addressing nuanced issues surrounding context and harmful content, to help cultivate multistakeholder dialogue and improved approaches to global content governance online.
The session will provide a safe space for activists, researchers, private sector workers, and community representatives to discuss the challenges of taking local context (especially social and political context) into consideration when moderating content on social media platforms. Importantly, attendees will explore pathways for addressing these challenges. We aim to include a diverse range of voices, centering participants from the Global South, as well as those working directly on these issues, such as representatives from social media platforms. It is equally important to recognize who is missing from the room and how we can include their voices in future conversations.
This workshop is designed to enable future discussion. The session will end with an invitation to discuss next steps in a follow-up conversation. If there is interest, the organizers would consider expanding this conversation and launching a broader initiative focused on locally grounded content governance, including an exploration of decentralized platforms.
To maximize outreach, the organizers will later draft a summary of the session and make it publicly available. Stakeholders working in this space (including technology developers) can thus learn from the shared perspectives and incorporate civil society considerations into their research and/or products.
Hybrid Format: The session will be structured in three parts. First, the invited speakers will briefly present key challenges and opportunities in content governance today that stem from the failure to adequately consider local context, especially in countries outside the US and Western Europe. Second, participants will be invited to share their thoughts and reflections through an open (but guided) conversation. Open discussion will be available both to attendees participating remotely and to those attending in person, and the organizers will facilitate both in-person and online breakout groups. Third, the organizers will provide a high-level overview of the discussion, along with open questions and ideas for future work based on the group conversation.
Usage of IGF Official Tool.