IGF 2025 WS #311 Scaling normative social media & communication technologies

    Organizer 1: Civil Society, Western European and Others Group (WEOG)
    Organizer 2: Civil Society, Western European and Others Group (WEOG)
    Organizer 3: Civil Society, Western European and Others Group (WEOG)
    Organizer 4: Civil Society, Western European and Others Group (WEOG)
    Organizer 5: Civil Society, Western European and Others Group (WEOG)
    Organizer 6: Intergovernmental Organization, Intergovernmental Organization
    Organizer 7: Civil Society, Latin American and Caribbean Group (GRULAC)
    Speaker 1: Angela Oduor, Civil Society, African Group
    Speaker 2: Hlatky Felix, Technical Community, Western European and Others Group (WEOG)
    Speaker 3: Maffulli Stefano, Civil Society, Western European and Others Group (WEOG)
    Speaker 4: Afrooz Kaviani Johnson, Intergovernmental Organization, Intergovernmental Organization
    Format
    Roundtable
    Duration (minutes): 90
    Format description: As the IGF 2025 themes highlight, growing global concerns over AI-driven misinformation, election interference, and youth exploitation on social media will engage a large, diverse audience, making the roundtable format essential for broad, meaningful participation. The relevance of this theme is also likely to attract many in-person IGF attendees who could fill in should any of the proposed speakers be unable to attend. The 90-minute roundtable allows experts, youth representatives, and policymakers to engage in dynamic, moderated discussions rather than one-way presentations, ensuring equal participation. Onsite and online attendees will contribute via real-time interactive tools and live polls. With three interconnected parts (problem framing, reimagining solutions, and showcasing alternatives), the session combines brief speaker presentations with prepared comments, group discussions, and polls to maximize engagement. This format ensures deep discussion without feeling rushed, providing space for breakout interactions, audience participation, and Q&A. Our goal is an inclusive, insight-rich session that balances expertise with collective brainstorming to drive actionable solutions.
    Policy Question(s)
    A) What do we know about how social media is affecting the physical and mental wellbeing of children and youth today, and what role is the integration of AI-tools, like chatbots, playing? B) What should characterise social media and communication technologies that operate in the public interest? C) What are some examples of digital public goods and other open source components that provide functionalities and features that are relevant to a normative social media landscape?
    What will participants gain from attending this session? The session will a) present and discuss evidence on the various interlinked ways in which today’s big social media platforms are harming individuals and societies. This will yield important new knowledge and joint insights, as these concerns are often discussed in topical silos (e.g. youth mental health, radicalization and extremism, consumer exploitation), making it hard to see and address the underlying common drivers. The session will also include emerging insights on how AI is likely to influence this landscape. The most important and novel understanding, which will support continued evolution after the IGF, will however relate to b) (re)imagining what normative, public-interest social media could look like and c) understanding which digital public goods and other open source components could help scale normative social media. The session will d) also highlight possible complementary policy and regulatory steps.
    Description:

    The session will have three parts, each building on the previous one.
    1) The first part will share updated research and evidence of the significant damage and risks that the way today’s main social media platforms operate poses to individuals and societies. Emphasis will be on risks and harms to children and youth and to a free and open public discourse, including manipulation of election processes, commercial exploitation, and radicalisation and extremism. This will include the compounded risks posed by the integration of increasingly powerful AI tools, including generative AI, into how social media platforms operate.
    2) The second part will reset the narrative by inviting a discussion on what good social media could look like (exemplified in relation to children and youth and to a free and open public discourse), including how to define relevant norms that should be operationalised in how these digital technologies work. Youth representatives will take part in person and/or online throughout the workshop to ensure their perspectives are heard.
    3) The final part will highlight the potential for digital public goods and other open source technologies to help scale normative social media, both by providing alternative approaches to today’s big platforms that can be freely adopted and adapted across multiple contexts, and by providing digital tools that can help operationalise the intent of regulations and norms across technologies, thereby helping regulators regain governability. This part will showcase multiple relevant digital public goods and other open source solutions to demonstrate that normative alternatives to today’s dominant platforms already exist, and that it is possible to reimagine and evolve this landscape through norm setting, catalytic public investments, and new sustainability and business models, including public-private partnerships and digital commons approaches.
    Expected Outcomes
    The session discussion will influence research, advocacy, and policy priorities and recommendations on both public-interest social media and AI risks. The session will directly feed into an in-depth follow-up workshop organized by the Digital Public Goods Alliance (DPGA) on the sidelines of IGF 2025, bringing together policy representatives, domain experts, DPG product owners, and funders to explore how to develop, evolve, and scale normative social media technologies. A report, to be published by September 2025, will outline strategies for scaling normative social media, including policy insights and practical steps for implementation. Addressing information pollution is a DPGA priority, and session findings will shape follow-up activities led by DPGA members and Digital Public Goods product owners. The outcomes will drive advocacy and engagement with policymakers, technologists, and civil society to advance normative social media governance models.
    Hybrid Format: Hybrid Moderation: Our online moderator will monitor chat discussions, bring forward key questions, and ensure remote attendees' voices are included in live discussions. Live Polling & Q&A: We will use tools like Slido or Mentimeter to collect real-time audience responses from both onsite and remote participants: word clouds or category rankings of the risks associated with today’s social media, and votes on key public interest parameters that normative social media should incorporate (e.g. enabling a free and open public discourse, protecting children and youth, transparency, consumer protection). Breakout Discussions: We will run online breakout rooms (via Zoom or a similar platform) and in-person small group discussions simultaneously for scenario-based prioritisation exercises. Shared Digital Whiteboard: Miro will allow both onsite and online attendees to collaborate on brainstorming exercises for tech and policy matchmaking discussions.