
IGF 2019 WS #218 Deliberating Governance Approaches to Disinformation

    Organizer 1: Eileen Donahoe, Global Digital Policy Incubator, Stanford University
    Organizer 2: James Fishkin, Stanford Center for Deliberative Democracy
    Organizer 3: Max Senges, Google
    Organizer 4: Jan Rydzak, Stanford Global Digital Policy Incubator
    Organizer 5: Alice Siu, Stanford Center for Deliberative Democracy
    Organizer 6: John Weitzmann
    Organizer 7: Malavika Jayaram, Digital Asia Hub

    Speaker 1: Jaclyn Kerr, Government, Western European and Others Group (WEOG)
    Speaker 2: Jan Rydzak, Civil Society, Western European and Others Group (WEOG)
    Speaker 3: Vidushi Marda, Civil Society, Asia-Pacific Group
    Speaker 4: Sabine Frank, Private Sector, Western European and Others Group (WEOG)
    Speaker 5: Moses Karanja, Technical Community, African Group

    Moderator

    Max Senges, Private Sector, Western European and Others Group (WEOG)

    Online Moderator

    Alice Siu, Civil Society, Western European and Others Group (WEOG)

    Rapporteur

    James Fishkin, Civil Society, Western European and Others Group (WEOG)

    Format

    Round Table - U-shape - 90 Min

    Policy Question(s)

    - Evaluation of governance approaches: What are the trade-offs of various recent European policy instruments that address disinformation, especially with regard to preserving the balance between freedom of expression and the quality of discourse necessary to sustain democratic governance?
    - Multi-stakeholder contribution: What specific opportunities can focused multi-stakeholder assessment of existing policies open in the process of creating future policy instruments and regulatory models on disinformation?
    - Improving collaboration and standard-setting: What role should different stakeholder groups - including private sector Internet platforms, governments, and civil society actors - play in defining the standards for acceptable content in light of the dual need for freedom of expression and protection against the harmful effects of online content? How can globally accepted standards be developed? How can policy developments on the subject in different geographic regions inform each other? What unexplored forms of collaboration would help in fighting disinformation and ‘fake news’?

    SDGs

    GOAL 4: Quality Education
    GOAL 9: Industry, Innovation and Infrastructure
    GOAL 16: Peace, Justice and Strong Institutions
    GOAL 17: Partnerships for the Goals

    Description: This session will use innovations in the deliberative method to assess the strengths, shortcomings, and effects of three policy instruments that address disinformation and content moderation at scale in the European Union. It seeks to compare these approaches using a methodology that relies on objective ground truths and a series of deliberations conducted prior to the IGF. Participants will identify cross-regional points of confluence with regulatory and other actions being undertaken outside Europe and develop best practices that cut across geographies. Ultimately, the session will help develop informed solutions that maximize the scope for freedom of expression and democratic discourse while mitigating the harmful consequences of disinformation in online spaces.

    The conversation will center on France’s Law Against Manipulation of Information (2018), the UK Government’s ‘Online Harms’ White Paper and its accompanying proposals (2019), and the EU Code of Practice on Disinformation (2018). These three policy instruments represent distinct regulatory and self-regulatory approaches to content moderation and to the proliferation of disinformation online and offline.

    In the months leading up to the session, two or three small-group deliberations will take place online, using specially commissioned Balanced Briefing Materials and an automated smart moderator tool designed and tested at Stanford. These exercises will follow the deliberative method. The materials, produced in the preparation phase, weigh the trade-offs between policy options for governing disinformation. The deliberations will include IGF participants as well as other Internet governance stakeholders. Before the deliberation, we will survey our sample of stakeholders with questions on the policy instruments. Those who take part will be re-polled immediately afterwards; their changes of opinion represent the conclusions the public might reach if it had the opportunity to deliberate through an informed, fact-based process. We expect to demonstrate that debates grounded in shared ground-truth briefing materials provide a basis for informed decision-making in content governance and, more broadly, in freedom of expression online.

    Building on these online deliberations, the session at the IGF will be structured as follows:

    (1) Introduction and overview of the deliberative method (10 min.): The research team will open with an overview of the briefing materials, the rules governing the deliberation, and the performance of the automated moderator tool. Members of the research team will also briefly discuss findings and lessons from the deliberation on NetzDG at IGF Deutschland in 2018 and encourage participants to review the briefing materials for that session separately.

    (2) Expert discussion of deliberation results (60 min.): The organizers will present a snapshot of the results of the deliberative polls, focusing on reported changes in participants’ positions on individual components of each instrument and in their level of knowledge following the deliberation. Invited experts familiar with the three instruments and their links to policies being developed in other regions will assess the deliberations’ contributions to outlining best practices for addressing disinformation.

    (3) Debrief and Q&A (20 min.): The organizers will summarize the session, announce next steps, offer a brief preliminary assessment of the applicability of the deliberative method to the global discussion on disinformation, and leave space for additional questions.

    The session builds on numerous successful applications of the deliberative method, including the Deliberative Poll on the European Union (2009), a pilot deliberation on multi-stakeholder collaboration for extending Internet access to the next billion users (IGF 2015), a deliberation on encryption (IGF 2016), and the recent IGF Deutschland (2018), at which participants debated the German ‘NetzDG’ law using the same methodology.
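
    As a purely illustrative aid, the minimal sketch below shows one way the pre- and post-deliberation polls described above could be scored. The participant IDs, question labels, 0-10 agreement scale, and the opinion_shift helper are hypothetical assumptions for this example, not part of the actual survey instrument or analysis tooling.

        # Minimal illustrative sketch (not the organizers' actual tooling):
        # compute the mean opinion shift per survey question from hypothetical
        # pre- and post-deliberation responses on a 0-10 agreement scale.
        from statistics import mean

        # Hypothetical responses keyed by participant ID; question labels are invented.
        pre_poll = {
            "p1": {"support_fr_law": 4, "support_eu_code": 6},
            "p2": {"support_fr_law": 7, "support_eu_code": 5},
        }
        post_poll = {
            "p1": {"support_fr_law": 6, "support_eu_code": 7},
            "p2": {"support_fr_law": 6, "support_eu_code": 7},
        }

        def opinion_shift(pre, post):
            """Mean change per question for participants polled both before and after."""
            shared = pre.keys() & post.keys()
            questions = next(iter(pre.values())).keys()
            return {q: mean(post[p][q] - pre[p][q] for p in shared) for q in questions}

        print(opinion_shift(pre_poll, post_poll))
        # -> {'support_fr_law': 0.5, 'support_eu_code': 1.5}

    A knowledge-gain score could be derived the same way by substituting, for each participant, the share of factual questions answered correctly before and after the deliberation.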

    Expected Outcomes: The deliberative method is geared toward producing practical outcomes. Past Deliberative Polling exercises provide strong evidence of significant, measurable knowledge gains and changes in opinion among participants. We expect the same for this workshop, as not all participants will be conversant in all three policies at the outset. The workshop will produce the following outputs: (1) polling results measuring changes in participants’ levels of knowledge and preferences, (2) a set of Balanced Briefing Materials with multiple uses outside the deliberative process (e.g., comparative analysis of policy instruments), and (3) a report on the findings.

    The workshop will also lay a foundation for further deliberative exercises on the development of the three policies. The results will form the basis for advisory opinions on these policy instruments, which will play a direct role in defining best practices for future legislation, particularly where legislative proposals have yet to be formulated. In the broadest sense, the workshop will showcase the utility of a novel methodology for informed discussion and analysis of laws that govern online content. The method helps counter misinformation about existing and proposed policy instruments, guards against cognitive barriers that could marginalize or exclude individuals, and supports reasoned decision-making.

    All three instruments discussed were published in 2018-19; this session would be their first comparative multi-stakeholder assessment. This distinguishes the exercise both from sessions that tackle disinformation as a broad issue without explicitly addressing policy instruments and from those that analyze a single instrument in isolation. We are happy to collaborate with other workshop organizers in the same field to ensure that our session is complementary and to drive collaboration in this space beyond the IGF.

    Interaction is critical to the success of the deliberative process. The IGF session will be actively moderated to ensure feedback not only from European actors but also from stakeholders in other countries who are interested in the cross-national diffusion of policy solutions on disinformation. The pre-IGF small-group deliberations will be guided by moderators well versed in the method, who will prioritize giving every participant a chance to express themselves. This will drive the changes in levels of consensus and knowledge that the project seeks to measure. The pre- and post-event polls are designed to maximize inclusion and useful feedback.

    Relevance to Theme: The session responds to the theme’s focus on defining overarching standards for acceptable content, within the sub-strand on disinformation. It uses a collaborative, multi-stakeholder framework with a proven record of influencing policy to assess the impact of, and the similarities and differences among, three policy instruments that were all launched or proposed in the European Union yet are globally relevant. This highlights the diversity of frameworks and categories of instruments (e.g., self-regulation, legislation, advisory reports, public-private partnerships) that have been applied to disinformation, leading to a more detailed and informed international conversation. The approach creates a basis for comparison and further collaborative work on the issue across various domains of expertise, including academia, civil society, the private sector, and other communities.

    Relevance to Internet Governance: The session will drive the cross-regional search for additional stakeholders (policymakers, technologists, netizens) who should be included in the debate on disinformation, and help define their roles. We will base the discussion on a research report that analyses the challenges and options in this field, capture the discussion at the IGF workshop, and use it to improve the briefing materials so that they serve the concerned stakeholders better. All participating stakeholders, including government actors, will come out of the exercise with a deeper understanding of the possible regulatory, advisory, and self-regulatory instruments that address disinformation, which will inform future regulation in this space. The session’s emphasis on cross-regional insight broadens its relevance to Internet governance writ large rather than to a single geographic region. Balanced Briefing Materials and deliberative methods offer a way both to build a shared understanding of the challenges and options on Internet governance issues and to measure the impact of individual discussions, allowing them to contribute meaningfully to deliberations about Internet governance solutions.

    Online Participation

    A number of individuals who will participate in the online deliberation prior to the IGF might be absent from the event itself. The online participation tool will enable us to receive their input and to collect feedback from those who are unable to take part in either the online deliberation or the on-site session. The online moderator will carefully monitor the queue so that all participants with remarks are heard. We are also ready to devote a sub-block of the session to comments from the online platform if they are numerous.