
IGF 2021 Open Forum #4 Free expression and digitalisation: compatibility mode

    Time
    Tuesday, 7th December, 2021 (16:30 UTC) - Tuesday, 7th December, 2021 (17:30 UTC)
    Room
    Conference Room 7
    Issue(s)

    Regulation, competition and innovation: How could regulatory and self-regulatory frameworks help foster more competitive Internet-related markets, a larger diversity of business models, and more innovation? How to enable equitable access to data, marketplaces or infrastructures for fostering competition and innovation on the Internet?
    Content moderation and human rights compliance: How to ensure that government regulation, self-regulation and co-regulation approaches to content moderation are compliant with human rights frameworks, are transparent and accountable, and enable a safe, united and inclusive Internet?

    Panel - Auditorium - 60 Min

    Description

    Freedom of expression, as protected by Article 10 of the European Convention on Human Rights, is not only a fundamental individual right. It is also a means to protect and enhance democracy through open and public debate. Digital technologies have the potential to uphold and support freedom of expression, and must do so. They facilitate communication and ensure users’ access to relevant information, enabling them to make informed choices and participate actively in democratic processes. Smart (including AI-based) tools are also used by media outlets to research, create and distribute news.

    At the same time, the impacts of digitalisation on freedom of expression are not only positive. Communication between individuals is increasingly shaped by digital platforms that rank speech for profit – favouring sensationalist and often hateful exchanges, affecting freedom of expression among both the general public and vulnerable groups, and ultimately silencing many. In response to the prevalence of illegal and problematic content, new forms of interference with freedom of expression arise – the blocking, filtering, removal, demotion or demonetisation of online content – often performed through automated processes that do not necessarily satisfy the requirements of legality, legitimacy and proportionality. While major digital platforms have become influential new media actors, their editorial responsibilities remain unclarified, for instance with respect to required levels of transparency. At the same time, their dominance over traditional media actors, facilitated also by often unequal access to technology and data, creates important structural shifts within media markets. The wider societal, economic and political impacts of implementing digital platforms’ content-related policies and terms of service – which rely largely on digital technologies and essentially affect freedom of expression – remain largely unexplored.

    How can we ensure that further integration of digital technologies into public communication spaces and the media occurs in full respect of human rights, notably freedom of expression? What principles should be adhered to so that digital technologies serve rather than curtail that freedom? Can meaningful transparency help develop evidence-based and efficient regulatory measures to ensure that the use of digital technologies supports freedom of expression and democracy more generally? And if yes, what levels of transparency, with data available to which actors, and on which conditions, could create a “compatibility mode” for freedom of expression to flourish in the online environment?

    The discussion will rely, among others, on:

    - the Guidance Note on best practices towards effective legal and procedural frameworks for self-regulatory and co-regulatory mechanisms of content moderation (https://rm.coe.int/content-moderation-en/1680a2cc18)

    - the draft recommendation of the Council of Europe Committee of Ministers to member States on the impacts of digital technologies on freedom of expression (https://rm.coe.int/msi-dig-2020-05-draft-recommendation-on-the-impact-o…)

    Organizers

    Council of Europe

    Speakers
    • Ms Natali Helberger, University of Amsterdam - Chair of the Council of Europe Expert Committee on Freedom of Expression and Digital Technologies
    • Ms Kathleen Stewart, Public Policy Manager | Content Regulation (EMEA), Meta
    • Mr Yoichi Iida, Deputy Director General for G7 and G20 Relations, Ministry of Internal Affairs and Communications of Japan
    • Mr Matthias C. Kettemann, Associated Researcher, Head of Research Programme “Regulatory Structures and the Emergence of Rules in Online Spaces”, Leibniz Institute for Media Research │ Hans-Bredow-Institut (HBI)
    Onsite Moderator

    n/a

    Online Moderator

    Eliska Pirkova, Europe Policy Analyst and Global Freedom of Expression Lead, Access Now

    Rapporteur

    Rodica Ciochina, Elena Dodonova, Council of Europe

    SDGs

    5. Gender Equality
    9. Industry, Innovation and Infrastructure
    10. Reduced Inequalities

    Targets: The discussion will, among other things, address the impact of major digital platforms’ policies on vulnerable groups and will stress the importance of enhancing research into such impacts.

    Key Takeaways

    The quality of regulation is about its capacity to provide efficient solutions that, among other things, respect, protect and promote human rights, in particular freedom of expression. Freedom of expression in the digital environment is not just a matter of content moderation but a structural problem. Key solutions include a focus on processes and meaningful transparency.

    Regulation should be focused primarily on the processes through which internet intermediaries rank, moderate and remove content, rather than on the content itself. Transparency, access to platform-held data, and a re-alignment of the interests of states, researchers and platforms are among the conditions for evidence-based policymaking (and for the independent research underpinning it).

    Session Report

    Only a few years ago, the discussion focused on whether or not to regulate internet intermediaries’ activities in the media and information environment. Today, in answer, we see a range of regulatory attempts at both the national and international levels. What is at issue, however, is the quality of regulation and its capacity to provide efficient solutions that, among other things, respect, protect and promote human rights, in particular freedom of expression.

    As a follow-up to its earlier work on this subject, the Council of Europe developed the Guidance Note on best practices towards effective legal and procedural frameworks for self-regulatory and co-regulatory mechanisms of content moderation (adopted by the Steering Committee on Media and Information Society (CDMSI) in May 2021) and the draft recommendation on the impacts of digital technologies on freedom of expression (currently pending adoption by the Council of Europe Committee of Ministers).

    Both documents start from the premise that upholding freedom of expression in the digital environment is not just a matter of content moderation but rather a structural problem. Among the key solutions proposed are a focus on processes (beyond the binary choice of whether or not to take content down) and meaningful transparency.

    In the session, speakers from different stakeholder groups (government, academia, business and NGOs) discussed this approach, adding insights from their specific perspectives.

    There was agreement that, as an alternative to a fragmented regulatory response, the system as a whole should be put under review. To date, both in Europe and beyond (e.g. in Japan), discussions continue about the types of content that should be considered legal or illegal, or legal but problematic in different ways (e.g. harmful for certain categories of users). As values can often be contextual, regulation should focus primarily on the processes through which internet intermediaries rank, moderate and remove content, rather than on the content itself.

    Some speakers argued that an overarching framework for the regulation and oversight of digital platforms’ engagement in the media and information environment – ideally at the global level – would be necessary, as is the case in a number of other industries. At the same time, strong doubts were expressed about extending the rules governing legacy media to digital platforms. Instead of approaching platform regulation in isolation, it would be important to look at how platforms fit into a broader ecosystem of actors that follow different rules.

    As common ground on this question, the panel agreed that standardisation of approaches was important and urgently needed. In this sense, the Council of Europe’s standard-setting documents provide an important reference point for the alignment of practices across Europe and beyond. The panel also widely acknowledged their consistency with the EU’s relevant work.

    Beyond aligned approaches, the efficiency of regulation rests on the completeness and adequacy of factual information about the individual and societal impacts of digital technologies on human rights, particularly freedom of expression, across different social, political, legal and cultural contexts. At the session, meaningful transparency was cited as a precondition for evidence-based policymaking (and for the independent research underpinning it).

    It is crucial that researchers be given due access to platform-held data, as freely available data offers only a limited view and may not invite the right questions. Access to substantial and complete data would allow in-depth research on issues that are clearly problematic but so far little explored (such as disinformation or the impact of social media on children) and would allow researchers to propose demonstrably well-founded answers to policymakers. A re-alignment of the interests of states, researchers and platforms was proposed as a way forward, based on the provisions laid down in the Council of Europe’s new standards and the EU’s relevant documents.

    The speakers particularly stressed that meaningful transparency requirements should be tailored and aimed at efficient oversight and user empowerment, including by offering users real choices (in contrast to providing numbers for numbers’ sake). Most importantly, information gathered through transparency measures should then be translated into action. Access to data is just a first step: researchers must have the conditions to fulfil their role and to reach policymakers and platforms, so that research can be turned into efficient policies devising a compatibility mode for freedom of expression and digital technologies.