
IGF 2022 WS #458 Do Diverging Platform Regulations Risk an Open Internet?

    Time
    Thursday, 1st December, 2022 (06:30 UTC) - Thursday, 1st December, 2022 (08:00 UTC)
    Room
    Large Briefing Room

    Organizer 1: Marjorie Buchser, Chatham House
    Organizer 2: Yasmin Afina, Chatham House
    Organizer 3: Jacqueline Rowe, Global Partners Digital
    Organizer 4: Rowan Wilkinson, Chatham House
    Organizer 5: Alex Krasodomski, Chatham House

    Organizer 1: Civil Society, Western European and Others Group (WEOG)
    Organizer 2: Civil Society, Western European and Others Group (WEOG)
    Organizer 3: Civil Society, Western European and Others Group (WEOG)
    Organizer 4: Civil Society, Western European and Others Group (WEOG)
    Organizer 5: Civil Society, Western European and Others Group (WEOG)

    Speaker 1: Jacqueline Rowe, Civil Society, Western European and Others Group (WEOG)
    Speaker 2: Jamila Venturini, Civil Society, Latin American and Caribbean Group (GRULAC)
    Speaker 3: 'Gbenga Sesan, Civil Society, African Group
    Speaker 4: Usama Khilji, Civil Society, Asia-Pacific Group
    Speaker 5: Meg Chang, Private Sector, Asia-Pacific Group
    Speaker 6: Aman Nair, Civil Society, Asia-Pacific Group

    Moderator

    Yasmin Afina, Civil Society, Western European and Others Group (WEOG)

    Online Moderator

    Alex Krasodomski, Civil Society, Western European and Others Group (WEOG)

    Rapporteur

    Alex Krasodomski, Civil Society, Western European and Others Group (WEOG)

    Format

    Panel - Auditorium - 90 Min

    Policy Question(s)
    1. What forms of online platform regulation are emerging in different parts of the world, and in what ways do they diverge?
    2. What risks does policy divergence pose to an open and interoperable Internet, as well as to human rights?
    3. How can these risks be mitigated, and what opportunities are there for encouraging harmonisation and consensus?

    Connection with previous Messages:

    The primary IGF 2021 Message that this session builds upon is the need, identified in 2021, for “harmonization - ensuring that the Internet remains a global, unified platform that enables the exercise of human rights”. The session would build on this message by promoting a better understanding of the elements of platform regulation requiring greater harmonisation, and opportunities to encourage this harmonisation.

    The session also builds on a number of further IGF 2021 Messages which sought to encourage appropriate, human rights-respecting regulatory frameworks relating to digital technologies, including the Messages that “adequate enabling environments (e.g. policies, legislation, institutions) need to be put in place at the national, regional and global levels to foster inclusive, just, safe, resilient and sustainable digital societies and economies” and that “policies implemented by Internet platforms to deal with harmful online content need to be transparent, acknowledge the limits of automated content moderation, and ensure a proper balance with the right to freedom of expression”.

     

    SDGs

    9.c
    16.10
    Targets: The primary link between this session and the SDGs is Target 9.c - “Significantly increase access to information and communications technology and strive to provide universal and affordable access to the Internet in least developed countries by 2020”. While the session does not relate to internet access in a narrow sense, it will explore how to ensure that the internet itself remains open and interoperable, which is an important prerequisite for meaningful internet connectivity and access. Further, by focusing on identifying elements of platform regulation that may restrict human rights, the session also links to Target 16.10 - “Ensure public access to information and protect fundamental freedoms, in accordance with national legislation and international agreements”, and will help policymakers in this space understand potential risks to human rights in domestic platform regulation.

    The last few years have seen a plethora of new laws and proposals which would regulate online platforms and other internet intermediaries. In the absence of any international frameworks or consensus on how to govern intermediaries, many of these laws and proposals diverge widely. This is leading to potential impacts on intermediaries’ ability to operate globally, barriers to entry for new intermediaries in already concentrated markets, and risks to freedom of expression and other human rights.

    This session will:

    1. Provide a stocktaking exercise, examining some of the laws and proposals developed in recent years, particularly in parts of the world to which less attention has been paid so far, such as Latin America, Africa and South Asia;
    2. Present Chatham House and Global Partners Digital’s research on platform regulations and norms, as well as their potential impacts on an open and interoperable internet;
    3. Invite discussion on how to address the risks that policy divergence and fragmentation in this space poses to an open and interoperable internet, including through efforts to promote greater harmonisation and consensus among policymakers.

     

    Expected Outcomes

    The session will present the research undertaken by Chatham House and Global Partners Digital on platform regulations and norms and their potential impacts.

    The session will also provide an opportunity for inputs into a report to be developed by Chatham House in early 2023 assessing divergent forms of platform regulation. More broadly, the session will provide an opportunity for knowledge sharing, with participants able to better understand different forms of platform regulation being developed in different parts of the world.

    At the same time, the session will also encourage identification of, and participation in, forums and processes which promote harmonisation and consensus among policymakers in this space.

     

    We have deliberately requested a 90-minute session in order to ensure the maximum amount of time for discussion among the panellists and with the participants more broadly.

    • For the first 15 minutes, Chatham House will present the joint research on online platform regulations undertaken in partnership with Global Partners Digital.
    • The session will then comprise short presentations of 5-6 minutes from each of the panellists, providing an overview of platform regulation trends in four key regions (Latin America, Europe, Africa, and South Asia).
    • These presentations will be followed by a perspective from an industry angle.
    • To get the conversation flowing, the moderators will then use the next 35 minutes to put questions to the panellists, reflecting on the ways that platform regulations are diverging. The moderators will also facilitate a discussion among the participants themselves, with input from the panellists. To encourage strong online and offline participation, the onsite and online moderators will coordinate, alternating questions to the panellists between those from in-person participants and those from online participants.

    The moderators will play an active role, encouraging questions on specific themes. The moderators will also encourage constructive questioning and discussion on the issue of how to encourage greater harmonisation and consensus-building among policymakers in this space, including through global and regional standards and norms. Where necessary, the moderator will pose questions directly to the panellists on this issue.

     

    Online Participation

    The discussion will be held in hybrid format, facilitated by the IGF's official tools (using Zoom as the primary platform for discussions).

     

     

    Key Takeaways

    The elaboration and enforcement of global standards may pave the way for greater alignment in regulating digital platforms. The feasibility and desirability of such standards, however, remain much contested. The development of standards, and deliberations on platform regulations more generally, must take account of the democratic context of each country and region, as well as their respective political, socio-cultural, legal and historical backgrounds.

    In addition, platform regulations play an important role in shaping the power balance between governments, ‘big tech’ companies, civil society and everyday users of platforms. Greater resources must be dedicated to promoting and facilitating honest and inclusive multi-stakeholder discussions; protecting digital platforms as an open and neutral civic space; and ultimately fostering a healthy digital ecosystem for all.

    Session Report

    Background

    The last few years have seen a plethora of new laws and proposals which would regulate online platforms and other internet intermediaries. In the absence of any international frameworks or consensus on how to govern intermediaries, many of these laws and proposals diverge widely. This is leading to potential impacts on intermediaries’ ability to operate globally, barriers to entry for new intermediaries in already concentrated markets, and risks to freedom of expression and other human rights.

    Chatham House, along with Global Partners Digital, seeks to better understand the regulatory landscape around digital platform governance, and how this varies between regions. To this end, Chatham House convened a workshop at the 2022 Internet Governance Forum, bringing together experts and practitioners from Latin America, Africa, Europe, South Asia and South-East Asia to discuss and better understand the various regulatory approaches, and to extract common themes between them. The discussion sought to communicate and share this understanding widely and highlight areas which could benefit from further investigation and exploration.

    The discussion focused on several questions, including: What forms of online platform regulation are emerging in different parts of the world, and in what ways do they diverge? What is the local democratic context in each region? What does a human rights-based approach to regulation mean, and can it be assessed by reviewing legislation? What risks does policy divergence pose to an open and interoperable internet, as well as to human rights? How can these risks be mitigated? And what opportunities are there for encouraging harmonisation and consensus?

    Key Regulatory Trends Across Regions

    The European region is often perceived as pioneering the regulatory landscape surrounding digital platform governance. Legislation at both the European Union (EU) level (e.g., the Digital Services Act, DSA) and the national level (e.g., Germany’s Network Enforcement Act) often serves as an example shaping regulations in other countries. Within the EU, the adoption of the DSA is hailed as a particular success. It requires, among other things, platforms to have clear terms of service and redress systems in place for users and to publish transparency reports, and each member state to appoint an independent national regulator as Digital Services Coordinator, likely fostering greater collaboration and information sharing across countries. Yet the success of the DSA will depend heavily on implementation and enforcement, particularly in the light of human rights. Furthermore, beyond the EU bloc, concerns arise surrounding ‘outliers’ (in particular Belarus, Russia and Turkey) and their non-alignment with the Act, in addition to vaguely worded restrictions on politicised content types (e.g., content deemed offensive to public morality) and potential criminal sanctions on individual platform employees.

    In Latin America, there is no established regional regulatory body; as a result, legislation varies widely from country to country, and there seems, at the moment, to be no appetite for alignment. Despite this fragmented regulatory landscape, one common approach across countries is to treat major platforms as holding great influence over social discussions; yet user experiences and harms (e.g., misinformation and abuse) are often overlooked. There is thus a pattern in regulatory approaches whereby instruments now focus more on harm on social media platforms than on the bigger picture of internet regulation. With regards to the protection of freedom of expression, the Inter-American human rights system helps safeguard this right across the region.

    Regulatory instruments in Africa have also shifted their focus in recent years: the main concern has moved from ICT access to heavily politicised legislation. The most common approach consists of exercising control over platforms and, by extension, their users, which provides greater power and protection for states and their respective governing regimes. This approach contrasts with standards whose overarching aim is to provide and guarantee protection for all. Such control is reflected, for example, in the growing requirement over the past 18 months for platforms to formally register, which raises risks related to licensing, to these platforms’ accessibility to the people, and to proactive content moderation requirements imposed by states. In addition, non-compliance with human rights norms in the ‘offline’/real world carries over online, as seen, for example, in the prevalence and normalisation of emergency laws and their effect on online platform governance.

    The South Asian regulatory landscape is, at the moment, highly dynamic and evolving quickly. It comprises not only legislation directly governing digital platforms, but also legislation that indirectly affects these platforms, their users’ activities, and the power of states and governments over them. In India, the dominant approach is characterised by a general sense of distrust of non-Indian platforms, with greater protection provided for national platforms for fear of external influence on the civic space. Echoing, to a certain extent, the approach adopted in African countries, two draft laws were flagged as raising questions about the government’s power and control over platforms: the Indian Telecommunication Bill, which establishes a licensing requirement and thus raises questions over an open and free internet; and the Digital Personal Data Protection Bill, which would expand the government’s surveillance powers. Pakistan’s regulatory landscape is also heavily focused on control, given that digital platform governance is framed around criminal law, with a particular focus on exercising control over dissidents. Concerns also arise over the mandate conferred on regulators to interpret constitutional provisions, which can lead them to overstep the role of judges. Nevertheless, in both countries, multistakeholder advocacy efforts to preserve human rights and an open, free internet carry weight and influence over the regulatory landscape.

    Commonalities and Question Marks

    1. In the absence of a supra-national regulatory body (akin to the European Union), the alignment and eventual harmonisation of regulations governing digital platforms within a region remains a challenge. Whether such harmonisation is desirable at all is, however, debatable: in the light of countries’ respective priorities, legislative landscapes and regulators’ varying mandates, the adoption of global standards that work for all constitutes a challenge. This fragmented landscape makes it difficult for digital platforms to navigate different, and sometimes competing, regulatory instruments across countries, especially with regards to enforcement and implementation.
    2. Concerns arise surrounding the growing number of regulatory tools that place the responsibility to moderate and respond to online content on the platforms themselves (in contrast with an independent regulatory body), oftentimes threatening, if not ‘hostage-taking’, platform employees with the risk of individual criminal liability, while paving the way for deteriorating compliance with human rights norms.
    3. Digital platforms remain, at times, the last civic space available and accessible to all, particularly in the light of licensing requirements and other restrictions on other forms of media (e.g., radio, television broadcasting, etc.); thus, in certain countries, these platforms ought to be maintained as the ‘last fortress’ enabling open, democratic and participatory civic engagement.

    Risk Mitigation & Solutions

    1. Discussions and deliberations surrounding the regulation of digital platforms, as well as the eventual establishment of international standards and other soft law, must be inclusive and multistakeholder in nature. There is a particular desire for governments to demonstrate greater political will in engaging and including civil society in shaping the regulatory landscape surrounding digital platforms.
    2. Stakeholders with significant resources must facilitate and pave the way for inclusive and multistakeholder discussions and fora, and leverage these resources to improve the general understanding, across stakeholders, of the dynamics and trends surrounding platform regulation.
    3. Governance deliberations and analyses must take into account local democratic contexts. These include, for example, local laws and customs, socio-political realities on the ground, human rights approaches, as well as the power relationships between the state and the people. 
    4. There is a need for digital platforms, in particular those with a significant presence among the population (e.g., Meta), to acknowledge the important role and influence they have; exercise responsibility in their approach to content moderation while preserving and safeguarding human rights norms; and exercise greater equity in the way they engage with users across regions.