IGF 2022 WS #406 Meaningful platform transparency in the Global South

Time
Thursday, 1st December, 2022 (12:35 UTC) - Thursday, 1st December, 2022 (13:35 UTC)
Room
Banquet Hall B

Organizer 1: Aishwarya Giridhar, Centre for Communication Governance, NLU-Delhi
Organizer 2: Nidhi Singh, Centre for Communication Governance
Organizer 3: Shashank Mohan, Centre for Communication Governance
Organizer 4: Joanne D'Cunha, Centre for Communication Governance

Speaker 1: Jhalak Mrignayani Kakkar, Civil Society, Asia-Pacific Group
Speaker 2: Chris Sheehy, Private Sector, Western European and Others Group (WEOG)
Speaker 3: Emma Llanso, Civil Society, Western European and Others Group (WEOG)
Speaker 4: Fernanda Martins, Civil Society, Latin American and Caribbean Group (GRULAC)

Moderator

Aishwarya Giridhar, Civil Society, Asia-Pacific Group

Online Moderator

Shashank Mohan, Civil Society, Asia-Pacific Group

Rapporteur

Joanne D'Cunha, Civil Society, Asia-Pacific Group

Format

Round Table - U-shape - 60 Min

Policy Question(s)

What does it mean to have meaningful platform transparency, and how does this change based on the type of information disclosed and the intended recipient of the information?
What does accountability mean for digital social media platforms, and what is the role of transparency in ensuring accountability?
What are the most important factors to consider in framing regulatory interventions relating to platform transparency, particularly in developing countries?

Connection with previous Messages: Our session will build on the following messages:

“Policies implemented by Internet platforms to deal with harmful online content need to be transparent, acknowledge the limits of automated content moderation, and ensure a proper balance with the right to freedom of expression.”

“The complex interplay between the market and society is being reshaped by online platforms. Online platforms continue to gain power in the digital world, generating high impact throughout the globe, especially in the Global South. There is no one-size fits all approach as impacts may be positive or negative, depending on the local reality.”

The session seeks to engage with core components of the messages described above. We plan to explore the implementation of platform transparency measures, particularly in terms of regulatory interventions for Global South countries and contexts. We will also explore the role of various stakeholders in this process.

SDGs

16.10
16.8

Targets: The session will focus on the role that platforms, especially large social media intermediaries, play in ensuring public access to information and protecting fundamental rights such as expression and privacy (16.10). We will also explore the role of regulatory interventions in ensuring inclusive, participatory and representative decision-making in this context. Our session will focus on interventions tailored to the Global South and developing countries, which tend to be under-represented in international fora (16.8).

Description:

Transparency is currently a core focus area for regulators and other stakeholders in the context of the digital economy, particularly vis-à-vis large social media companies. This arises from the understanding that although platforms provide varied benefits, they have also been linked to a range of harms, such as mis/disinformation, increased polarisation, harassment, discrimination and abuse, erosion of privacy, and anti-competitive behaviour. Although advanced digital technologies and algorithms are at the core of platform functioning, there is very little external understanding of how platforms operate and how their incentives drive behaviours and outcomes. This makes it difficult to address the challenges posed by advanced technologies and to effectively harness the advantages they offer.

Though transparency is a necessary first step in addressing harms arising from platforms, it is becoming increasingly clear that operationalising transparency can be challenging. For instance, many stakeholders have highlighted the difficulty of making sense of the information platforms provide in their transparency reporting and of drawing meaningful conclusions from it. Moreover, the general understanding, especially in a regulatory context, seems to be that transparency will lead to increased accountability. However, this requires clarity on what accountability means in the relevant context, what information is provided, and to whom it is provided. A slew of transparency measures is currently being proposed by regulators, from reporting requirements on content moderation to the creation of ad libraries and increased calls for researcher access to platform data. Many of these conversations are taking place in the European Union, the United States, and other parts of the Global North; however, countries in the Global South are also contemplating regulatory interventions in this space.

In this session, we seek to explore what meaningful transparency and accountability mean in the context of large social media platforms, the relationship between the two, and the stakeholders that transparency measures are meant to serve. Though our discussion will draw from global conversations on these issues, we hope to identify core considerations for regulators and other stakeholders in the Global South to prioritise as they begin to frame regulations in this space. In particular, we will explore how different stakeholders, such as governments, tech companies, and civil society, must work together to address the harms identified in the context of social media platforms.

Expected Outcomes

The Centre for Communication Governance (CCG), together with the Global Network Initiative (GNI) and other civil society actors, has established an Action Coalition on Meaningful Transparency (ACT) to identify and build collaboration across various streams of digital transparency work, as well as to increase the diversity of perspectives considered in those efforts. Over the past year, CCG has also been working on a report on algorithmic transparency and accountability, which explores some of the challenges of operationalising transparency in the context of social media and the larger role that transparency measures are meant to play in addressing platform harms. More generally, as the Indian government seeks to overhaul its information technology law, CCG will draw on insights from this session to continue publishing work that supports the development of relevant legislation and policy. CCG and GNI also intend for this session to contribute to our efforts to build a community of researchers and other stakeholders working on platform governance, particularly in the Global South, who can engage with one another. Learnings from the session will be published as blog posts and on social media, and will inform our future research agenda, particularly as it relates to platform governance.

Hybrid Format: Our session plan is flexible and designed to allow for robust engagement and participation. To encourage interaction, we plan to leave significant time for questions and comments from participants once the panelists have made their initial points, and will explore specific areas in which participants indicate interest. Participants will be able to share comments and questions in multiple ways: by speaking online or in person, by typing into the live chat on the Official Online Participation Platform, and through a public Google document with prompts that we will circulate. We will encourage in-person participants to also use these tools so that they can engage with those joining online. In previous sessions we have conducted, we have found Google Docs to be the most bandwidth-friendly, simple, and easy-to-use tool for participants to flag questions and comments. Our on-site and online moderators will work with others from CCG and GNI to keep track of participants' questions and comments, flag them to the speakers, and highlight emerging themes. In addition to active and engaged moderators, we plan to use polls to encourage participation and information sharing.

Online Participation


Usage of IGF Official Tool.