IGF 2022 WS #406 Meaningful platform transparency in the Global South

    Time
    Thursday, 1st December, 2022 (12:35 UTC) - Thursday, 1st December, 2022 (13:35 UTC)
    Room
    Banquet Hall B

    Organizer 1: Aishwarya Giridhar, Centre for Communication Governance, NLU-Delhi
    Organizer 2: Nidhi Singh, Centre for Communication Governance
    Organizer 3: Shashank Mohan, Centre for Communication Governance
    Organizer 4: Joanne D'Cunha, Centre for Communication Governance

    Speaker 1: Jhalak Mrignayani Kakkar, Civil Society, Asia-Pacific Group
    Speaker 2: Chris Sheehy, Private Sector, Western European and Others Group (WEOG)
    Speaker 3: Emma Llanso, Civil Society, Western European and Others Group (WEOG)
    Speaker 4: Fernanda Martins, Civil Society, Latin American and Caribbean Group (GRULAC)

    Moderator

    Aishwarya Giridhar, Civil Society, Asia-Pacific Group

    Online Moderator

    Shashank Mohan, Civil Society, Asia-Pacific Group

    Rapporteur

    Joanne D'Cunha, Civil Society, Asia-Pacific Group

    Format

    Round Table - U-shape - 60 Min

    Policy Question(s)

    What does it mean to have meaningful platform transparency, and how does this change based on the type of information disclosed and the intended recipient of the information?
    What does accountability mean for digital social media platforms, and what is the role of transparency in ensuring accountability?
    What are the most important factors to consider in framing regulatory interventions relating to platform transparency, particularly in developing countries?

    Connection with previous Messages: Our session will build on the following messages:

    “Policies implemented by Internet platforms to deal with harmful online content need to be transparent, acknowledge the limits of automated content moderation, and ensure a proper balance with the right to freedom of expression.”

    “The complex interplay between the market and society is being reshaped by online platforms. Online platforms continue to gain power in the digital world, generating high impact throughout the globe, especially in the Global South. There is no one-size fits all approach as impacts may be positive or negative, depending on the local reality.”

    The session seeks to engage with core components of the messages quoted above. We plan to explore the implementation of platform transparency measures, particularly in terms of regulatory interventions for Global South countries and contexts. We will also explore the role of various stakeholders in this process.

    SDGs

    16.10
    16.8

    Targets: The session will focus on the role that platforms, especially large social media intermediaries, play in ensuring public access to information and protecting fundamental rights such as expression and privacy (16.10). We will also explore the role of regulatory interventions in ensuring inclusive, participatory and representative decision-making in this context. We will focus in particular on interventions tailored to the Global South and developing countries, which tend to be under-represented in international fora (16.8).

    Description:

    Transparency is currently a core focus area for regulators and other stakeholders in the context of the digital economy, particularly vis-à-vis large social media companies. This arises from the understanding that though platforms provide varied benefits, they have also been linked to a range of harms such as mis/disinformation, increased polarisation, harassment, discrimination and abuse, erosion of privacy, and anti-competitive behaviour. Though advanced digital technologies and algorithms are at the core of platform functioning, there is very little external understanding of how platforms operate or of how their incentives drive behaviours and outcomes. This makes it difficult to address the challenges posed by advanced technologies and to effectively harness the advantages they offer.

    Though transparency is a necessary first step in addressing harms arising from platforms, it is becoming increasingly clear that operationalising transparency can be challenging. For instance, many stakeholders have highlighted the difficulty of making sense of the information platforms provide in their transparency reporting, and of drawing meaningful conclusions from it. Moreover, the general understanding, especially in a regulatory context, seems to be that transparency will lead to increased accountability. However, that requires an understanding of what accountability means in the relevant context, what information is provided, and to whom it is provided. A slew of transparency measures is currently being proposed by regulators, from reporting requirements on content moderation to the creation of ad libraries and increased calls for researcher access to platform data. Many of these conversations are taking place in the European Union, the United States, and other parts of the Global North; however, countries in the Global South are also contemplating regulatory interventions in this space.

    In this session, we seek to explore what meaningful transparency and accountability mean in the context of large social media platforms, the relationship between the two, and the stakeholders that transparency measures are meant to serve. Though our discussion will draw from global conversations on these issues, we hope to identify core considerations for regulators and other stakeholders in the Global South to prioritise as they begin to frame regulations in this space. In particular, we will explore how different stakeholders, such as governments, tech companies, and civil society, must work together to address the harms identified in the context of social media platforms.

    Expected Outcomes

    The Centre for Communication Governance (CCG), together with the Global Network Initiative (GNI) and other civil society actors, has established an Action Coalition on Meaningful Transparency (ACT) to identify and build collaboration across various streams of digital transparency work, as well as to increase the diversity of perspectives considered in those efforts. Over the course of the last year, CCG has also been working on a report that engages with algorithmic transparency and accountability, exploring some of the challenges of operationalising transparency in the context of social media and the larger role that transparency measures seek to play in addressing platform harms. More generally, as the Indian government seeks to overhaul its information technology law, CCG will draw on insights from this session to continue to publish work that supports the development of relevant legislation and policy. CCG and GNI also intend for this session to contribute to our efforts to build a community of researchers and other stakeholders working on platform governance, particularly in the Global South, who can engage with one another. Learnings from the session will be published as blog posts and on social media, and will inform our research agenda for the future, particularly as it relates to platform governance.

    Hybrid Format: Our session plan is flexible and is designed to allow for robust engagement and participation. To encourage interaction, we will leave significant time for questions and comments from participants once the panelists have made their initial points, and will explore specific areas that participants indicate interest in. Participants will have multiple ways to share comments and questions: speaking online or in person, typing into the live chat on the Official Online Participation Platform, or writing in a public Google document that we will circulate with prompts for participants. We will encourage in-person participants to also use these tools so that they can engage with those joining online. In previous sessions we have conducted, we have found Google Docs to be the most bandwidth-friendly, simple, and easy-to-use tool for participants to flag questions and comments. Our on-site and online moderators will work with others from CCG and GNI to keep track of questions and comments from participants, flag them to the speakers, and highlight emerging themes. In addition to active and engaged moderators, we plan to use polls to encourage participation and information sharing.

    Online Participation

     

    Usage of IGF Official Tool.

     

    Session Report

     

    At our session at the IGF, we sought to identify key factors for regulators in the Global South to consider as they contemplate transparency regulations for social media platforms. We discussed the kinds of regulatory interventions being contemplated, the harms they seek to address, and the safeguards and other considerations that regulations must account for. The following are the key themes that emerged over the course of the discussion.

    Participants and panelists spoke of the importance of a two-pronged approach to transparency mechanisms for platform governance, one that imposes transparency obligations on both platforms and governments. They also highlighted that States in the Global North and the Global South face challenges specific to their own contexts. Transparency regulations must be framed so that they do not become tools for enhanced control over speech, and applying transparency requirements to States is essential in this regard.

    The panelists spoke about the importance of recognising that transparency is an iterative process that will have to adapt to changing technological and regulatory environments, and will evolve based on insights gained from the information platforms provide. As a first step, it would be important to develop enabling frameworks to determine what kinds of information would be useful for platforms to provide, and to incorporate measures such as data audits and researcher access to platform data.

    Fernanda Martins spoke about her experience of platform behaviour during elections, and about the importance of working together to reduce political violence and misinformation in Brazil. She highlighted that the harms of disinformation and political violence are not limited to election periods but are spread across broader timelines, meaning that platform efforts to tackle these behaviours cannot be restricted to election times. Fernanda also spoke about the unpredictable nature of social media platforms: changes in governance or ownership structure, such as a platform being bought or sold, can significantly alter platform behaviour, with massive effects on political speech, on harms such as political disinformation and violence, and on other real-world outcomes.

    Shashank Mohan walked through some of the goals and challenges of operationalising transparency. Ideally, transparency would lead to a fair and unbiased experience for users on social media platforms, and a system that respects user rights. Any measures to operationalise transparency would have to account for contextual factors such as the scale of the relevant populations and their heterogeneity. Information provided without accounting for such considerations could be incomplete or of limited utility. For example, broadly worded transparency requirements for content takedowns may mean that platforms provide only broad metrics on their moderation efforts, without accounting for nuances in local contexts that may be necessary to address harms. This would not serve the purpose of transparency regulations, and regulatory interventions would therefore need to calibrate the level of granularity that transparency mandates require. Shashank also highlighted the importance of the Santa Clara Principles in developing standards in this context.

    Emma Llanso outlined the history of transparency mandates and provided an overview of the various approaches to transparency currently being adopted. She spoke of the different kinds of regulatory interventions and their goals: the Digital Services Act, for example, sets out different obligations for platforms, requiring that they provide information on the automated tools they use and on the considerations behind the design of their algorithms. Such information would provide insight into the content that gets promoted on various platforms, and into how these assessments are made.

    Emma pointed out that another core focus area for transparency regulation is digital advertising, particularly how targeted advertising works online. Another avenue of reporting targets users, requiring platforms to provide notices of content moderation decisions and to publish their policies and processes for content takedowns and appeals. Such measures, along with others aimed at making websites more accessible, help users understand platform behaviour and empower them to seek redressal. Emma also noted that a further large bucket of regulation focuses on researcher access to data and the use of audits to understand the internal processes driving algorithmic decisions. Measures that require platforms to share information with independent researchers are crucial for understanding the relationship between platforms and harms, and for identifying areas for further intervention. In this context, regulations would need to find ways to provide necessary information to independent researchers while also maintaining the privacy of platform users. Emma observed that it is currently difficult to assess what the consequences of such interventions would be, and that transparency regulations would need to be iterative and responsive to the information that is provided.

    Chris Sheehy stressed the importance of a multi-stakeholder approach to transparency regulation. In part, existing regulatory efforts have been a response to previous multi-stakeholder collaborations. Chris highlighted the role of multi-stakeholder forums in checking the transparency commitments of various platforms and in auditing the frameworks of information and communication technology companies. In this context, he spoke of the Action Coalition on Meaningful Transparency (ACT), a global multi-stakeholder initiative led by civil society that aims to identify and build collaboration across various streams of digital transparency work, as well as to increase the diversity of perspectives considered in those efforts.

    In response to a question about the role of government in the context of transparency requirements, panelists noted that more granular reporting (such as the category of law violated, or clarity on when takedown requests have been made by governments) would provide more useful information. They stressed the importance of requiring governments to be transparent about takedown orders and of including States within such obligations, as a way to ensure that transparency requirements are effective and centre users and their rights. Panelists pointed out that existing transparency requirements in this regard could be strengthened across Global North and Global South countries. The challenges of instituting such mechanisms in countries with a history of State censorship were also discussed, along with ways to balance speech and other considerations.