Session
Organizer 1: Michael Karanicolas, Mr
Organizer 2: Bruna Santos, Coding Rights
Speaker 1: Agustina Del Campo, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 2: Michael Karanicolas, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Tiffany Li, Civil Society, Western European and Others Group (WEOG)
Speaker 4: Bruna Santos, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 5: Amba Kak, Private Sector, Asia Pacific Group
Michael Karanicolas, Civil Society, Western European and Others Group (WEOG)
Bruna Santos, Civil Society, Latin American and Caribbean Group (GRULAC)
Bruna Santos, Civil Society, Latin American and Caribbean Group (GRULAC)
Round Table - Circle - 60 Min
- Are international tech platforms doing enough to include voices from the Global South in their policy development processes?
- How can American-based digital platforms develop content moderation policies which reflect global standards of freedom of expression, as opposed to purely US-centric First Amendment standards?
- What are appropriate avenues of consultation and engagement for civil society to comment on the design and implementation of content moderation policies?
GOAL 10: Reduced Inequalities
GOAL 16: Peace, Justice and Strong Institutions
Description: This workshop grows out of a project to develop a set of basic human rights principles for private sector intermediaries. The work was led by Michael Karanicolas, then of the Centre for Law and Democracy, in collaboration with the Arabic Network for Human Rights Information (Egypt), the Centre for Internet and Society (India), the Centro de Estudios en Libertad de Expresión y Acceso a la Información (Argentina) and OpenNet Korea (South Korea), as well as Tamir Israel of CIPPIC and Christopher Parsons of Citizen Lab, under the oversight of an Advisory Panel of international experts that included the United Nations Special Rapporteur on Freedom of Expression and representatives from Google, Facebook, Airbnb and Mozilla. The resulting publication, Stand Up for Digital Rights: Recommendations for Responsible Tech, was launched at a session at RightsCon. That project led into Michael's incoming position with the Yale Information Society Project, which is focused on fostering dialogue between academics, private sector representatives, and civil society voices, particularly from the Global South, in order to develop specific reform proposals for global content moderation standards, as well as avenues of consultation, engagement and oversight that reflect the diverse global role that these gatekeepers of speech now hold. This session will provide an opportunity for researchers, including Michael and Tiffany C. Li, to present the findings of their work in this field, and for civil society advocates from the Global South to express their views on how a global conversation addressing these issues should take place, where the tech companies need to do better, and what shape improvements should take.
They will express these views as part of a dialogue with representatives from the tech sector, giving the latter a chance to respond and work collaboratively to develop a more inclusive dialogue on global freedom of expression challenges that the platforms face.
Expected Outcomes: The main outcome of this workshop will be to foster a global dialogue between academics, civil society voices, and tech platforms on generating content moderation standards which reflect the global role that these platforms have, and to set the stage for future discussions as these policies continue to evolve.
An important outreach mechanism will be to enlist project collaborators from the previously launched Recommendations for Responsible Tech to ensure that a robust parallel conversation takes place online. This includes organizations based in Egypt, India, Canada, Argentina and South Korea. Some of these partners will be in attendance at IGF, but others will not. Those collaborators who are not in attendance will drive online discussions about the issues from their respective bases in different parts of the world, in parallel, via Twitter and other social media. Each collaborator will use their own network to stimulate interest in the event in the days leading up to the panel at IGF, so that on the day of the presentation itself there will be global engagement, and significant virtual participation in the live-tweeting and online discussion which will accompany the conversation at IGF.
Relevance to Theme: With the increasing global attention paid to content moderation policies at major platforms, including approaches to countering hate speech, incitement to violence, disinformation and other problematic content, companies are investing growing energy and resources in defining their approach in this space. This includes, most notably, Facebook's announcement that it would be constituting an independent appeals body. However, as this debate moves forward, it has been strongly coloured by the fact that the major platforms are all based in the US, leading to an American-centric understanding of the issue, including interpretations of freedom of expression that are heavily grounded in First Amendment principles rather than global freedom of expression standards, and which fail to properly account for the diverse nature of this problem. In particular, voices from the Global South have been largely absent from this conversation. This session will aim to bring civil society and academic voices from the Global South together with representatives from major tech firms to foster a dialogue on generating inclusive global standards for moderating content, as well as models for outreach and engagement of these under-represented voices.
Relevance to Internet Governance: One of the most important differences between the Internet and earlier modes of communication is the central role that private companies play in facilitating expression. This creates a conceptual challenge, since international human rights rules are primarily designed to bind States rather than private actors. There is, however, a growing recognition that corporations also have responsibilities to promote and protect human rights, particularly online. Internet intermediaries also face commercial pressure to institute policies and practices that protect the expressive interests of their users. Growing interest among governments, civil society, academics, and Internet end users in the shape of content moderation policies has made this one of the most dynamic areas of debate in the Internet governance space, as each stakeholder seeks to influence the policy direction that major platforms adopt, while the platforms themselves must carefully balance their interest in facilitating free and open discussion against the pressure to act swiftly and effectively to remove problematic content. It is a core component of Internet governance going forward, and decisions made in this space today have enormous implications for the future of speech online.
Usage of IGF Tool