IGF 2022 WS #502 Platform regulation: perspectives from the Global South

Thursday, 1st December, 2022 (10:50 UTC) - Thursday, 1st December, 2022 (11:50 UTC)

Organizer 1: Olivia Bandeira, Intervozes - Coletivo Brasil de Comunicação Social
Organizer 2: Paula Martins, Association for Progressive Communications - APC
Organizer 3: Gustavo Gomez, OBSERVACOM
Organizer 4: Lucía León, Hiperderecho
Organizer 5: Amalia Toledo, Wikimedia Foundation

Speaker 1: Raúl Echeberría, Private Sector, Latin American and Caribbean Group (GRULAC)
Speaker 2: Orlando Silva, Government, Latin American and Caribbean Group (GRULAC)
Speaker 3: Paulo Victor Melo, Technical Community, Latin American and Caribbean Group (GRULAC)
Speaker 4: Lillian Nalwoga, Civil Society, African Group


Onsite Moderator

Lucía León, Civil Society, Latin American and Caribbean Group (GRULAC)

Online Moderator

Paula Martins, Civil Society, Intergovernmental Organization

Rapporteur

Olivia Bandeira, Civil Society, Latin American and Caribbean Group (GRULAC)


Debate - Classroom - 60 Min

Policy Question(s)

1) How do the content moderation policies and practices of major digital platforms impact different regions of the globe differently?

2) What regulatory principles for content moderation processes can better guarantee freedom of expression and other fundamental rights in diverse legal systems? What are the challenges for the regulation of large global digital platforms, and what are the appropriate bodies and mechanisms to do so?

3) How will regulations currently under discussion in the US and Europe impact Africa, Asia, and Latin America, and what regional approach could CSOs in these regions take to ensure regulations are aligned with international human rights standards?

Connection with previous Messages: The session intends to build on the following messages: “5.1. The complex interplay between the market and society is being reshaped by online platforms. Online platforms continue to gain power in the digital world, generating high impact throughout the globe, especially in the Global South. There is no one-size-fits-all approach, as impacts may be positive or negative, depending on the local reality. 5.2. Suggested underlying principles to guide policy approaches towards strengthened market competition and consumer protection include: (a) transparency; (b) global taxonomy of service providers; (c) emphasis on rights application; (d) proportionality; (e) acknowledging the complexity of platforms, content, behaviours and jurisdictions; (f) harmonization - ensuring that the Internet remains a global, unified platform that enables the exercise of human rights.”


16. Peace, Justice and Strong Institutions

Targets:

16. Promote peaceful and inclusive societies for sustainable development, provide access to justice for all and build effective, accountable and inclusive institutions at all levels.

16.7 Ensure responsive, inclusive, participatory and representative decision-making at all levels.

16.10 Ensure public access to information and protect fundamental freedoms, in accordance with national legislation and international agreements.

The influence of big digital platforms on public debate, and their role in guaranteeing or violating individuals' freedom of expression, has fueled the discussion about the need for regulation. The debate aims to promote peace, social and socio-environmental justice, access to justice, and inclusive, effective, and accountable institutions. For regulation to protect fundamental freedoms, the debate must consider international agreements and local legislation. At the same time, regulation has to be carried out in an inclusive and participatory manner, taking into account the voices of the Global South and of minorities in terms of gender, class, and race-ethnicity, in addition to dialogue between different stakeholder groups.


The debate on the need to regulate big digital platforms has grown around the world. It is a reaction, first of all, to the growing power of large digital platforms to influence public debate, the functioning of democracies, and freedom of expression. Other related issues that demand regulation are user privacy and the use of personal data. Although this debate is global, it has regional specificities and implications, since the major digital platforms are based in countries of the Global North and often do not even have official representatives in countries of the Global South. In addition, they have adopted content moderation and privacy protection rules and measures that differ between the Global North and the Global South, disproportionately affecting the freedom of expression and other human rights of individuals and groups in Latin America, Africa, and Asia, as several studies have shown. Given this context, regulatory actors in different parts of the world are debating proposals: while some follow international principles of freedom of expression, others are authoritarian, aiming at government control and generating forms of censorship. Researchers and civil society have also presented regulatory proposals in different regions of the world, seeking to protect freedom of expression and other fundamental rights. This workshop aims to advance the debate and proposals for democratic regulation that guarantees freedom of expression from the point of view of the Global South.

Expected Outcomes

In Latin America, civil society organizations in several countries have debated the need for democratic regulation of big digital platforms. In 2021, these organizations launched, in dialogue with UNESCO, a document with principles to guarantee the transparency of these platforms. With this workshop, we hope to: 1) initiate a process of dialogue on the topic between civil society, academia, the private sector, and the public sector in Latin America, Africa, and Asia; 2) hold follow-up meetings in each region; 3) produce a report that answers the questions posed in this workshop, taking into account the specificities of each region.

Hybrid Format:

- How will you facilitate interaction between onsite and online speakers and attendees? We will ensure specific activities designed to integrate engagement and contributions from participants online and in person.

- How will you design the session to ensure the best possible experience for online and onsite participants? Online and in-person speakers; integrated working groups; onsite and online moderators; a chat moderator; translation.

- Please note any complementary online tools/platforms you plan to use to increase participation and interaction during the session. Miro, pads, and instant surveys (such as Mentimeter).

Online Participation


Usage of IGF Official Tool.


Key Takeaways

There is consensus among stakeholders that regulating digital platforms is necessary since, despite being private companies, they have a growing importance in the public sphere, influencing democracies, human rights, and people's lives. Regulation must follow international human rights parameters, and the debate on the subject must involve governments, private companies, civil society, and the technical and academic community.

The debate must move forward on how regulation should be carried out. First, while some actors think that existing rules, such as consumer protection and personal data protection, are sufficient, others believe that new regulations are needed to advance on issues such as content moderation, transparency, and competition. Second, it is necessary to debate the coexistence of global norms and parameters with local legislation.

Call to Action

National states must formulate antitrust and anti-concentration policies for digital spaces, with mechanisms to guarantee plurality and diversity, aiming to build a balanced digital ecosystem that respects freedom of expression, privacy, and personal data protection.

Global platforms must have representatives in the countries of the Global South and consider local laws and regulations in their operating policies.

Session Report

The workshop "Platform regulation: perspectives from the Global South" brought together representatives of different sectors. Speakers agree on the need for the digital environment to be regulated but differ on the scope and nature of that regulation. They also agree that new laws cannot contradict human rights and must respect international standards.

For Raúl Echeberría, executive director of the Latin American Internet Association, regulation is not the only answer to every problem. Different policies can impact regions differently, so we need other principles and measures, considering variations in each society. In his words, we need innovative public policies, international instruments, multistakeholder agreements, and good practices for companies and society.

Echeberría says that companies are not against regulation, but that platforms are already subject to many rules in the digital world, such as consumer protection and personal data protection laws. Thus, he thinks the discussion should focus on how to improve the scope of regulation.

He also believes that the sectors must reach agreement on some aspects. He says there are two groups of thought: one that thinks platforms should do much more to moderate content, avoiding hate speech, online violence, and the like, and another that believes there should be no moderation at all.

To address the problem of fake news, Echeberría says we need to avoid criminalizing platforms and focus on the responsibility of other sectors, such as governments, since the digital world is a representation of the real world.

Brazilian federal deputy Orlando Silva, rapporteur of the Internet Freedom, Responsibility and Transparency Bill (PL 2630/2020), a benchmark in Latin America, brings another point of view. Silva believes there are few rules protecting human rights and the public interest in the digital environment. He highlighted the importance of regulating the internet because it cannot continue as a territory where the rules serve only to promote the profit of big tech.

For the deputy, rules must exist to protect freedom of expression; digital platforms cannot operate according to their private criteria but according to publicly agreed standards. Citing the example of the presidential elections in Brazil, Silva said that the content moderation platforms carry out without public regulation puts freedom of expression, privacy, free expression of conscience, and democracy at risk.

Silva also says that, despite the need for national laws adapted to the reality of each country, principles, concepts, and parameters are necessary at a global, international level. He also stressed that the presence of civil society in the debates to define these global parameters is fundamental. He believes that regulation should focus above all on the transparency obligations of digital platforms.

Agreeing with Echeberría, the deputy thinks that the quality of public debate is not just a problem for the platforms: public leaders have to be more responsible for the content they post and should be subject to sanctions, since theirs are accounts of public interest. He also thinks that the German experience of combating hate speech, based on "regulated self-regulation," should be a reference for the debate.

Representing civil society, Lillian Nalwoga, president of the Internet Society's Uganda Chapter and Policy Officer at the Collaboration on International ICT Policy in East and Southern Africa (CIPESA), said that in Africa, investments in digital platforms are increasing. Many people who connect to the internet do so through social media such as Facebook and WhatsApp, even before using search engines such as Google.

In this context, she said, many platforms allow human rights violations and disinformation while moderating content according to their own criteria, so it is necessary to talk about regulation. On the other hand, some governments are going to extremes that violate freedom of expression, such as blocking and shutting down social media.

Nalwoga also said it is essential to talk about jurisdiction because today, when there is a problem, people in Africa have to resort to the country where the platform is based. It would therefore be necessary for each of these platforms to have local offices in each of the nations in which they operate.

Nalwoga also said that civil society is trying to understand how regulations in the United States and Europe will impact Africa, and what regional approach civil society can take to ensure that laws are in line with international human rights standards.

She also said that in Africa, they are talking not only about content moderation, disinformation, and hate speech, but other aspects, such as the taxation of platforms, which already happens in Europe.

Representing the technical community, Paulo Victor Melo, researcher at the Institute of Communication at Universidade Nova de Lisboa and assistant professor at the Faculty of Design, Technology and Communication/European University, said that to talk about regulating digital platforms, we must first understand the support structure through which these platforms operate, which some authors call "digital colonialism." For Melo, this colonialist logic is visible, for example, in techno-surveillance policies, which segregate the public space and reinforce punitive practices in public safety. It is also present in the illegal extraction of minerals in the Amazon, mainly through the invasion of protected indigenous lands, and in the production of an industry of slavery, death, and environmental destruction, for example in the Congo, where ores used to manufacture digital devices such as smartphones and computers are extracted.

For Melo, from the point of view of the Global South, regulation must not, on the one hand, serve authoritarian government projects; fascist governments use digital platforms as echo chambers to promote disinformation and hate speech. On the other hand, it must prevent platforms from continuing to set the "rules of the game," which has allowed hate speech, especially against minorities. Despite being private companies, digital platforms operate in the digital public space, which is not separate from the physical public space, people, and territories.

The professor highlighted that in countries with little diversity and plurality in the media, it is essential to have legislation that encourages competition, avoids monopolies and oligopolies of the large digital platforms, and promotes balance in the digital ecosystem. He also says that regulation needs to take into account the dynamic nature of the digital environment and the specificities of each context, for example in countries where internet connectivity is low and digital literacy is lacking. Finally, the researcher believes that no democratic regulation can occur without the active participation of the territories, and of the communities that inhabit them, that are affected by platform actions, from mining to hate speech.