Session
Dynamic Coalition on Children's Rights in the Digital Environment
Round Table - U-shape - 90 Min
Content moderation and human rights compliance: How to ensure that government regulation, self-regulation and co-regulation approaches to content moderation are compliant with human rights frameworks, are transparent and accountable, and enable a safe, united and inclusive Internet?
Data governance and trust, globally and locally: What is needed to ensure that existing and future national and international data governance frameworks are effective in mandating the responsible and trustworthy use of data, with respect for privacy and other human rights?
With legislation on the table in several countries and regions to address harmful and illegal content in digital environments, Internet Governance is experiencing a transition from self-regulatory to regulatory environments for online service providers. This is highlighting the complexity of the issues at play, as well as the lack of consensus between different issue groups such as privacy organizations, fundamental rights and freedom of expression groups, and children’s rights advocates. The arguments for and against regulation often default to binary or seemingly mutually exclusive positions, missing the nuance and compromise needed to identify sustainable, cross-sector solutions to violence against children in real life and in digital environments.

Advocates of regulation argue that technology companies have failed to self-regulate effectively and at scale across the sector and globally; they point out that the technologies to address the digital manifestations of online child sexual exploitation and abuse (OCSEA) are proven to work for online spaces, victims and society, yet they are not being taken up and deployed sufficiently widely with appropriate levels of transparency. Proponents of a public health approach to OCSEA argue that too much focus has been placed on regulation and technical interventions, at the cost of investment in prevention strategies that tackle the problems at the root, leading to adverse impacts particularly, though not exclusively, on marginalized groups.

This session seeks to explore the challenges and opportunities inherent in regulating digital environments, and to build consensus on a broad, sustainable and nuanced approach to protecting and ensuring the rights of children in digital environments.

Two overarching questions for the session:
1. Is the emphasis on regulation and technical solutions obscuring larger challenges in terms of addressing OCSEA?
2. How do we align the principles of regulation of digital environments and prevention strategies to protect children?
Other key policy questions to guide the conversation (indicative):
1. How does the UN General Comment 25 to the UN Convention on the Rights of the Child guide our response?
2. How important is it to achieve a high degree of issue separation, for example as between addressing illegal content, harmful content and content in the grey area?
3. Where do we draw the line between the different types of content related to child sexual abuse? Should the most heinous types be prioritised over less violent content?
4. Is it realistic or desirable for technology companies to shift their focus from managing digital harm to investing in prevention strategies? Can’t they do both, and aren’t they already?
5. Is it the case that governments are investing more in regulation and control than in evidence-based prevention of online and offline violence against children?
6. What does the evidence tell us about regulating online activity as opposed to promoting safety and resilience?
7. Can we learn from the different stages of response to OCSEA from different segments of the technical community, such as social media platforms and the gaming industry?
• We have selected the Roundtable (U-shape) format with the intention of combining elements of both a panel session and a roundtable. This will help ensure that as many voices as possible are heard, and that a robust moderated conversation takes place around a table.
• Speakers and participants will be both online and offline, thereby enabling further inclusion of people from different regions and with different perspectives to bring to this topic.
• To ensure that online participants are heard, the offline moderator will systematically consult the online moderator to open the floor to online participants, and/or ensure that as many questions as possible submitted via the chat are put to the group, within the time available.
• If necessary, additional tools will be considered. For example, early, mid and end time polling will be considered to take the temperature of the ‘room’, either using the IGF software or an alternative tool.
• However, content and participant management will be easier if interaction is streamlined through one or a maximum of two tools.
ECPAT International - Doro Czarnecki / Amy Crocker
Stiftung Digitale Chancen – Jutta Croll
Online Safety Expert - John Carr
1. Patrick Burton – Centre for Justice and Crime Prevention (Male, Africa, civil society) – onsite
2. Thiago Tavares – SaferNet Brasil (Male, Latam, Policy and Legal) - online
3. Michael Tunks – IWF (Male, Europe outside EU, civil society) - onsite
4. Andreas Hautz – Jugendschutz.net – (Male, Europe-EU, civil society) - online
Jutta Croll (Stiftung Digitale Chancen)
Doro Czarnecki (ECPAT International)
Amy Crocker (ECPAT International)
Targets: Focus SDG is 16.2 - End abuse, exploitation, trafficking and all forms of violence against and torture of children. To effectively tackle SDG 16.2, we need to understand and mitigate the impact of digital technology on abuse, exploitation, trafficking and all forms of violence against and torture of children. However, despite significant developments such as the UN General Comment to the CRC on Children’s Rights in Relation to the Digital Environment, regional and sectoral variances exist. Emerging regulation is seen by some as essential to protect children online, by some as harmful to fundamental rights, and by others as only one of the many components needed for an effective response to online violence against children. With such opposing views, we risk losing sight of the goal of SDG 16.2. This proposal aims to facilitate an open discussion on these complex challenges. See DC paper - https://www.intgovforum.org/multilingual/index.php?q=filedepot_download/6975/1390
Report
1. Regulation and prevention do not exist in a dichotomy but are too often framed in that way – it is possible and essential to have both.
2. Regulation is often politically expedient and based on the values, priorities, and political interests of a country. Therefore, lessons must be learnt from public health messaging and other approaches to social problems, with attention given to disparities between Global North and Global South countries that are often missed in general regulation proposals for the internet.
As the world looks at diverse ways to regulate online spaces, the Dynamic Coalition on Children's Rights in the Digital Environment calls on children’s rights groups to find a shared voice, improve cooperation, and help build consensus on ways to balance regulation and child protection that respect fundamental human rights such as privacy online.
The Dynamic Coalition on Children's Rights in the Digital Environment calls on governments, when developing legislation to regulate online spaces to protect children, to recognise the limits of regulation without broader attention to prevention strategies informed by a public health approach and adapted to their context.
- The session began with a participant poll on the question: What do you hope to get out of this session? The results:
- 27% were keen to inform themselves about the topic
- 23% were keen to see prevention prioritised as a way to protect children online
- 22% were keen to find common ground and a roadmap for the future on this topic
- 18% were looking for a clear answer to the question in the session title
- 10% were keen to see regulation prioritised as a way to protect children online.
- From this starting point, the session speakers presented the following challenges and arguments:
- (London School of Economics – Prof. Sonia Livingstone) The debate between digital child rights advocates and advocates of democracy and free expression has become problematic. This is in part because discussions around illegal and harmful content have been unhelpfully confused and may also be because child rights advocates have not sufficiently prepared for the debates around technologies such as encryption. This makes the safety/privacy-by-design movement an important one, because it offers solutions that are not dependent on heavy-handed content-based regulation, but on the regulation of processes and outcomes, combined with standards, training and remedy.
- (Centre for Justice and Crime Prevention – Patrick Burton) Regulation and safety-by-design are important elements of (online) violence prevention, but they are too often overemphasized as solutions and in this sense can be seen as politically expedient, despite good intentions in many cases. Sustainable violence prevention requires behavioral change, which in turn requires an understanding of how children engage with technology and how they are supported in their online lives. Furthermore, because online and offline violence intersect, we need to learn from other sectors about the role of tech and industry in supporting the interventions that regulate them, and we need to learn from public health messaging and other approaches to social problems.
- (jugendschutz.net - Andreas Hautz) Prevention through education and regulation go hand in hand because you cannot effectively keep children safe online without a combination of detection and removal of illegal content, proactive reporting by online service providers, active creation of safe digital environments, including in relation to safety by design and safe settings, and strong reporting mechanisms for when harm does occur. This combination lies behind a new law in Germany this year. In brief, regulation is needed to get engagement from online service providers, but analysis of the problems is essential, and regulation is only fully effective when it leads to prevention. At the same time, you cannot regulate everything, which makes media literacy a crucial element.
- (Safernet Brazil- Thiago Tavares) Latin America is a dangerous place for children due to high levels of poverty, social inequality, violence and insufficient public policy and education, but also for society due to state surveillance and other government control tactics. These underlying social factors pre-dated the internet and are sometimes missed in proposals for regulation of the internet. Different societies propose different approaches based on their values, priorities, and political interests, which has been seen starkly in Brazil where the Federal Government has proposed regulation to prohibit content moderation in relation to harmful behavior such as hate speech, cyber bullying, neo-Nazism, and misinformation. This has put civil society organizations advocating for digital safety in direct danger. It also highlights the importance of rejecting regulation that fails to respect the due process of law or the rights to free speech, data protection and privacy.
- (Internet Watch Foundation – Michael Tunks) The UK draft Online Safety Bill places a duty of care, a well-established regulatory approach, on online platforms with user-to-user services and on search providers to ensure the safety of their users. However, regulating is complex and takes time, as has been seen in countries and regions such as the UK, the EU, and Australia. Furthermore, regulation must complement, not duplicate, mechanisms that already exist and work; it should be based on principles and be flexible enough to adapt to inevitable changes in technology; and it should respect the rights of children to privacy and to a childhood free of abuse and exploitation. Having clear standards about what is and is not illegal is very helpful for multinational solutions. However, that does not mean companies should not go further and do more to keep children safe.
- To the question of whether we risk overemphasizing either prevention or regulation, participants responded:
- The UN General Comment 25 offers clear procedures for achieving a balance, but a careful process is needed to decide how any decision might affect any or all of children’s rights. This is hard, since children are individuals. This leads to an argument for regulating the ‘hygiene factors’ that governments and businesses must adhere to in order to ensure children’s privacy, security and equity, combined with intermediaries such as auditors, funders, trade associations and ethics organizations that can help ensure children’s rights more broadly.
- Useful regulation would lay out the responsibility of online platforms to have strong reporting, referral and victim support processes, and would focus not only on technology to detect the outcomes of violence, such as child sexual abuse material, but also on everything that happens beforehand and goes unreported.
- Indeed, governments have so far failed to provide companies with a good regulatory framework that governs their proactive work in this area. We need better rules that guide companies and enable them to take responsibility.
- Some regulation is needed, such as to require the use of automated tools to detect CSAM, but solutions should be implemented in a way that does not affect other human rights.
- Regulation may be politically expedient, but it is also about what is possible at a given moment in time. And as much as we talk about global standards, the internet is not a global system that must be run in a global way; this idea of global standards and systems has been an obstacle to action. The crucial question is how to come up with a framework that recognizes the responsibility of internet companies to prioritize safety – regulation is essential, because if we leave the decision to online platforms, they will not prioritize safety.
- A participant from the Technical University of Munich asked what should be done to protect child influencers. Participants responded that there is no easy answer, but that there is a distinction between types of influencers. For example, there is clearly a role for regulation regarding the use of children to market certain products that can be harmful for children, and this also requires attention to child labor laws.
- Dotkids.com asked about the value of creating child-friendly domains/spaces. Participants responded that safe spaces are important, but that diversity is also important, for example in relation to language provision for access to information and safety measures (such as language recognition to ensure safety). Furthermore, at a minimum, regulation should not reduce spaces for children, which is what happened with the ePrivacy Directive in the EU.
Conclusions
- We need regulation and prevention – there is no dichotomy between the two.
- Regulation must be seen in context, not in the abstract.
- We have to do the best we can for the children in our own countries and jurisdictions, rather than pursue goals of ‘global’ harmonization.
- Waiting for harmonization, even on definitions, will only lead to inaction, so we must focus on other areas and allow for local and contextual adaptation.