IGF 2020 WS #257 Online child abuse: prevention beyond platform regulation

Subtheme

Organizer 1: Ingerman Meagan, Prostasia Foundation
Organizer 2: Arsene Tungali, Rudi International

Speaker 1: Arsene Tungali, Civil Society, African Group
Speaker 2: Jeremy Malcolm, Civil Society, Western European and Others Group (WEOG)
Speaker 3: Tomiwa Ilori, Civil Society, African Group
Speaker 4: Charity Embley, Civil Society, Asia-Pacific Group
Speaker 5: Narine Khachatryan, Civil Society, Eastern European Group

Moderator

Arsene Tungali, Civil Society, African Group

Online Moderator

Ingerman Meagan, Civil Society, Western European and Others Group (WEOG)

Rapporteur

Arsene Tungali, Civil Society, African Group

Format

Panel - Auditorium - 90 Min

Policy Question(s)

Aside from the regulation of Internet platforms, what other interventions could be effective in preventing online child exploitation? What roles can civil society and academic stakeholders play in addressing online child exploitation? Does the stigma around this topic impede nuanced discussion of possible solutions, and if so, how can stakeholders address this?

The central issue that our workshop will address is how stakeholders can prevent online child sexual exploitation before it happens, rather than relying on the investigation and prosecution of offenders after an act of abuse has already been perpetrated. A challenge to this approach is the stigma that surrounds the topic of child sexual abuse and exploitation, which creates a barrier to nuanced discussion of prevention-based approaches and leads in turn to a singular focus on enforcement. The stalemate between proponents and opponents of stronger platform liability rules creates an opportunity to break out of this mold by discussing solutions to online child exploitation that do not involve the direct regulation of Internet platforms.

A related challenge is the threat that ex-ante content moderation systems pose to free speech. If proactive moderation could identify content that is potentially harmful to children without adversely affecting freedom of expression, it would be an option worth exploring; one argument in its favour is that the public interest in protecting children may outweigh the individual right to freedom of expression. Such an approach would require significant buy-in from platforms and websites, including a commitment to the transparency needed to hold their moderation practices accountable. The session is also an opportunity to examine the EARN IT Act as an example of how poorly designed laws can undermine initiatives such as peer and professional support groups. Finally, we can consider how child protection laws that are due for review might be updated: many instruments, hard and soft, from major treaties and their accompanying comments to national laws and reports on children's rights, now require an overhaul to accommodate these new realities. How can this session set that work in motion?

SDGs

GOAL 3: Good Health and Well-Being
GOAL 4: Quality Education
GOAL 5: Gender Equality

Description:

Regulating social networks has become a primary policy position for the world's largest child protection organizations, because Internet technologies feature in many child exploitation offenses. But by feeding into broader calls to rein in the power of dominant Internet companies, this approach has turned child safety into a political football. To circumvent these debates, this workshop will set aside strategies for child exploitation prevention that depend upon the regulation of Internet platforms and consider other ways in which online child exploitation can be prevented.

The workshop will also look at how the shift to fully online instruction during the COVID-19 pandemic has exposed children to new risks of online harassment. Online meeting platforms such as Zoom face constant pressure to strengthen security and to protect children's personal data from online trolls and predators entering virtual classes. Zoom has also shared data with Facebook, including data on people without a Facebook account (Cox, 2020); the tech website Motherboard (2020) reported that the Zoom iOS app sends data to Facebook. This is alarming because minors become vulnerable to online predators without knowing that their data is being used, even if they do not have a Facebook account. Zoom did not anticipate the huge demand during the pandemic; it is a profit-driven company, and when a service is free, users often pay with their privacy.

Jeremy Malcolm, Prostasia Foundation (Male, Civil Society, WEOG) will introduce how experts in the United States are using online support groups to provide peer and professional support to those with an elevated risk of sexually offending against children. He will explain how these initiatives are jeopardized by laws such as the EARN IT Act, currently under consideration in the United States Senate, which would increase the risk to platforms of hosting such support groups.

Arsene Tungali, Rudi International (Male, Civil Society, Africa) will speak from the experience of his organization, which works on child online protection in a country where the issue is not a priority. He will share why he believes child online protection education has failed in most African countries and suggest how we can better support children as they try to take advantage of the many benefits the Internet brings while staying safe. Arsene will also explain why, in most African countries including his own, the word "regulation" connotes the violation of people's fundamental rights, and why we should avoid proposing it.

Tomiwa Ilori, Centre for Human Rights (Male, Academia/Civil Society, Africa) will speak on the possible design of a multi-stakeholder toolkit for state and non-state actors on engaging with the growing challenges of child protection in the digital age. The toolkit would improve literacy in this area while advocating for rights-respecting best practices. It could assist in two key ways: by bringing a bottom-up, multilateral, multi-stakeholder approach into conversations about child protection in the digital age, and by minimising direct regulation of Internet platforms. If the idea is developed further, the toolkit could establish a baseline of protections while suggesting ways to adapt it to different contexts. Designing such a toolkit, among other ideas to be explored, will be his contribution to the workshop.

Charity Embley (Female, Academia, WEOG) will address the issue of child online safety during COVID-19 isolation. While applications like Zoom have acknowledged security flaws such as "zoombombing" (Bond, 2020), another question also arises: what are faculty doing to protect the privacy of their minor students? In the excitement of switching to online instruction, some faculty have posted images of students on their social media accounts without realizing the damage this could do; many are unaware of the risks and in dire need of training on protecting the privacy of minor students. Remote instruction has brought new, unforeseen challenges to the educational setting, and these, among many other issues, will be discussed at the meeting.

The workshop will conclude with a half-hour open discussion of the policy questions we have identified, led by the onsite and online moderators.

Additional Reference Document Links:
https://www.npr.org/2020/04/03/826129520/a-must-for-millions-zoom-has-a…
https://www.vice.com/en_us/article/k7e599/zoom-ios-app-sends-data-to-fa…

Expected Outcomes

A comprehensive workshop report will be produced, together with draft recommendations drawn from the discussions. The report will be published and circulated to a sign-up list of workshop participants, who will be able to comment on the draft recommendations using an online discussion forum. The final recommendations will be published as an outcome of the workshop if and when a rough consensus on them emerges from the online discussions. Depending on interest, we may also use the session to kickstart a project to develop a multi-stakeholder toolkit. The toolkit would address the needs of the key stakeholders involved in child protection online: each stakeholder would have a set of responsibilities, and there would also be a point of convergence on what all stakeholders can achieve together. If successful, this work could evolve into a legislative policy proposal.

Due to the stigma that surrounds this subject area, we will provide the opportunity for participants to submit interventions to the online moderator anonymously, either during the session or ahead of time. These will be treated on an equal footing with contributions from participants who identify themselves.

Relevance to Internet Governance: Child sexual exploitation is a social problem that manifests itself both online and offline, but its online manifestations are some of the most intractable and troubling—especially the circulation of unlawful sexual images of minors, and the use of online communications channels for sexual grooming of minors. These problems are frequently at the epicenter of proposals for new laws that affect the Internet. But as the definition of Internet governance recognizes, not all governance solutions take the form of laws. This workshop will explore how stakeholders other than governments and Internet platforms are involved in abuse prevention interventions.

Relevance to Theme: One of the illustrative policy questions for the Trust thematic track is "What are the responsibilities of digital platforms and public authorities in regulating or policing content?" Our workshop will go beyond this by also exploring the limits of the actions that digital platforms and public authorities can take. Our panelists and discussants will investigate ways in which non-regulatory approaches to the prevention of child sexual exploitation can bypass intractable disputes between freedom of expression advocates and child safety advocates, while still contributing towards the development of a more trustworthy Internet.

Online Participation

Usage of IGF Official Tool.