The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> I think we'll take a few minutes for people to file into the room. But our remote speakers should be aware that the session is now about to start. They should already see the room? Okay, thank you. Thank you so much. Okay. Good afternoon, good morning, everybody. We are going to start workshop number 183, digital wellbeing of youth, and we will be talking about self-generated sexualised content. We will look at this issue on the one hand through the legal regulations that we already see in many countries, and we will contrast that with the perception of young people themselves, who may even face legal problems when they produce self-generated sexualised content. I am working for the German Digital Opportunities Foundation, where I'm responsible for a project called Children's Rights Digital. We have spread some of our postcards around here so that all the documents we are talking about can be found on our website; you don't need to write everything down. Just refer to the website, and then you will be able to check out the documents that we are referring to.
We would like to begin our session today by talking about what self-generated sexualised content actually means. I do think we have several different perceptions of what is self-generated and what is even sexualised content, and therefore I would like to turn first to our first panelist, Professor Sonia Livingstone from the London School of Economics. Sonia, I hope you can hear us and are connected to the session. The floor is yours.
>> SONIA LIVINGSTONE: Thank you so much, and I hope on my side that you can hear me clearly. And it's a pleasure to join this session and discuss this very important topic.
So I want to give a few remarks from a child rights perspective, setting out some questions of definition and framework. For those who don't know, the UN Convention on the Rights of the Child sets out children's rights as an obligation for all signatories, which is all states in the world apart from the USA, and the Committee on the Rights of the Child has been elected by the states that ratified the convention to scrutinize their progress in implementing children's rights.
And the Committee makes recommendations on measures to strengthen compliance with children's rights, including producing what are called general comments, which are, if you like, the Committee's jurisprudence, the Committee's thinking on how the convention applies in a particular domain. So General Comment 25 is what I want to talk about: it is the newest general comment, and it specifically addresses the role of the Convention on the Rights of the Child in relation to the digital environment.
And it was informed by a very considerable global process of consultation, which some of those here, I know, have been part of, and which included experts, stakeholders, and, importantly, also children. So it's a bit of authoritative international soft law, if you like, adopted in early 2021 and welcomed on its adoption by the World Health Organization, the ITU, UNESCO, the OECD, the WeProtect Global Alliance, and so forth. Several of those are going to participate in this session today.
To my mind, and I think for many, it marks a real change in relation to Internet governance, and it marks a real change in ensuring the recognition of children's voices, children's needs, and children's rights. So coming to our topic now, in relation to harmful, or we might say potentially harmful, sexualised content, which is where the general comment gives most of its focus in relation to sexualised content, the Committee on the Rights of the Child emphasizes the responsibility of both states and businesses in, quote, taking all appropriate measures to protect children from risks to their right to life, survival, and development. And it locates the importance of sexual risks of harm in the context of national child protection policies and protection from all forms of exploitation, also in relation to the importance of international collaboration and cooperation, and it emphasizes that we must think of children's rights holistically. So we are here debating or discussing self-generated sexual content. This needs to be put in the context of other rights: freedom of expression, privacy, digital literacy, education, and non-discrimination, and the situation of children in particular circumstances of disadvantage.
So one more point really about definitions, because I think the notion of self-generated sexual material has generated quite a lot of misunderstandings in its own right, and it's the notion of self-generated that is really crucial. So I think from the point of view of the general comment, two of the key issues are the child's evolving capacity, their growing maturity and ability to understand and make decisions for themselves, and the question of consent.
And I hear in debate at least three ways in which self-generated sexual material is talked about. And the general comment has, if you like, three areas of recommendations. So let me distinguish them; others may use other terms. The first meaning of self-generated sexual content refers to a coercive or exploitative situation created by an adult who is often remote from the child victim, at a distance and digitally facilitated.
So if we are talking about a coercive or exploitative situation, then General Comment 25 emphasizes the importance of safeguarding and protection for the victim, platform responsibility, and regulation, both to prevent the situation and also to ensure redress if it occurs. It calls for criminalization of the abuser and rehabilitation of the victim. The second situation is also coercive, but the perpetrator is also a child. And then General Comment 25 calls for the same protections for the victim, but also for restorative and not criminal approaches for the perpetrator when they are a child.
And then we need to acknowledge the third possibility, which is a truly consenting situation, probably among those who are themselves under the age of 18, so children in the light of the convention. And then General Comment 25 calls for a non-punitive approach in accordance with the child's evolving capacity and best interests.
And then just as a rider: in all of those circumstances, there is the possibility, indeed the likelihood, of creating content over which even a consenting child with another child loses control. And in those cases the state and the businesses bear responsibility for any and all sharing of images, and General Comment 25 calls for prompt and effective takedown. So I'll leave it there, look forward to others' contributions, and come back in a little later, I think. Thank you.
>> Yes. Thank you so much, Sonia, for explaining what the general comment says with regard to self-generated sexualised content. We have spread some smaller postcards around the room with a link to our website, www.childrensminorsrights.digital, and there you will be able to find all of the documents that we are talking about today.
I would now like to bring into the debate Hazel Bitana speaking from a user perspective from the ASEAN region. I hope you are there and can respond to what Sonia has already presented to us.
>> HAZEL BITANA: Hi. I hope you can hear me clearly. Good morning, good evening to you wherever you are. My name is Hazel Bitana, and I am with a coalition that currently has members in 13 countries in Asia and Southeast Asia. One of the core principles of our coalition is that we work for and with children, and we have had several opportunities to discuss and implement actions with children in relation to their rights in the digital environment, including the consultation on the general comment that Sonia talked about.
When we talk to children about online sexual exploitation and abuse and ask them for their recommendations on how to address the issues, they cannot help but discuss problems and actions both online and offline. They always highlight the interrelationship between their online world and their offline world, and they bring to the fore the importance of addressing both. For example, for digital platforms, in relation to this workshop session, one of the children's key recommendations is to have a child-friendly and accessible reporting mechanism. Unpacking that, what do we mean by accessible and child-friendly? It should be in child-friendly terms, in a language that they understand. And in the Asian region, there is a variety of languages that are not, should I say, popular: languages with different scripts that are not widely used in the online world. So if they want to report self-generated sexualised content for takedown, can they easily find the report button? And if that report button asks them to choose what category their report falls under, are those categories easily understood by children?
And then, more on the online/offline relationship: children are highlighting the importance of having comprehensive sexuality education. Quality information on this can be accessed online, but it should also be taught in schools in order to prevent children from putting themselves in dangerous situations online. Children need to learn about their sexual and reproductive health and rights. What is happening to our bodies as we develop? How do we understand gender? And what is a healthy relationship?
And my third and last point is about children's need to have good and strong relationships offline, with our families, with our peers, and with other members of the community. To further illustrate this point, I will relay a sharing from one of our discussions with children. In 2019, together with other organizations, we held the Asian Children's Summit, and there was a sharing from the digital environment workshop group: children know some friends, fellow adolescents, who sign up for dating apps. They lie about their age when they sign up because they are searching for someone who will give them a sense of belonging that they could not find at home. And on this dating app, an adult asked them to share self-generated sexualised content. Unfortunately one story ended tragically, but the children wanted to point out how their relationship with their families, and having trusted adults and friends, impact children's engagement online. And it is also worth noting, especially for adolescents, that brain science has shown that adolescence is a period of transition and rapid development, as recognized in another general comment, General Comment No. 20, on the rights of the child during adolescence. And during that brain development, the emotional part of the brain matures earlier than the rational part, so adolescents' brains are wired to think emotionally first before thinking rationally. This should be taken into consideration when we talk about the wellbeing of children and youth, who see the advantages of the digital environment as a platform for them to express their ideas, connect with their peers, and exchange with people in the online world. I'll end with that and hand back to you.
>> Yeah, thank you so much, Hazel, for giving us an insight into the work you are doing in the Asia-Pacific region and the perceptions of children in this regard. Before we come to the question of what answers legislation provides, I would like to invite you to ask questions of the first two speakers if you have some. And I would also like to refer to my co-moderator online, Sophie Pohle, from the German Children's Fund. Sophie, do we already have questions in the chat?
>> SOPHIE POHLE: Hello, everybody. Hello, Jutta. Not so far. There are no questions at the moment. But, yeah. You are invited to put a comment or a question also in the chat tool, for example.
>> JUTTA CROLL: Okay. Thank you, Sophie. We have a question from the floor.
>> Hello. Firstly, thank you very much for your talk. My question is about what you think the responsibility of big tech companies is. When there is exploitation online, the exploiters leave traces, and all of the big tech companies have tools to access a lot of data and a lot of insights about who these people are and how it is happening. So what would your suggestion be on engaging those companies and holding them accountable in this matter? Thank you so much.
>> JUTTA CROLL: Yeah, thank you for this very relevant question. If you allow, I would postpone it a little bit so that we first hear what legislation is out there, what legislation says in this regard, and whether there are obligations on platform providers in various countries. And we are lucky to have Gioia Scappucci, who will present a report that reviews legislation in 43 countries and clusters the very best legislative approaches but also, maybe, pitfalls. Gioia, are you there? Then I would hand the floor over to you, please.
>> GIOIA SCAPPUCCI: Yes, I am here, and I am delighted to be with you all today. Hello, everyone, all around the globe. I'm honored to present the main findings of the Committee's latest monitoring report, which examined 43 states across Europe that are parties to the convention, a Council of Europe convention which is also open to non-European countries. If there are any questions about that, I can reply to them later on.
This report was adopted in March last year, and its main focus was trying to understand what is in place in these countries to tackle the challenges raised by child self-generated sexual images and videos in particular. As Sonia said previously, one must be very careful with the terminology and the identification of different kinds of self-generated child images of a sexual nature, and the Committee indeed made that distinction even before General Comment 25, which builds on that opinion of the Committee. In this report, the Committee draws states' attention to the fact that only 11 out of the 43 countries it examined specifically address child self-generated sexual images and videos, or sexual content produced by children, in their legislation, and, more importantly, that the legislation often does not distinguish between when the self-generation is coerced and when it is part of a private consensual relationship between children, or persons, of similar ages and maturity. So the Committee really asked for that to be fixed in the legislation, and we will see the recommendations later on. I also wanted to point out that as far as coercive behavior is concerned, when adults exploit the fact that they have a child self-generated image in their possession and use it to ask for more such images, for financial gain, or for other sexual purposes, only one of the 43 countries whose legislation we looked into has a clear self-standing sexual offense for this specific criminal behavior.
So the Committee has strongly encouraged the states parties to the convention, and in general any country wanting to combat this increasing phenomenon, to create a self-standing sexual offense, because this behavior is generally addressed in the legislation of the countries we examined through concurring offenses, which does not facilitate investigation and prosecution, putting the best interests of the child at the forefront, or intervention with support and assistance, et cetera. So these are two very important findings. There is a very low number of countries whose legislation contemplates behavior around child self-generated images and videos of a sexual nature, because most legislation still uses old terminology, which the Committee has also recommended moving away from: the terminology of child pornography. The Committee really insists that this terminology should be abandoned as soon as an opportunity arises to modify legislation, and the terminology of child sexual abuse material should be inserted instead. In that context, a distinction should be made: when children produce, possess, and share sexualised images among themselves for their own private use, this behavior does not amount to the production, possession, and transmission of child abuse material, because it is part of their sexual development. However, when it is part of exploitative behavior, for example in the context of grooming or pornographic performances involving children, then the children should be immediately referred to support and help, and the perpetrators should be prosecuted, as Sonia also said before me.
I would just like to conclude by saying, like Hazel, that in this monitoring process the Committee involved children, and the children we involved to share their experience and give their recommendations raised all the same points that Hazel has put forward; children participating in Europe had the same issues and main recommendations as those Hazel described. Most importantly, they are calling for sexuality education which is not focused on scaring them away from certain behaviors but on empowering them to understand how to better protect themselves while developing their sexuality, and also for apps, videos, or easily accessible tools online to report when they become victims of exploitation so that those images can be taken down. And I think other speakers will address the question that was raised concerning whether there are obligations in legislation as to the automatic detection and removal of abusive material. I can just hint at the fact that this is definitely a topic which is extremely high on the agenda in Europe and beyond. Currently, in the context of the European Union, there is a proposal for a regulation to require that this content be removed by the private sector, the industry behind the platforms where this material is to be found. But this will probably be dwelled upon later. It was not part of the analysis that the Committee carried out over the past years, which took place while this proposal was coming up, and therefore it remains to be seen how that is dealt with in the near future once the regulation is in force. For sure, this requirement is not yet embedded in legislation throughout the European countries, and therefore it is important that it be spelled out in the regulation which is currently being negotiated.
And the obligations of those involved are clear, and the necessary safeguards with respect to all the fundamental rights involved are included in the procedure and the requirements that will be put forward.
I think that for the moment I will stop here because I will then come back to you later on with main recommendations. Thank you very much for your attention.
>> JUTTA CROLL: Thank you very much. Yes, thank you, Gioia, for giving us that overview and also for the very useful work the Committee has been doing in this regard, because otherwise we would not know what the legal situation regarding self-generated sexualised content is in various countries. I have seen people in the meeting who have already raised their hand. So, Sophie, could you please tell us the names? The technical supporters need to know for whom they should open the microphone so that they can come in with their intervention. Later on we will go to the perspective of South African youth, but first let's stick a little bit more to the legislation question.
>> SOPHIE POHLE: Actually, I cannot see anybody's hand raised in my view.
>> JUTTA CROLL: I thought I had seen an insert in the Zoom meeting, but I know that Martin ‑‑ Martin should be there, who wanted to talk about the German legislation, which is currently under review in this regard. And then also Stella Anne, who is on our list of speakers; she wanted to step in here as well on the legislation in Asia. Is Martin in the room? Can anybody help me?
>> SOPHIE POHLE: I have seen him.
>> Hello. Can you hear me?
>> SOPHIE POHLE: Hi.
>> Yes. I had trouble unmuting. Can you hear me all right?
>> JUTTA CROLL: Yes, we can hear you very well. Hi, Martin.
>> Martin: Hello. My name is Martin. I work at the Safe Internet Center in Germany. Thank you very much for having me. And I'm here today, I was asked to give a little review of a meeting we had last week. It was organized by the German (? ) Jutta, you have to help me, children ‑‑ what's the English translation? Children's Charity Fund, right?
>> SOPHIE POHLE: German Children's Fund, it is.
>> Martin: Yeah. And we had a working group meeting on the topic of sexting and the exchange of self-generated content. The meeting consisted of two short lectures, by a psychologist and by a lawyer, giving their perspectives on the topic, and the two lectures were followed by a discussion. So what were the main takeaways from our meeting? For me, the main takeaway from the first input, by the psychologist, was that we should really be careful not to disproportionately focus on the problems and the negative sides of teenagers engaging in sexting. Yes, everybody agrees that there are risks you take when you engage in sexting, and people who encounter problems, such as nonconsensual sharing of their pictures, should of course receive proper help. However, the majority of people do not have such negative experiences. On the contrary, sexting can be seen as a regular and healthy part of the sexual development of teenagers. And especially in my line of work, which is child protection and raising awareness of all the problems and all the scary stuff you can encounter on the Internet, it's sometimes quite hard to take a step back and look at the bigger picture, and not only focus on the problems but also see the positive aspects.
The second input focused on the current situation in Germany when it comes to prosecution. It's quite a complicated matter, and I cannot discuss it in full detail here, but I will try to sum it up. Basically, since last year, possession of pictures depicting minors in sexual situations is a crime, and the minimum sentence for this is one year in prison. There are cases where you can engage in sexting as a minor and not be prosecuted for a crime. However, there are also many cases where teenagers and children are committing a crime when they engage in sexting.
When you look at it in detail, it's quite an absurd situation, and luckily, since the new law came into effect last year, it has slowly but surely dawned on people that this was a really bad idea. So now chances are quite good that we'll see a revision of said law in the near future. But at the moment, we have to deal with the situation as it is right now.
Then, in our discussion after the two inputs, we focused on the aspect of empowering children. How can we protect children from harm but also respect their right to a regular and healthy sexual development, which includes sexting for some of them? Also, one of my fellow members of the working group pointed out that sexting can play an important role for persons of the LGBTQ community. For example, if you cannot reveal your sexual identity due to fear of repression from your family or from your peers, going online and finding a partner there may be your only choice.
As I said, in Germany the emphasis when talking about sexting is mostly on the risks, and there was also a question that I was asked to pose here at the Internet Governance Forum, where we can get answers and ideas from all around the world: do you know about any resources or projects that talk about sexting in an empowering and noncondescending way to teenagers? We would be very pleased to hear your opinions, and I will make sure to share any ideas with my fellow members of the working group. Thank you very much.
>> JUTTA CROLL: Thank you, Martin, for this input from the, I would say, German local hub that had their meeting a week ago. And I do think this fits very well with the perspective that we now get from South African youth, Tebogo. I hope I pronounced that correctly. Are you there, and can you please give us an impression of the situation there?
>> TEBOGO KOPANE: Hello, everyone. Hi, Jutta. I hope you can hear me. Am I audible? Please let me know by nodding. There we go. Thank you very much for the opportunity. My name is Tebogo. I am representing the Technical Community, African Group. I also work with a nonprofit organization called Young Aspiring Thinkers, and we work with a lot of young people, teenagers, from grade 9 to grade 12, which is the last grade of high school. And we interact with a lot of youth; this year we interacted with about 2,000 learners. So this is an important aspect that we should also be looking into. We do address this, but I think we don't necessarily address the sexualised aspect, which is an important aspect of a learner, of a young person. We cannot address the individual without also addressing that aspect. It's a very important discussion that we are having.
So to give you the perspective of South Africa: increased access to the Internet has definitely propelled laws, such as the one that has brought us here as well as the law in South Africa, the Sexual Offences Act, to be reformed and looked over in order to address learners or young people as active agents in the space we're talking about. So in South Africa, a Law Reform Commission was created to address the gaps that exist because of increased access to the Internet, and these reforms were created for the Sexual Offences Act, which is the only act we have that addresses or governs sexual offenses in South Africa. This Commission formulated 11 recommendations in order to bridge the gaps that had been created. But in South Africa, as well as in the rest of the continent, I think it gets a bit more complex, and the reason I say this is because we have the issue of increased access to this infrastructure, compounded by a culture of silence. There are a lot of young people who are not speaking up. There are a lot of young people who are not having conversations with their elders. And this is a cultural issue, right? There aren't enough open conversations happening with caregivers. There aren't open conversations happening with trusted individuals such as teachers. It's not in our curriculums. It's not being implemented in those kinds of spaces that are supposed to be trusted spaces where this kind of engagement has to happen.
To give an example, there's a study that was done by UNICEF together with INTERPOL which found that about 200,000 learners between the ages of 12 and 17 in Tanzania alone have been sexually abused and coerced into creating sexual content online, whether by their own peers or by an older individual. That kind of example makes you think that there are conversations that are not happening inside the household, conversations that are educative, right? Because we live in a society that is very black and white. It's a carrot-and-stick society; it's not so much educative and reparative. So this legislation, for example, in South Africa, and the topic that has brought us all together, kind of forces us to do a lot of reform work, but also to move beyond wrong and right to create a space where open discussions are happening and where we ask questions such as: What happened? What made you take part in such an act? How do we then teach and capacitate these young learners as active agents and participants? How do we teach the learner to understand the effects, right, the effect of producing, the effect of coercing, the effect of participating in self-generated sexualised content, the magnitude of these effects? How do we formulate teaching and learning environments? What kind of individuals do we bring in to teach learners and capacitate them to understand the effect of what they are doing, as well as reparative measures? What do these reparative measures look like, and are they protecting the victims, or are they making the victims feel less than?
So it's a discussion that has to be had in depth. And locally, by locally, I mean in the families and also in a much broader spectrum, which is legislatively, but it's definitely a very important discussion globally.
>> JUTTA CROLL: Thank you, Tebogo. That was really interesting, especially about the cultural differences, with young people not talking to their parents or their teachers or even among themselves about the issue. I'm looking around the room, and I see someone who wants to add something. And then, for the technician, could you please open the microphone for Stella, Stella Anne, afterwards. But first you have the floor, and please introduce yourself.
>> Yeah. So I'm Jacqueline. I'm the CEO of R&W Media, facilitating large digital communities of young people, where we bring young people together and create a safe space to talk about sensitive topics. And based on the question of Martin, but also on what Tebogo talked about, I would just like to mention our online community. It is active in over six countries, with more than 5 million active followers on Facebook, but we are also big on Instagram, YouTube, and what have you, operating in the Arab world, in Arabic, but also operating in Kenya and partnering in South Africa. And what we know is that if you use the language young people use themselves and you use pleasure-positive language, young people are much more attracted to the content you're sharing. We talk about love, sex, and relationships. It gives the opportunity to attract more than one and a half times as many young people. And in the end, they stay longer on our social media content, and they also, how do you say it, end up on our sex ed pages. So in the end, by using fun content, you are still able to give them the right messages on how to protect themselves and how to make sure that they claim the rights they have in love, sex, and relationships.
And Love Matters, you can find it on RandWmedia.org. It's really an inspiring, yeah, community. Yeah. Thank you.
>> JUTTA CROLL: Thank you so much. You can give me the link to that site, and then we will put it into the report for the session.
>> I will.
>> JUTTA CROLL: That will be useful. And it links directly to the next part of the session, which asks what answers further national policies and transnational strategies provide. Before we go to that, Stella, the floor is yours.
>> STELLA ANNE TEOH MING HUI: Hello, everyone. I think there are issues with my video. I'll be speaking particularly from the perspective of ASEAN youth. So, just to cover current national frameworks across ASEAN, there are issues regarding the inconsistency of definitions. Like a previous speaker mentioned, there is use of terminology that may be a little outdated, child pornography, which does not cover the complexities of our current situation, and no clear mention of terms like self-generated. However, there is a little hope. We do have an ASEAN Regional Plan of Action, and it does cover self-generated sexualised content. But there is no concrete mention of it across ASEAN nations' frameworks.
As for the country where I'm from specifically, Malaysia, it has one of the highest Internet penetration rates in the region. But issues regarding legislation are quite fraught, because the age of consent is not clear. There are also issues regarding possession of child pornography: there is no clear definition of it or punishment for it as an offense. And across the region, only a few of the ASEAN member states, three out of the ten, are actively part of international alliances to combat issues like this.
So, from a legal perspective, we do see that there may be a lot of questions when a youth wants to look at what is available in terms of the law, like, does the law really protect our interests, et cetera? And thinking of youth perhaps beyond the child stage, I would also like to mention that in a Disrupting Harm report there was also mention of a culture where some people feel that for children or youths, minors who have already proceeded with this selfgenerated sexualised content, if it comes back around to become a threat to them, this is, you know, the fault of the victims themselves. So it results in a sort of stigma to share that you are involved in such an activity, and even when you are facing threats or extortion, blackmail, et cetera, you are also at risk of being ridiculed or harassed by your own peers, because they do feel that it was your own fault that you went that far in any context. Yeah. That's just what I wanted to share. And my question would be: do any of the previous speakers believe that the global south will ultimately always have, you know, slower progress in terms of closing the gap between, you know, properly defining things, and will we always be looking to, like, try and emulate frameworks from, like, perhaps the European countries or the North American continent? Yeah.
>> JUTTA CROLL: Yes, Stella, thank you again. I think I will get back to that question ‑‑ who is looking to whom in regard to legislation and also strategies ‑‑ at the end of the session, because it is very important to have a look at the strategies that we are following there. But for the moment, I would like to welcome Chloe Setter from the We Protect Global Alliance. And please, could you open the microphone for Chloe. She will present what the We Protect Global Alliance is and what strategies they are following.
>> CHLOE SETTER: Thank you very much. I'm just going to share my screen quickly. Can everyone see my screen?
>> JUTTA CROLL: Yes, we can see it. Wonderful.
>> CHLOE SETTER: Wonderful. Hello. I'm Chloe Setter, the head of the We Protect Global Alliance. I'm glad to be with you today. When we look at child sexual abuse online, obviously we know that children have more access to technology and are going online younger, plus there is an increase in the number of chat platforms and in the sort of complexity and knowledge that offenders now deploy. We see that this is a problem that is increasing rapidly, which is a big concern for all of us here today, I'm sure. But we, as an alliance, are trying to shift the narrative away from this being an inevitable consequence of connected technology towards it being a preventable problem.
In terms of child sexual abuse online, we do something called a Global Threat Assessment every two years, and this kind of breaks down critical trends related to abuse online. Our 2021 Global Threat Assessment found that child selfgenerated material comprised an increasing proportion of the content that was being detected. We also found that the COVID‑19 pandemic had created a sort of perfect storm for increasing this type of content, partly because children were spending more time online and also because there were reduced opportunities for offenders to commit in‑person abuse. We also identified commercial drivers ‑‑ the ability for young people to potentially make money out of sharing and creating content ‑‑ and, again linked to the pandemic, an increase in poverty in particular areas. That led us to be very concerned about a perfect storm of conditions increasing the more dangerous side of selfgenerated child sexual abuse material.
I wanted to give some statistics just by way of background. The Internet Watch Foundation, a member of We Protect, found in the first six months of 2022 almost 20,000 reports of selfgenerated material. Incidents involving 7 to 10‑year‑olds surged by two‑thirds in the first six months of this year. The largest age group in that data is 11 to 13‑year‑olds. The majority of victims are girls, but they are seeing a sharp rise in the number of boys. Another of our members, Thorn, does a regular survey in the U.S. It's the U.S. only, but they have found that two in three victims of sextortion ‑‑ where children are being forced to create this content ‑‑ are girls under 16. So evidence suggests, as we've heard, that sharing sexual images is not an uncommon practice for young people. However, we do also know that some children are more likely to be pressured into doing so and that some children are at risk of coercion or sextortion or nonconsensual sharing.
There are many cases where children are manipulated and coerced by adults into livestreaming this type of abuse, and then that creates more content in terms of videos and screenshots, which can be shared by offenders. I won't go into this too much because I feel like Sonia covered it in the first half. This is, I think, a good visual representation of how this issue can be summarized ‑‑ it is a very complex one, one that I sometimes struggle to explain to people, but I feel like this is a useful graphic in terms of the harm that can come out of this. Obviously, as we've said, it's not always a harmful activity for young people. But when we are talking about the harm, there are three different ways in which this can be manipulated.
First, the non‑sexual material ‑‑ for example, a child innocently playing on the beach, not wearing many clothes ‑‑ which can be repurposed and misused by offenders. Then there are the incidents where children are voluntarily, in an age‑appropriate way, sharing selfgenerated material with another adolescent, but that material is then shared against their will. And finally, the coercion or sextortion, where a child is groomed or tricked or forced into sharing that material, which can be done by a peer or an adult. So it's quite a complicated area when we say selfgenerated material. It encapsulates a wide range of different experiences of young people, and there's often no clear consensus on what is an appropriate response. We've heard from Gioia about the types of legislation. But in terms of the practitioner response ‑‑ what should teachers, parents, police and others do? We know that in many countries this is creating huge challenges because of a lack of clarity.
And so at We Protect we believe legal provisions should be in place to protect children from criminal liability in cases where it is appropriate, where the material is intended solely for their own private use and within the correct age categories. We are currently doing some participatory research ourselves, speaking to young people in three countries: Ghana, Thailand, and Ireland. That research asks young people about their views on what they call sexting or sharing nudes and what they think could be helpful in terms of solutions from governments and tech platforms. We'll be publishing it in the first half of 2023.
In terms of the response, I wanted to talk a little bit about the strategic response to this topic. So We Protect, for those of you who don't know, brings together more than 300 members from government, Civil Society, and the Private Sector to focus specifically on child sexual abuse online. We use six main themes to break down an effective system response to child sexual abuse online. Producing evidence. Supporting and protecting young people and using the voices of young people and survivors to advocate for change. Ensuring there's a blueprint for laws and for national strategies. Supporting tech members to adhere to a common and transparent approach to reporting and to implementing tools to prevent it happening in the first place. The criminal justice response, in terms of having a common standard for databases and common terminologies and reporting technologies. And also that broader societal, cultural response about how we as the public understand how children use technology and how children can ask for platforms to be safer. And underpinning all of that is international collaboration, in terms of making sure that all the relevant organizations are working together.
I'm on the final one, don't worry. So, to guide this response, we have a framework called the Model National Response, which is a blueprint to help governments with their response, and also a Global Strategic Response, which is about looking at that international nature and how we can coordinate across governments. But these are the five core recommendations that we put forward for governments in terms of tackling this issue. And I'm aware I'm slightly over time, so I will finish there, and hopefully I'll be able to answer any questions that anyone has.
>> JUTTA CROLL: Thank you, Chloe, for this presentation and for giving us an insight into the really important work that the We Protect Global Alliance is doing around the world. We have on the list now the question: how can young people themselves help design approaches? And I would like, again, to look around the room ‑‑ if anybody in the room would like to take the floor and tell us what approaches you think would be appropriate to address the issue. So no one ‑‑ you would like? A bit reluctant. Please take the floor and introduce yourself.
>> First of all, I would like to thank you for this interesting topic and session. I had a couple of questions, but I've got the response. But I'd like to answer you about how we can get involved, or involve people more and more on this topic, to protect children in general. I think the most important thing to be done is to involve more and more associations and organizations ‑‑ sport teams, NGOs that are in charge of cultural ‑‑ and so on. This is, I think, one of the important ideas to be taken into account.
>> JUTTA CROLL: Mm‑hmm. Thank you. Thank you so much. Anyone else in the room who has something to report about the process? Yes, please, take the floor and introduce yourself.
>> Hello. Hello, everyone. And also thank you for giving me the floor to provide some opinion on that. Sorry, I forgot to introduce myself. I'm Keo from Myanmar. I think we have to think about this in a different way, because very recently I found the community domain for children, for creating a safe space for them. That would be one of the solutions for protecting the children from harming themselves, from the content moderation side as well. (?) Organization, and I recently found an idea to protect the children and also to create a safe space for the children from the technical aspects as well. Also, I think these kinds of initiatives should be more supported, because if we can create a domain for the children and create the space for the children, that has an effect on the community as well. That would be another idea of how we can address and protect our children and children's rights as well. Thank you.
>> JUTTA CROLL: Thank you for that input. And I do think we will have another session also where we will talk about the dot kids domain and how we can make that ‑‑ sorry, the generic top‑level domain a safe space for children as well, but that is within another session.
Hazel and Tebogo, do you have anything to add from the youth perspective?
>> HAZEL BITANA: Yes. I'll echo the first point of Keo. What is important is that children are given a space in decision‑making processes, and children from different backgrounds should be involved, because children are not (?). It is crucial that children from different situations are given a space and that their involvement is meaningful. I heard recently from another speaker, at another engagement, that digital service providers tend to design approaches for a perfect family, with present and supportive adults who can readily provide support to children. But that is not always the case. We should take into consideration the nuances. In Asia, for example, we have a number of children left behind by migrating parents. That's one of the characteristics of the region. Many parents are migrating, and children are left in the care of grandparents or other adults, you know, other relatives.
So we should take into consideration those nuances.
And I also touched on this earlier, but I wanted to add another detail. What service providers can do is provide accessible, child‑friendly and high‑quality information to children. Children go online to find answers to questions, including about health or relationships, that are difficult to ask offline, because these are issues that are often considered taboo in their culture, or questions that adults perceive children should not be asking. So they are afraid to ask their parents or teachers about these things, and they go online for information instead. So we need to ensure that children have access to high‑quality information online. We should address online misinformation and disinformation, including on sexual and reproductive health and rights, and sexuality and relationships. And we should also develop children's digital literacy, which includes critical thinking.
>> JUTTA CROLL: Thank you. Thank you, Hazel.
>> Can I just ‑‑
>> JUTTA CROLL: Please, take the floor.
>> TEBOGO KOPANE: I think that it is definitely a collaborative effort, and it is not a one‑size‑fits‑all solution or approach that we need to develop. It has to be context‑specific. So what happens in South Africa will definitely not be the most feasible or most implementable thing in Zambia, and those are neighboring countries, as an example, because there are differences in the contexts in which we operate. So it definitely has to be a collaborative effort. And like I said initially, it has to be an environment of open and honest conversation with the different stakeholders that form part of the life of the minor. And in all honesty, because this conversation is not being had in the public domain, a lot of young kids from the age of 10 actually do go to, like, porn sites or sexual content sites in order to get information. So it just goes to show that the immediate caregivers or the immediate trusted individuals or entities are not the most reliable, and it could also be a cultural thing, right? So, adding individuals or experts such as sexologists in the life orientation class. Life orientation is a subject in South Africa wherein it teaches learners about life ‑‑ anything relating to life ‑‑ but it doesn't touch on sex or on the sexual being of the learner. So that becomes a huge problem, because you ask yourself: that means it doesn't address the learner, the young person, as a whole. So where do they go? They go to these online sites, because those are the easiest to access. And then it begs the question of what the parents are saying when a learner or a child accesses these sites.
Parents don't even know or don't even have a clue what their kids are accessing. So what does control look like, in the healthiest way possible ‑‑ controlling the access that they have online?
And what are big‑tech companies saying ‑‑ and that's a question that was asked on the floor as well ‑‑ what are big‑tech companies contributing to this discussion? It can't just be a thing of: are you 18 or younger? Surely I'm going to say yes, I'm 18, while I'm 10, just so that I can get the answers that I want, right? So it has to be an active and continuous engagement of all the stakeholders involved in order to address this (?) And we can discuss it.
>> JUTTA CROLL: Okay, thank you. I think we are running out of time, so thank you so much for giving us, again, your input to the session. We will now turn again to Sonia Livingstone and then Gioia Scappucci from the Council of Europe, because we are here at the Internet Governance Forum, and I'm really convinced that all these questions are related to Internet governance and that we need to look at what Internet governance can do to support a common approach, respecting the different political systems and cultural backgrounds that we have already heard about. And Sonia, and also Gioia, if you could please also help answer the first question from the room: how can we hold platform providers accountable, and could Internet governance give some input into that debate ‑‑ how accountable they are, what responsibility the platform providers could take, and what precautionary measures they could undertake so as not to have these situations where young people are criminalized in a certain way? Sonia, the floor is yours now again.
>> SONIA LIVINGSTONE: Thank you so much. This has been a really fascinating debate. I just wanted to read the very short paragraph from General Comment 25 specifically on the question of selfgenerated sexual material, which says selfgenerated sexual material by children that they possess and/or share with their consent and solely for their own private use should not be criminalized, and child‑friendly channels should be created to allow children to safely seek advice and assistance where it relates to selfgenerated sexually explicit content.
So I want to ‑‑ I mean, the rest of the general comment has many answers to the question that Jutta Croll raised, about both the options available to the state and the recommendations to business. But I really just wanted to emphasize that the implication of paragraph 118 of the general comment is that selfgenerated content really only relates to that which is done with consent and is appropriate to the young person's evolving capacity. And I do think it might help this conversation if we didn't use the word "selfgenerated" at all in relation to situations of coercion and/or exploitation. The general comment refers to those as technologically facilitated forms of exploitation and abuse. And I think that notion of technological facilitation captures the way in which an abuser is remote from the physical situation, without the implication ‑‑ which it will always be possible to misinterpret ‑‑ that an abusive situation is in some sense down to the child themselves. So, listening to the two conversations we have been having here: how do we respect the sexual rights and sexual expression of emerging adults, as the psychologists would say, and respect what young people themselves want in that regard, and how do we keep that really very different and distinct from exploitative situations? Bearing in mind, as has been said before, that all images can be abused by being shared away from their original use and original context.
So we are talking about some very different kinds of situations. General Comment 25, I think, you know, lays out a whole host of things that both states and businesses should do. Both should begin all policy processes with a Child Rights Impact Assessment. That would bring together child consultation, a holistic approach to child rights, and a recognition of the different issues that have to be addressed, and put them at the forefront of the agenda before a policy is made or before a new product is released and takes effect in the world. So Child Rights Impact Assessments capture many things. If we did that, I think many things would be greatly improved.
But clearly there is a very important relationship to be clarified in legislation about when law enforcement gets involved. And I take some hope from the Council of Europe's charting of this: it's now clear what should be done, and the challenge in a way is one of implementation. States need to transform their language, be much more careful in the distinctions they make about what is with consent and what is under coercion, and then give much clearer guidance to law enforcement on what to criminalize and what not. And also, of course, Civil Society and the state have a very important role to play in providing help, in opportunities for those open and honest conversations that we have been discussing today, and in ensuring that children do have access to support, whether they need it because, as it were, a consensual situation has gone wrong or because they find themselves in a situation of coercion and exploitation.
For business, this remains an open conversation, I think. What General Comment 25 calls for is that this should be a very active, multistakeholder dialogue. Clearly business does have unique forms of knowledge. It and only it knows when one actually or potentially abusive adult is contacting multiple children across multiple circumstances. That's the kind of unique data that we need them to harness, with transparency and in collaboration. But I don't think anyone wants business to be kind of the last port of call for safeguarding children. I think that we do want that responsibility to stay with the state and with the oversight of Civil Society. Thank you.
>> JUTTA CROLL: Thank you so much, Sonia. We have, again, a question from the floor. So is it addressed to Sonia?
>> Yes, please. Hi, Sonia. Thank you very much for your insights. My question is actually about the idea of consent. What we believe as people who work in children's rights and human rights is that it is also problematic to gain consent from children; to generate consent, that's also abuse. So how do you define consent, considering that there are age differences and that across countries there is no one homogeneous description of sexual abuse, right? When you go to Egypt, until a certain time domestic abuse was not criminalized, whereas when you go to Sweden, you see very different ideas of what consent is. Even in a consensual relationship among adults, you can be convicted of rape if you are not informing your partner about your protection. So we see there are a lot of differences among legislations. That's the number one part: how do you suggest we address that interface? And secondly ‑‑ I'm really curious about your opinion ‑‑ there is the issue of privacy. In Turkey, we had a project which was giving quite live information about every single second that child sexual abuse imagery, child exploitation imagery, was downloaded. However, the site had to be taken down, because the government and the European Union said: it's great you're finding those, but we can't act on them because our privacy laws do not allow us to go after those people. This might be a bit detailed, but I'm really interested in your idea of how you define consent when you're trying to act across different countries, and how privacy laws would be applied to those. So thank you.
>> JUTTA CROLL: Thank you for that question. Sonia, would you like to take it? Would you be able to take it?
>> SONIA LIVINGSTONE: Well, I'm happy to offer some thoughts. I think one of the really difficult issues here ‑‑ and I think Gioia and others will want to comment ‑‑ is that the situations you described are in a way just overly simplified. Of course, every country ‑‑ most countries ‑‑ already has a law on the question of consent and sexual consent, and it's embedded in many ways in the law, in ways that are understood, but always understood according to a child's evolving capacity and context.
So the problem arises in a way in the digital world, in relation to Internet governance, either because, as a number of us sitting in Civil Society, we don't have access to the knowledge and the nuance of the particular platforms or online circumstances in which interactions occur, or because we haven't yet developed sufficiently nuanced responses to them. And so we call for platforms to ban or restrict or allow in a very kind of crude way. So in a way we're at an early stage of the debate; we need to speed it up, because children's wellbeing is at stake. But I go back to the notion of sexual consent. In both the cases you described ‑‑ and you described it as abuse in Egypt and a more nuanced notion of consent in Sweden ‑‑ I think there is actually less ambiguity about when children can give consent under those circumstances. But it will require a nuanced decision that I think we do not want platforms to be making.
And I would say the same on the question of privacy: in a way, that was also an overly simplified account. Clearly many countries have privacy and data protection regulation that does enable the identification of actual or potential abusers. But the particular case described seemed to me to be going about it the wrong way, and actually we already have many practices that do identify abusers with more subtlety and in accordance with data protection regulation. So I think we need to be very careful not to look for simple, overarching solutions to fantastically complex problems; we do need a more nuanced language to discuss the technological spaces and interactions that are at the heart of facilitating some of these abuses.
>> JUTTA CROLL: Thank you, Sonia. And I think it fits in perfectly that we hand over now to Gioia, because Gioia, you have already referred in your opening statement to the CSAM regulation that is on the table from the European Commission, and that is, of course, dealing with the question whether privacy could be given priority over child protection or the other way around, or whether we find a balanced approach to both. So, Gioia, you have, again, five minutes.
>> GIOIA SCAPPUCCI: Yes. Thank you very much. I will be brief. First of all, I would like to say that it's not a question of which rights have precedence over others. It's a question of finding the right balance and respecting all of the rights. And in this specific, very complex situation, that is also possible; it is not impossible at all, because there are safeguards and guarantees that can be put in place to ensure that images which are circulating against children's will ‑‑ perpetuating the fact that they have undergone sexual abuse and retraumatizing them by their simple existence, by the knowledge that they are online ‑‑ are taken down. And that can be counterbalanced with the fact that when we are not in a situation of abuse, there are obviously data protection and privacy rights that need to be safeguarded and respected, and when there is no fundamental reason to interfere with them, that should not be done. There are different kinds of technologies that can be used, and the technologies which are less intrusive and more respectful of privacy are those which are being considered. The interplay of many factors has to be taken into account, and that is the whole difficulty of the negotiation which is at stake. But the final aim is that of ensuring that children who have been subject to abuse are supported and helped, also by knowing that those images are no longer there. This is something that law enforcement, and states, need to factor into the policies and legislation that they are putting in place in this area. And it is definitely one of those areas where no single actor can find the right solution alone.
All players are important, and all types of cooperation at all levels are crucial, and that is why it is extremely important that the private sector is also involved in the consultations and understands and contributes its point of view on what is feasible and not feasible, so that the right solution can be found. So this is with respect to the issue of balancing rights in a way which does right by the persons who are the rights holders. That's very important.
But I wanted also to reply to the question concerning Internet governance ‑‑ really, how can the private sector help and empower children? As the representatives in today's discussion who were speaking more from the direct experience of children and youth said very clearly, it is important to multiply the possibilities for children to find information which is explained in a way that is accessible for them, reflecting their ability to understand very complex phenomena. So it has to be accessible for children, appropriate to their age, and very well framed ‑‑ but also information about spaces where they can speak up, report, and not have the fear of addressing the taboos. This is what children are telling us as well. And there is definitely a need for putting more out there online to explain and amplify all of this, and to share the good practices that exist in certain countries. One of the speakers asked for examples of practices in empowering children and understanding sexting properly, in a safe way. The committee definitely collects these promising practices, as does the We Protect Global Alliance; and within the context of the We Protect Global Alliance, the Tech Coalition has made it clear that it is engaging with respect to certain principles and safeguards. So it's really important to share this information worldwide and to make it accessible to the children themselves. The private sector can definitely help in making this more correct information available, and to do that it needs to be in dialogue with Civil Society and with the state actors working in this area. Meetings like this help to raise more awareness, and I will send links as well ‑‑ I cannot do this on the iPad of my daughter, which is the only device with which I have managed to connect to the meeting. I tried with my computer at work, but it doesn't work.
So I will send you tools which already exist and which can be adapted into other languages, because what was said by Hazel is very important: a lot of these tools exist in English in particular, but we need to make them more known and available in an infinite number of other languages, because not all children understand and read English. Not all children actually read, and that is why we also need material which is interactive and more child‑friendly in this area. So, looking at the watch, I will stop here. And I'm happy to answer any other questions there might be. It's very important, I think, to avoid misunderstandings on terminology and to adhere to the international benchmarks that exist.
>> JUTTA CROLL: We need to save five minutes to wrap up.
>> GIOIA SCAPPUCCI: Yes, exactly.
>> JUTTA CROLL: I want to turn to the online moderator, Sophie. Do we have any questions in the chat or someone who has raised their hand?
>> SOPHIE POHLE: Yes, we do. We had a question in the chat which relates to what Gioia just said, that we need safe, child‑appropriate and empowering spaces. Samira asked how the issue of verifying a child's age can be handled, given that some children give a false age when signing up on certain platforms. We are very happy that Chloe Setter already answered in the chat and gave a link with an overview to answer this question of how this can be handled. Thanks for that. And then we also have a raised hand from Lea Peters. I am about to unmute you now.
>> Lea Peters: Thank you very much.
>> JUTTA CROLL: So Lea, the floor is yours.
>> Yes, thank you. I'm Lea Peters. I'm from Germany. My video doesn't work right now, apparently, so it's audio only, but it's fine. I have a question because we were talking a lot about how to best involve the business sector, but I was wondering if there are also discussions about how to involve the financial sector, because especially with online abuse via livestreaming, for example, or also when we talked about children seeing this as a possibility to actually make money from sending pictures, we know that the financial sector is a big part that could be involved. For example, Western Union told us that it is really hard for them to report cases because, I mean, there are indicators of which money transfers are possibly related to sexual exploitation or sexual abuse, but this still falls under money laundering when they report it, and compared to other money laundering it's not a lot of money; with livestreaming, I think it's mostly around 15 to 30 Euros. So it's not that much. But there are indicators of why this could be a case of sexual exploitation, for example via livestreaming. So I was wondering if, on a global level, there are also discussions on how to make this accessible for more countries and for more national legislations, to also involve the financial sector and get their help in prosecuting crimes.
>> JUTTA CROLL: Thank you, Lea, for the question. I don't think we have anyone from the financial sector here in the room. So I would like to refer that question to a further session. We know that the financial sector is really present at this year's Internet Governance Forum, and we will try to get in contact with them and have an exchange on how far we could go with such cooperation.
Now it's my task to use the final two minutes to wrap up this session. First of all, I would like to thank all of you, those of you who are here in the room, those of you who have been participating online, and especially a great thanks to all the speakers in this session. We are generally asked to have a gender‑balanced approach in our IGF sessions, and I was wondering: we had only female speakers on this issue. But when I looked around the room, I saw that many men in the room were interested in the topic, and so I would like to encourage you to stay in contact with us so that perhaps next year we can have a session which is a bit more gender balanced.
We had the question, who is looking to learn from whom, and what can we learn from each other during our debate? And I do think we have lots of ideas and input on how we can go further with that. And I don't think that the global south should always look to the other areas of the world; we need to look to the global south and then better understand that not all approaches fit in all countries. General Comment No. 25 gives us a really good framework because it addresses all children's rights in regard to the digital environment. But nonetheless, like the U.N. Convention, General Comments address the states, and the states then need to translate their obligations into national law, paying attention to cultural differences and to what they have on the table. But still we can rely on General Comment No. 25 to give us a framework within which we can discuss all these questions. We will also continue the discussion of children's rights in the digital environment on Thursday morning with the Dynamic Coalition session at 9:30, so it's really early, even earlier for people from Europe, in the Banquet Room. And there we will discuss how we can translate data and laws into a child rights‑based approach, and you are very welcome to join that session as well. For today, I think this session is closed. Thank you so much for your attention, and have a nice evening. Bye‑bye.