IGF 2022 Day 2 WS #318 Gen-Z in Cyberspace: Are We Safe Online?

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.



>> LEVY SYANSEKE: Good afternoon, everyone, I hope we are good and have had a good day and lunch.  We come to this session specifically looking at Gen‑Z in terms of the cyberspace and safety online.  To a great extent we might be biased toward online safety for children, mainly because this is the coming generation, and they are getting into technology faster than we did when we started getting into it.  I'm sure almost all of us can attest it's a period where cybercrime is on the rise, and as the number of children exposed to different online information and technology grows, they are more susceptible to cybercrime or other dangers online.  The need to have them protected becomes critical, even as we discuss how the internet should be governed.  Some have taken the extreme of restricting how children access the internet; some have gone the extra mile and restricted internet access for children altogether.  But we cannot avoid the fact that even children need to have access to the internet.

I'm sure almost all of us can agree.  Given that we are looking at children having access amid the growth of technology, and that this is critical to the development of every society, it becomes important that we actually consider the safety of children online.  And so at the beginning, before I introduce our speakers, I just want to get a few thoughts from everyone in the room.  What's your brief take on what online safety would look like from your perspective?  Let me have two or three people share their thoughts on that before I introduce the speakers.  Anyone?

What would online safety for children, in this case, mean to you?


Maybe even from those that are participating online, yeah?

>> AUDIENCE MEMBER: I don't know if I can participate in this topic.  This is a very important thing; the most important thing, I think, is the restriction of sexual content for children, which is the thing that affects them the most.

>> LEVY SYANSEKE: Thank you.  Anyone else, what are your thoughts, what would online child safety look like for you, in your view?  Sure, you can, right.

>> AUDIENCE MEMBER: Can you hear me?  So I agree, I think there are certain topics that worry us as adults in particular, but one thing that I think about a great deal is that as adults, both as parents and as people that work in this space, we tend to make a differentiation between online and offline, and I think for kids that is a somewhat superficial bifurcation.

For them, there is not necessarily this online or offline, for them it is life.  They are increasingly living their lives in digital spaces all the time.

I think in many ways we need to think of online in much the same way we think about offline, and try not to create such a differentiation between online and offline.

>> LEVY SYANSEKE: Thank you so much.  Let me have one ‑‑ let me have one last person and move on.

>> AUDIENCE MEMBER: I wish I could have an online world where he will not be able to find any game which is engaged in fighting.  Less advertising for more games; most probably I am thinking it is making my child disintegrated.

>> LEVY SYANSEKE: That's quite interesting.

So there are literally a lot of different views about what online safety could look like, especially for the development of children.  From the Internet Society Zambia chapter, one of the things I'm biased toward is internet for education, specifically access and connectivity for children, because of the benefit that the internet brought for me when I was growing up.  Today we are having a session on Generation Z in the cyberspace.  Allow me to introduce our speakers for today.  We have three, if not two, online, and we have three right in this room.  So let me start by asking them to introduce themselves.  I'll start with Jennifer; you could introduce yourself.

>> JENNIFER CHUNG: Hello, everyone, my name is Jennifer Chung.  You might have seen me around the venue this morning, and a few days ago as well, but for this particular session I'm putting on the hat of the Secretary General of the Dotkids Foundation, and it is the only child‑centric internet top level domain.  You'll hear more about what Dotkids is about in a little bit.

>> LEVY SYANSEKE: Thank you.  While we are doing the introductions, I've been asked by the technical team that maybe the people seated at the far end on my left could move to this side, so the room doesn't look empty and for easier framing of the footage.  I don't know if we could move.  You could probably move to this side.

All right, thank you.  Let me ask the next person, Aris, to introduce yourself.

>> ARIS IGNACIO: Thank you.  Hello.  Hello.  Okay.  Am I being heard?  So hello and good afternoon to each and every one.  I'm Aris, I'm an academician at heart, and I'm connected with a university in the Philippines.  More often than not, I concentrate on ICT; I'm a technical person by nature, and I teach students at the undergraduate level and also the graduate level.  I involve myself with some internet governance issues related to education and also to technology as well.

>> LEVY SYANSEKE: That's an awesome one.  Let me allow the two speakers online to introduce themselves, then allow one more person in the room to introduce herself.  I don't know if Pavel is in the meeting.  You can introduce yourself.

>> PAVEL FARHAN: Hi, good afternoon to everyone, can you hear me.

>> LEVY SYANSEKE: We can hear you.

>> PAVEL FARHAN: Perfect.  So good afternoon, good evening to everyone, I'm Pavel, I'm actually based in Thailand.  So unfortunately, couldn't make it to the IGF this year, but fingers crossed to make it there next year.

What am I doing?  Right now, I'm working as a program officer and as the AP star secretariat at the internet education and research lab in the university in which I'm working, and I also graduated from this university, based in Bangkok.

I have a passion for internet governance issues, and I like to advocate for issues very close to my heart.  Of course, this online child safety being one of these issues that I have been like advocating for quite some time.  It would be interesting to hear everyone's views on this as well.  That's all for me.

>> LEVY SYANSEKE: Thank you, Pavel.  Next, let me have Andre introduce himself.  Andre.

>>  ANDRE:  Thank you for the chance to speak.  I represent the registry, and I'm a regulation specialist, and I think that the protection of children on the internet is a very important thing, because children have a fragile psyche.  But it is wrong to simply prohibit children from the internet; we need to teach them to use it safely.  Thank you.

>> LEVY SYANSEKE: Thank you, Andre.  Lastly, among the speakers we have, just a small recognition: Mauricia couldn't join.  The last is Arivek.

>>  ARIVEK:  Good morning to everyone here.  I'm Arivek.  I'm a researcher, head of the youth educational and scientific project, the international information security school.  You may have heard about it; we have the United Nations working group, and I'm also an ambassador for 2022.  Thank you.

>> LEVY SYANSEKE: So I'm informed that Mauricia is online.  Please introduce yourself.

>> MAURICIA ABDOL TSHILUNDA: Good afternoon, good evening, everyone, wherever you are in the world.  Very simply, my hat for today is the CEO of communications, and I have over ten years' experience in youth development, advocating over the years for the safety of children online and making sure that their voices are heard as well.  I have a background in psychology, gender studies, as well as communication, so I come from all of these various backgrounds and hats when it comes to addressing youth development and children's online safety.  Thank you so much.

>> LEVY SYANSEKE: All right, thank you so much, Mauricia.  Let's get into the discussion.

We talk about online safety, and probably in this very room there are a lot of definitions based on what we've come across or the dangers we've been exposed to, and so on and so forth.  Let me get perspectives to bring this discussion together from the speakers.  I'll ask probably almost all of you to just give your perspective on what child online safety is.  Let me start with Arivek.

>>  ARIVEK:  It is a complex policy area.  When we speak about child protection online, we should be sure to mention that it shouldn't be just the responsibility of their parents.  It should also be the zone of responsibility of government, civil society, and the scientific community.  There is not just one side that should be responsible; all of society is responsible, and we should take more steps to make the internet safer for children.

>> LEVY SYANSEKE: All right, thank you.

Let me move to Jennifer, what's your perspective on child online safety.

>> JENNIFER CHUNG: Sure, thank you.  Maybe I'll speak a little bit from the standpoint of, I guess, the domain name space.  As I heard from participants and from my esteemed panelists, child online safety is a very complex thing that requires a lot of different stakeholders and requires education, and, speaking now from the tech community side, platforms have a lot of responsibilities and children are online a lot.  (We have a little bit of a tech issue in the room; we have the lights back on.)  So it's very imperative for us to understand, to not just limit certain types of content, because clearly there is questionable content, extreme violence, pornography, things that should not be available to children under a certain age, of course.

At the other end of things, it is very important to realize that children need to be able to express themselves, to create content that is appropriate for them, to learn, and to take the benefits of what the internet can bring to their education and development.  So from the tech community standpoint, there is kind of a double‑edged sword here.

We need to give them space to develop and learn in a safe environment, and we also need to be very responsive to any abuse, any questionable materials.  This takes a lot of coordination with law enforcement, with trusted partners and trusted notifiers, and with child rights organizations, and specific ones like the Internet Watch Foundation, who are experts in, you know, taking down child sexual abuse materials.

It does require quite a complex stance, but I think we are all here in this room and also online as well to try to make this happen and try to make this work.

>> LEVY SYANSEKE: Let me shift a bit online before we get back to the room.  Andre and Pavel and Mauricia.  What's your perspective on child online safety.  How do you look at it?  Maybe Mauricia can start.

>> MAURICIA ABDOL TSHILUNDA: I love starting with a very simplified definition and making sure that we are all on the same page in the room.  I know we have varied experiences and backgrounds represented in terms of our audience today.

For everyone in the room, child online safety can be understood as the frameworks, the policies, the regulations that are focused on making sure online engagement for children is done safely and securely, and that our children and our youth are in no way exploited, taken advantage of, or end up losing their lives because of being active in the ecosystem and online.  We all know that the internet, with mobile phones and other electronic media formats, has provided our children and young people with levels of access to information, culture, communication and entertainment that 20 years ago we would never have imagined would be possible.

So now we are faced with a myriad of challenges, as Jennifer also beautifully outlined for you, and we are grappling with this.  So essentially, when we look at the definition of online safety for children, it's really just looking at the various frameworks, policies and regulations that we are discussing, putting in place, formulating and continuously updating to make sure that our young people and children are not exploited when they do engage in the internet space.

>> LEVY SYANSEKE: Thank you.  Pavel, your take.

>> PAVEL FARHAN: Thank you, Levy, and thank you to all my esteemed panelists as well for the different definitions and the different backgrounds they're coming from.  I want to come at it from the angle of risks and responsibilities.

So when we say child online safety, we can say it's about the risks to children and young people of having abusive images of them shared on the internet, or of being groomed or lured into sexual conversations or exploitation by adult offenders.

And we also talk about cyberbullying and sexual harassment online.  These all come under the umbrella when we talk about child online safety.

How do we tackle these crimes against children?  Because, you could say, a single angle is ineffective.

I would say that to systematically protect children, it is up to legislators, regulators, law enforcement organizations, social service providers, teachers, educators, parents and of course the private sector.  We have to work together because, single‑handedly, no one will make it work.  Yes, it's true that in the modern age there are a lot of kids who are capable when it comes to recognizing the risks the internet presents to them.  I would like to give them that credit: yes, kids today are quite smart, but at the end of the day, they are kids.  Despite their agency, we shouldn't overestimate children's capacity for self‑defense.

They are children, after all.  Adults are ultimately responsible for creating a system that guarantees children's fair and equal access to the internet while they are safe at the same time.

I have had this conversation with some older generations, and, you know, what they always say is, kids today shouldn't be having smartphones, you know, it's the parents' fault for spoiling their kids, et cetera, et cetera, or, you know, something even like, in my time, we grew up fine without technology.  And I'm like, well, spoiler alert, technology is here now, and it's here to stay, so we've got to adapt to it.

Of course, we have to also understand that according to the Convention on the Rights of the Child, which is by the U.N., all children have fundamental rights to access to information, participation, leisure and play, and it is our collective responsibility to guarantee that those rights of children, whether in digital or physical environments, are protected.

That's from me.

>> LEVY SYANSEKE: All right, thank you.  Let me quickly get the thoughts of what online child safety means from Aris, and then move to the next part.  Aris.

>> ARIS IGNACIO: Thank you, Levy.  It's tough to give a definition if you're the last one to give it.  But it's okay, I'd love to go last.

But it's really tough; it's really a complex idea.  As you can see, everyone has mentioned all the aspects and all the things that need to be included.  I'm going to look at it from an academic perspective and in a simpler way, wherein when we talk about online child safety, we talk more about providing awareness to people, awareness that they need in order to move forward, because without awareness they would not know how to do things: how we are going to craft policies, how we are going to use tech, for example, to protect them, and so on and so forth.

We all know that if we are going to provide this to our children, we also need to know what age range we are looking at, because we are not only looking at kids here, but also at young people of adolescent age, because we cannot guarantee their safety either as of this moment.  In some places, people have the perception that adolescents of the proper age can already think straight, but in other parts of the world, especially parts that don't really have advanced technology to deal with, at a certain point, you know, they're kind of lost.  They cling to technology, and that is what we need to look into.

>> LEVY SYANSEKE: Thank you so much.  Having heard the different perspectives on what child online safety looks like, it seems there's not a single approach to ensuring safety.  As a result, it brings in different stakeholders and different roles for different stakeholders in the tech space.  My next question is: are there any current frameworks that are actually ensuring child safety?  Let me start with you, Jennifer, again, before I get to Aris.

>> JENNIFER CHUNG: Okay.  Thank you, Levy.  Bringing it back to what I know, there are national frameworks, but, since we are at the U.N. IGF, we look at the United Nations Convention on the Rights of the Child.  I think that is the instrument from which a lot of national jurisdictions take that kind of framework and codify it into regulation and law.  Speaking really from the perspective of Dotkids, being a top level domain, we are not a platform or a content provider.  It is an internet top level domain where people can have a domain name and create content.  The most important part is when we are thinking about policies and frameworks in a space created specifically for children, and when I'm talking about children, per the United Nations Convention on the Rights of the Child, that means any person under the age of 18.

So as Aris has mentioned, there is a sliding scale, both in terms of maturity through the actual physical age of the child or young person, and also a cultural aspect: in Asia, in Africa, in Europe and the Americas, it is quite different what is appropriate or what children or young people are exposed to.

For Dotkids in particular, we have created a set of guiding principles, based on the Convention on the Rights of the Child, to create and foster an online space with kids' best interests at heart.

What does that actually mean?  It means there are certain types of content that should not be in the space: content such as child sexual abuse images, extreme violence, and things like alcohol and tobacco, which are legal for adults above the age of majority but not legal for children under a certain age, depending on whatever jurisdiction you're looking at.

Things like drugs and child substance abuse, child labor, trafficking and child soldiers also should not be there.  It is quite interesting for us to look at it from this standpoint.  When you're looking at other parts of the internet name space, such as dot com, dot org or dot Asia, there's no such restriction on this particular type of content; they are not in the business of trying to restrict content.  That is more on the side of platform providers like Facebook and Meta and others.

But if you are looking to create a safe space, we need to make sure there are guardrails in place: not only do you create the policies that serve as guardrails, you also create a way for people to report abuse.  That, I think, is very important.

Having the policies set in place, having guardrails, having a channel through which these abuses can be reported, and also periodic review of such frameworks together with child rights organizations.  But there is also the agency of children themselves; they are quite sophisticated in certain ways to know which types of content would be desirable or educational to them, and they also need the agency to have a say in what they would like to see and what they would like to create.  There are a lot of kids and young people who are content creators now, on different platforms and on their own websites.  So balancing that in the framework is quite essential, I think.

>> LEVY SYANSEKE: Let me take the view from the academic perspective, Aris.  What are some of the best practices, with regard to your region, for making sure there's online child safety?

>> ARIS IGNACIO: Being in the academic sector, there are a lot of frameworks out there.  I would like to emphasize and concentrate more on digital competencies, because digital competencies can lead to providing more security, and maybe students or young ones would know how to protect themselves.  That is one thing we need: for them to know how to protect themselves.  That is really the first thing.

We must first think about how we are going to obtain this.  There are a lot of frameworks out there with regard to digital competency; one was crafted by the European Union, the DigComp framework, but there are other digital competency frameworks as well.  The International Society for Technology in Education's standards are another, which concentrate more on the younger ones, the K‑12 levels, where everything has been crafted for each and every level, from beginners, learners who are just beginning to learn, up until the point where they are able to know what is right and what is wrong, based on the standards created.

That encompasses everything; it would develop not only the skills but also the awareness of how to navigate technology, how technology is being used, and how they would be responsible in utilizing technology in anything they do moving forward.

Those are some of the frameworks that schools or even organizations might look into and integrate.

>> LEVY SYANSEKE: Thank you, Aris.  Let me get to the people online, and get some ideas on the current frameworks.  Maybe you could speak with a bias toward the region you represent or the sector you represent.  Let me start with Mauricia, then Pavel, then Andre.  Mauricia.

>> MAURICIA ABDOL TSHILUNDA: Thank you.  The frameworks have been mentioned quite a bit already, so what I want to highlight is the cross‑sectoral and cross‑regional work, and the groups that have come up and contributed to these frameworks as well; this is one of the important practices, and it is also important that we focus on it as part of the overall question we are looking at.  Let me shed a light on that.  One of the organizations that has been involved in developing the frameworks and has done fantastic work when it comes to cross‑sectoral and cross‑regional work and collaborations has been the Dynamic Coalition on Children's Rights in the Digital Environment.  They are a Dynamic Coalition that is part of the internet governance ecosystem, as some of you may know, and many of you have now been made aware of.

Other cross‑regional groups that I've come across and been very happy with, in terms of the frameworks they've employed to ensure every voice in the multistakeholder model is taken into consideration when looking at child safety regulations and policies, have been the Internet Society special interest groups, now called standing groups as well; that's been updated.

Please take a look at the online safety group as well as the internet for education group.

Then one of the other organizations, and kind of movements, that have focused on developing key frameworks and collaborating is the national children's rights intersectoral coordination committee.  The national children's rights intersectoral coordination committee is established to facilitate collaboration between government and civil society organizations, so it consists of children's civil society organizations, which are recognized as key partners in overseeing the protection and promotion of our children's rights online.

I want to quickly come back to what has been achieved through this in terms of best practices.  When I just look at the Dynamic Coalition on Children's Rights in the Digital Environment, which is, again, the Dynamic Coalition that focuses on children's online safety within the framework of the internet governance ecosystem that we have, they have actually been able to bring in the voices of 60 individual and organizational members from civil society, the private sector, academia and government, from across the world.  The individual outputs of these individuals and organizations are actually quite considerable, ranging from service delivery, law, and policy development and implementation to training, research and advocacy, and all of this is done with the goal of understanding, promoting, and protecting children's broad and diverse rights in relation to the digital environment.

Yes, it's being done regionally, but there's been a shift to a more cross‑regional approach and collaborative effort.  For time, I'll leave it there and contribute a bit further on this and expand as we continue with the discussion.

>> LEVY SYANSEKE: All right, thank you, Mauricia.  Pavel, your take maybe briefly.

>> PAVEL FARHAN: I actually won't take too much time on this, because I feel that most of the technicalities of the frameworks have already been spoken to.

In their frameworks, governments have tended to tackle online sexual exploitation and abuse with an emphasis on building the architecture to protect and rescue children, raising awareness, reducing harm, et cetera, and yes, these are essential components of the framework as a protection response.  Internationally, however, progress is quite patchy.

I'm talking about the legal jurisdictions that Jennifer mentioned a bit, because we fail to enact legislation sufficient to combat child abuse images especially, or laws to criminalize grooming, for example.  Finally, there is a lack of awareness among parents and agencies with child protection responsibilities, because this is what we should be building the framework around.

If I were to give an example from where I'm from, Bangladesh: during the COVID‑19 pandemic, a lot of kids who didn't really use the internet much before finally got online, and many of them became victims of inappropriate messages and cyber harassment.  They told their parents about it, but their parents didn't know what to do.  There was no law in place, and how does a parent help their child in this aspect when they don't have that prior knowledge either, because it is such a new thing to them, not having been online during this whole time?

So yeah, if this could be a priority for policy makers, to build a framework around it, that would be my suggestion.

>> LEVY SYANSEKE: All right.  Let me get Andre and then get back to the room as we proceed to the next part.

>>  ANDRE:  So the main question is: what are the current frameworks and best practices in ensuring child online safety?  Probably I could speak from my background.

>> LEVY SYANSEKE: Yes, you could speak from your background and region.

>>  ANDRE:  I will share with you Russian initiatives in the field of creating a safe internet for children and their education.  I believe protecting children is primarily protecting them from destructive and dangerous content on the internet.  Many children are subjected to abuse, deception and bullying on the internet, and they need to be protected, not only from schemers, but from peers who can bully them.

And Russia has its own Cyrillic domain focused on children's projects, .дети ("children"), the first Cyrillic domain for projects related to childhood.  I can tell you a little bit more about our experience.

The .дети domain is designed to promote the development of a safe internet for underage users and to become a platform combining Russian‑language websites for children and parents.  The mission of .дети is to help improve the quality of internet use by children and teenagers, an internet space of trust that consolidates high‑quality, attractive, entertaining and educational internet content and makes the stay of children and teenagers on the internet comfortable and safe.

The main task of the .дети domain is combining high‑quality and safe content on one platform.  To do this, a two‑stage monitoring system is configured in the domain zones, which allows you to find and quickly eliminate threats, so we position .дети as a safe online space into which it is not scary to let a child go alone.  We also have an internet governance project aimed at improving the digital literacy of children and teenagers, and since its launch it has developed into a multi‑platform combining educational forums.  The basis of the project is an interactive portal with games, with the help of which you can easily and effectively study the internet, the technologies that make it up, innovations, internet culture, rules for the safe use of personal data, artificial intelligence, smart technologies and much more.

As part of this project, we conduct an annual online championship for school children.  The project is more than ten years old, and it is useful for the development of the digital literacy of children.  Thank you, that's all.

>> LEVY SYANSEKE: Let me get back in the room and ask Arivek, who has more on the question.

>>  ARIVEK:  When we speak about ensuring child online safety, one aspect of this is the protection of minors from malicious content, and the issue is beyond doubt.

The harmful effects on minors on the internet can be expressed as follows: malicious content, malicious interaction design, malicious communication and online advertising.

In my opinion, these are the main areas that are now becoming the subject of regulation in the Russian Federation, but I think it is the trend for all countries now.  If we speak about global prospects of national regulation, in my opinion it is necessary now to develop a legislative framework at the national level that will consider the status of youth as a vulnerable group in the digital environment and establish their comprehensive protection, which, in addition to providing access to the internet and information in the digital environment, will ensure respect for the privacy of the child, prohibit discrimination in the digital environment, and support the formation of a safer internet space for children.

At the moment, Russia mainly provides for a system of parental control, and the responsibility for what the child sees and reads lies practically in the hands of parents.

At the legislative level, there are now some discussions taking place on the possibility of adopting requirements for the categorization of online content, including with regard to children, as well as requirements for service providers to have appropriate systems and processes to protect children from access to inappropriate content.

I think that this is also a general trend for all countries.

>> LEVY SYANSEKE: That's quite interesting.

From this conversation, I would like to believe there are people here who have more experience in looking at child online safety with regard to internet access for children.  Maybe a quick check: let me bring it back to the audience.  Any suggestions with regard to how we can actually improve in ensuring children are safe online?  Any takes from the audience?  It can be online or even in the room.

Do you have any thoughts or suggestions on how current frameworks you may be aware of ‑‑ different laws and regulations you've looked at ‑‑ can be improved or updated?  People online or in the meeting.

>> AUDIENCE MEMBER: I can take the floor.  I'm Dara, part of the United Nations ambassadors and the Russian youth council.  I think the main point here is that most parents and older people don't really think that children need to be protected online.  This topic isn't discussed that much, apparently, in the Russian legislative system; as my colleague said, there's only parental control.

So they don't really think they need to develop rules.  I think that's supposed to be discussed more ‑‑ how to put it?  People have to really think about it.  So maybe that's what is missing, at least for now.

>> LEVY SYANSEKE: Thank you.  Any other thoughts?  You can go ahead, sir.

>> AUDIENCE MEMBER: Thank you for the opportunity.  My name is Ethan, and frankly, I'm the old version ‑‑ I'm the IGF youth ambassador for 2021.  I just wanted to add one thing to the conversation on current regulations, or what I can speak of: before we go into regulations and the effectiveness thereof, there's a prior conversation to be had around implementation, right?  So in many countries ‑‑ well, in some countries ‑‑ we have data protection acts that speak to how data and privacy can be handled; others don't.  But one common theme you find in most of these countries is that there's still a lack of implementation of those laws.  For example, the U.K. has the GDPR, considered one of the strongest data protection regulations.  In 2021 we saw TikTok being fined 92 million for privacy concerns that affect children, and in 2022 being fined about 13 million.

So we have the laws in some of these countries, most of these countries ‑‑ yes, laws are good, but I don't think the conversation should just be around the law itself.  It should be: what implementation measures do we have in place?  So if we have a law, we can implement what the law says, and if we don't have a law, as we build toward that law, we have mechanisms that are strong enough to ensure that we have a secure digital space for online usage.

>> LEVY SYANSEKE: Let me get maybe another input from the online moderator, Stella.  Any contributions from the online audience?

>> STELLA ANNE MING HUI TEOH: I'll go through some of the questions we see online.  From the internet, there's a question: how can we guarantee child online safety when adults are always busy with their professional agenda and leave their kids online unattended?  There's also a question from Judith: what are your opinions on age verification systems?  Would any of the speakers like to take these questions?

>> LEVY SYANSEKE: Okay.  I don't know who would be up to take any of the questions.  I think I'd like to get Aris on it, because he made mention of something along that line.  Go over the question again ‑‑ I think I can see it in the chat.

>> ARIS IGNACIO: Two questions, first is the ‑‑ yeah, how to guarantee online safety when parents are always busy with their professional agenda.  Oh, wow.

Well, it really falls on responsibility.  Parents should also be given awareness programs at the start, because if they are not aware ‑‑ well, the past generation, sorry to say, we cannot guarantee that they are really well versed in what technology is now, unlike their kids.

So it's really hard for them somewhat to comprehend what their kids are doing.  At the end of the day, if they were given something like a program ‑‑ a very simple training program, or maybe an awareness webinar, as simple as that ‑‑ just to emphasize what current technology has been doing and what its effects are on their respective sons and daughters, maybe that can lead these parents to be more responsible in taking care of and looking after their respective children.  It's really difficult as of this point, just like what you mentioned earlier: there is really no implementation with regard to what we really need to do.

We really need to start at the bottom level ‑‑ it's really difficult as of this point.  But, you know, that's only my opinion.

>> LEVY SYANSEKE: Jennifer.

>> JENNIFER CHUNG: Maybe I'll take a little stab at the question about age verification systems.  It's quite patchwork right now.  Speaking from the U.S. standpoint, a lot of these websites really try to implement this because they're trying not to let minors look at things like gambling sites, alcohol or adult beverage providers, tobacco, adult entertainment and porn, as well as movies rated above a certain rating for viewing.

But the flip side, as I think the participant next to me has mentioned, is implementation.  There's always this flip side of how you are going to verify somebody's age if you're trying to access a certain database ‑‑ there are always data breaches, and different standards for different jurisdictions exist right now.

That is a million dollar question.  If we are able to find a verification that is international, of course that would be ideal for, I guess, content providers, platform providers and the like.  Even when you're looking at social media sites right now, when you create an account you have to confirm you are above the age of 13.  They don't ask you to put in any kind of identification; it's relying on an honor system.  For some content, that's fine.  For a lot of streaming content, that's not fine.  I think the first step in trying to create something like that is to look at creating standards ‑‑ secure standards that are able to access certain databases.

There are the data privacy laws, and then there's a whole other part that is quite interesting and problematic.  If you're looking from the Asia‑Pacific standpoint and at other underserved communities, there are a lot of undocumented people ‑‑ not just children, undocumented people including children.

How do you serve these vulnerable, underserved communities if you require them to go to a certain government agency to get a certain ID?  It becomes very problematic; there's also the refugee population.  If you're looking at age verification, you have a whole gamut of issues ranging from data breaches, data privacy and data regulations to how to implement it: what type of content are you trying to stop people from looking at?  What type of audience are you trying to serve?

>> LEVY SYANSEKE: That's kind of interesting.  We have two hands online; maybe we could get the questions in a row and proceed to the next part of the conversation.  We can start with Nicholas, and then I think there's another question from the Internet Society of Zambia.

>>  NICHOLAS:  I'm an Internet Society lawyer and a youth ambassador, and I've done work in online safety and cybersecurity.  Within the digital space, this appears to be one of the things on which there's some unanimity among us.  In my view, the conversation on standards is one that should be had on high‑stakes platforms like this.  If together as people and countries we can move towards establishing some minimum standards and norms as far as online child safety is concerned, it would make it much easier for developers of applications to make it difficult, by design, to access some of this content.

For instance, mention has been made of sexual abuse content, pornography and other things.  Applications, for instance, may by design have inherent switches that enable parents to indicate whether that device, or that application at that time, will be used by a child or otherwise; that way it makes it easier for them to monitor who accesses what.  I also think it may help if we don't box all children into one box, because, as we are all aware, these days children have access to technologies and digital devices very early ‑‑ as early as 3 or 4, all the way up to 15 and 16.  The sort of content you may not want a 5‑year‑old to be exposed to may not be the same content you don't want a 15‑ or 16‑year‑old to be exposed to, and stratifying the child demographic may help in deciding which content is too bad for one group and which is better for the other.  My two pennies, thank you.

>> LEVY SYANSEKE: Thank you so much, Nicholas.  Let's have the comment from the Internet Society of Zambia.

>> AUDIENCE MEMBER: I'm from the Internet Society of Zambia.  My first question is how feasible is it ‑‑ what possibilities do you actually attach to achieving child safety on the internet?  Because, for example, in third world countries like mine, which is Zambia, we have a very huge gap between the digital immigrants and the digital natives.

Kids know more about the internet than parents, so you have a parent actually asking a child how to use a device, and then we have no control over who is using a device.  A child can access pornography and other materials on my phone while I'm not there.

So I think one of the alternatives, in consideration of the larger view, would be the creation of an internet for kids, where you'd have restrictions on what is posted there and why ‑‑ basically, it would be an internet for kids.  I also feel most of the frameworks being proposed are only achievable in countries that have reached a certain level of internet performance.  In countries like ours, we don't think those frameworks are implementable, because I think we have a lot of digital illiteracy, especially among adults.  Thank you.

>> LEVY SYANSEKE: Thank you.  So while he was speaking, Jennifer was actually appreciating the point that you need to have a space basically just for kids, right?  And I don't know if this is the right time to ask: Jennifer, you have Dotkids ‑‑ would you like to explain more?

>> JENNIFER CHUNG: Thanks, Levy, and thank you for the intervention from the Internet Society of Zambia.  It's almost like I coordinated with him, but I hadn't met him before this.

But it is interesting that you mention there should be an internet space for kids.  That is exactly what Dotkids is trying to build.  We know there are multiple ways and multiple things we need to do to keep children safe online ‑‑ it's not just one thing, it's multiple things.

But giving them a space that already tells you: okay, in this internet space, there will not be content that is questionable ‑‑ no pornography, no child sexual abuse materials, no gambling, no extreme violence, that kind of thing.

Having that space available for kids to explore and create their own content and learn things, and for schools to have that space there too ‑‑ that's one playground that's available for kids.  I'm not naive; I don't think anybody around the table or online thinks children will only be in this one space, right?  It also requires educators, teachers, parents and the people around the children to keep that protection going, because there was a question earlier that Aris answered: how do we make sure they're safe if we just give them a device and leave the room?  The same analogy goes: you don't leave the room with a child and a knife, right?  You do not do that.  I'm not saying a device like a phone or a laptop is exactly the same, but the potential for certain abusive things is important enough that you need education, both from the child's standpoint and from the people providing care for that child or young person.

Of course, the age will range, and the kinds of protections and measures that need to be taken will also decrease or increase as appropriate.

>> LEVY SYANSEKE: Thank you, Jennifer.

Let me get a quick comment from Aris and move to the last two parts of this session before we get feedback from the audience again.

>> ARIS IGNACIO: Just a quick comment.  I do appreciate what was mentioned by the Zambia chapter.  I remembered one thing that is very, very appropriate for kids, for them to be able to understand, in the simplest way, how they're going to deal with the internet.

I think Safer Internet released a comic strip on how you're going to deal with and work around the internet.  Especially with kids ‑‑ who would not want to read comic strips, right?  That is the simplest way to do it: showing images and graphics and putting in conversations in a way that's not just purely text.  Let's go down to the level of the kids, because if we show them those kinds of materials, they will be more inclined to read them.

I think if that were implemented within the spaces that were being mentioned, that would be a spark ‑‑ reference material for them to know what is necessary.

>> LEVY SYANSEKE: Thank you.  So let me ask the second to the last question I have from my end.

When we talk about the whole of the frameworks and what needs to be done by the different stakeholders around the internet space, especially being biased towards keeping children safe online, my next question is: what are some of the things we need to consider in developing mechanisms to ensure that different stakeholders are fulfilling their responsibility with regard to child online safety?  Let me start with you, Arivek.

>>  ARIVEK:  I think that we need to make global efforts and to have an international convention on the issue.

I think that here today, at the IGF, at an international forum, I would like to speak about the international level.  So, returning to the issue of child protection on the internet, it's important to note that the protection of children, as a vulnerable group, is an urgent topic.  One example of the initiatives in this direction is the initiative of digital companies that have proposed to create an alliance for the protection of children in the digital environment.  The main goal of this alliance is to create a child‑friendly internet space based on creative and safe technologies and digital solutions that will allow children to develop, meet the needs of the digital world, learn responsible behavior online and form moral guidelines.

So I think it's important that internet regulation should be carried out not only at the legislative level, from the top down, but also from the bottom up, on the initiative of IT companies and civil society.  I think this initiative is a great example of that.  Thank you.

>> LEVY SYANSEKE: Let me maybe throw this question back to the guys online before we get back to those in the room.  How do we develop mechanisms that ensure that all the stakeholders involved in the internet governance space are fulfilling their responsibility for child online safety?  I'll push it back to Andre, Pavel and Mauricia.  Let's start with Andre ‑‑ over to you.

>>  ANDRE:  ‑‑ is Andre still in the room.

>> LEVY SYANSEKE: Pavel, then Mauricia.

>> PAVEL FARHAN: Yeah, so I wanted to add to what was just said, and to the questions that came in from the previous question.

So, you know, children's use of the internet and their behavior and vulnerabilities online differ according to their age.  To be effective, protection strategies need to incorporate measures and messages appropriate to children's different ages and levels of understanding.

Effective protection strategies obviously require children's participation.  We are talking about children, and we need their input on what they think could be safer measures for them.  We need their feedback as well; we can't go on making strategies without consulting the people we are making the strategies for.

Of course, adolescents in particular, I believe, have reached an age of understanding where their feedback can mean quite a lot, especially in terms of design and implementation, as well as the empowerment of these kids.

Of course, you know, we talk about awareness so much.  So obviously we have to empower the parents of these kids.  We have to empower the adults who work closely with young people ‑‑ the educators.  We have to enable support; they need support as well to understand this issue.

We talk about children's lack of understanding of what online safety is; we need to help them use information and communication technology, recognize the risks and hazards they may encounter, and prevent them.  So I feel like everyone here needs to cooperate with each other, and we need to understand the perspective and viewpoint of the people we are making these strategies for.  Yeah.

>> LEVY SYANSEKE: All right, thank you, Pavel.  Mauricia.

>> MAURICIA ABDOL TSHILUNDA: As you know, Pavel and Arivek were reading from my notes ‑‑ I completely agree with everything they just shared.

I wanted to add on as well, I love the saying that was developed at the IGF Poland last year as well and I continue to say this in every space as well, a little less talk, a little more action, please.

When we talk about action, I want to see cross‑regional collaborations, for example, continue to be consistent and actually increase globally.  I want to see facilitation, coordination and collaboration in the work of both government and civil society organizations when it comes to the promotion, protection and fulfillment of the rights of our children.

And then to round off, again, like our previous speakers have emphasized over and over in the various ways, we have to strengthen the capacity, we have to strengthen the systems and processes that relate to the realization of children's rights, and we need to bring everyone to the table.

I specifically thought about the voices of parents and also the awareness of parents; we cannot emphasize the importance of their involvement and inclusion more.  It's critical because, as we know, children's first point of contact for their development is their caregivers.

As I was listening to everyone's contributions, I was grappling with this: are we developing the messaging for parents in a way that parents can receive?

I really think we should also work this into every aspect of developing frameworks and advocating for the best practices that come from these collaborative engagements.

When we are speaking to a specific audience, we have been very intentional.  When it comes to children, we come down to their level, as one of my previous colleagues mentioned, and produce the content in a way that they can absorb.  Are we doing the same for parents, based on their level of understanding and where they are?  So meeting them where they are too, and communicating with them in ways that will grab their attention.  Being a parent is not easy ‑‑ I'm a mother of one and soon to be a mother of two, I'm pregnant, so I know what it is to be overwhelmed with responsibilities.  I know what it is to want to do the best for your child, but there's just so much you need to get to, and you wear so many different hats.  It's important that we are aware of all this when speaking to parents and trying to bring them to the table ‑‑ that we are cognizant of the life of a parent based on where they are, whether they be a parent in Europe or a parent in Africa.  The level of experience is completely different.  We need to bring this to the fore when we want to keep parents accountable for making sure their children absorb what we want to bring across in terms of online safety for them.  For the sake of time, I will leave it there.

>> LEVY SYANSEKE: All right.  Let me ask Jennifer and Aris to just briefly share your thoughts before we move to the last one.

>> ARIS IGNACIO: Well, I've been telling everyone about awareness, that is really the thing here.

More often than not ‑‑ again, I would go further and put the challenge to each and every institution out there.  Being an academician: implement something that would make students, even when they are in the beginning stages, the learning stages, aware of all the things that can harm them and all the things they can do that can benefit them moving forward.  I think that would be the key for them to be able to protect themselves from any danger they may encounter in the future.

>> LEVY SYANSEKE: So as my last question, before I get comments and questions from the audience in this room and online: are social media companies, or the big techs, accountable when it comes to child online safety?  If your answer is no, please give us a solution for how we can make them accountable.  If it's yes, give us an understanding of how they're doing it.  I'll ask each speaker to answer in under a minute, starting with those who are able.  Pavel, you can go first, then Andre and Mauricia, and then I'll go back to the room.  Are social media companies and big techs accountable when it comes to child online safety?

>> PAVEL FARHAN: Absolutely.  Given the centrality of the private sector to the internet, it has major responsibilities in relation to child protection online.  Social media companies have an obligation both to respect human rights and to seek to prevent or mitigate adverse human rights impacts directly linked to their operations ‑‑ and not just their operations, but the products, services and advertisements they show, which are a byproduct of what people search online, especially what kids search online.

So child abuse and exploitation are, I would say, manifestly adverse human rights impacts, and social media companies should be held accountable.

Just one more thing I would like to add before we drop off.  Regarding the ads targeted at kids ‑‑ we are talking about the data privacy of these kids ‑‑ I feel like social media companies need to put something in place so that they don't collect children's data.  As we know, this data is then stored somewhere to advertise certain games or products to these kids.

So I guess there should be some kind of framework in place where parents, at least, have knowledge of what kind of data of their children is being stored.

Social media companies need to give us access to this knowledge as well.  So that's it.

>> LEVY SYANSEKE: All right, thanks.  Let me have Andre and Mauricia, a minute each: are social media companies or the big techs accountable when it comes to child online safety?  Andre?  Okay, I've been told there are connection issues.  Mauricia.

>> MAURICIA ABDOL TSHILUNDA: Really quickly, I agree with Pavel ‑‑ again, I couldn't have said it better myself.  Regulations around age verification need to be responsible.  Big tech is responsible.  I just need big tech to come to the table more.  Here's what I want to emphasize: the example with data on WhatsApp is very problematic.  To be very clear, this includes the data of our children and teenage users, okay?  So we need to call for transparency here on what is being shared and also on how long it is being kept online.  Yes, I've noticed good improvement in terms of images and videos being removed.  However, text is still often not deleted.

So the right, for example, to be forgotten seems to have been forgotten.  So big tech really needs to come to the table 100 percent.

>> LEVY SYANSEKE: All right, thanks.  Let me get Arivek.

>>  ARIVEK:  For me, it's an interesting question; I have done a lot of research on it.  In my scientific opinion, one of the best practices is presented by the U.K., and I'm speaking about the bill on internet safety, which was mentioned today.  It marks the first attempt to codify the regulation of online content, and it aims to create a legislative regime to combat harmful digital content on the internet.

So the bill currently under consideration will establish an obligation for online platforms to take care of the protection of their users, including children.  They should also comply with codes of practice and ethics, and there will be a regulatory institution, a body that will control the issue of harmful online content.

And so, in my opinion, in order to ensure the safety of children, it's necessary to introduce responsibility for global platforms at the legislative level, using the example of this bill.

So, yes, online platforms have codes; they must moderate content, but these codes don't always work well, and it's even possible to implement artificial intelligence to monitor such content.  But if we speak generally, I think the main thing is that responsibility should be comprehensive, and all stakeholders should be responsible, including online platforms and social media.

Thank you.

>> LEVY SYANSEKE: All right, let me just get a quick thought ‑‑ I'll ask Jennifer to be the last.  Let me have Aris: are social media companies really responsible when it comes to online child safety?

>> ARIS IGNACIO: Very, very short.  A big, huge and absolute yes.  Thank you.

>> LEVY SYANSEKE: Okay, Jennifer.

>> JENNIFER CHUNG: I will be as brief as Aris.  I think the panelists and speakers, both online and in the room, have already said everything that we all feel.  They are very much responsible, especially given the time that children and young people spend on these platforms.  In fact, Facebook ‑‑ Meta now ‑‑ and Google and all of these platforms do have safe search, the technology, AI content filtering, and humans looking at this too.  They have all this, they have the resources ‑‑ a disproportionate amount of resources.  Of course they can improve, and of course they should be at the table.  In fact, if we do a follow‑up workshop on this topic in any other internet governance fora, it would be very interesting to have them at the table, to listen to our concerns and explain what they are doing and what steps they are improving on.  I think that's the kind of conversation we should have, especially here at the IGF.

>> LEVY SYANSEKE: Thank you.  Just before we wrap up, are there any comments ‑‑ let me get to one, any comments on what we've looked at so far from the audience.  Yeah.

>> AUDIENCE MEMBER: Just a few thoughts.  Regarding how we're going to strengthen the responsibility of the different stakeholders on these matters, I think it's really important that we start with what we already have.  Since last year we have General Comment No. 25 on children's rights in relation to the digital environment.  I come from Brazil, and looking at the reality of the legislators and regulators in Brazil, I don't see that document, which was put out by the Committee on the Rights of the Child, being implemented.

I think we really need to think about how we're going to monitor the implementation of those parameters, and also cooperate with different stakeholders from different countries to make sure they are being implemented everywhere.

About the responsibility of big techs, I think they're responsible, and that's that.

But I think a lot of the harms that come from those products, and from children's use of those products, come from the very business models of those companies.  So I don't think we can stop at making them comply with certain parameters; we really need to take a step further and think about how we're going to endorse the development of alternatives to those applications that are harmful at their very core.

To do that, we need to get children involved and understand what their needs are ‑‑ what it is about those applications that makes them so attractive to children ‑‑ otherwise we won't develop alternatives that will be attractive to them.

>> LEVY SYANSEKE: Thank you.  Let me have one more comment or two ‑‑ the last two, then.  Please speak for under a minute each, then we can wrap up.  I'll start with Ethan and come back to you.

>>  ETHAN:  I think, to the question of what we can do to make sure big tech is coming to the party: basically, we have to address what they care about most, and that is making money.  So our solutions should be around ensuring that if they don't take care of the issues they should around child safety, we are hitting them where it hurts.  Fining TikTok 30 million when they're worth 58 billion is nothing to them ‑‑ they'll make that in a week.  I think that's where the issue is; if we can address that, then they will comply.

>> LEVY SYANSEKE: Thank you.  The last comment.

>> AUDIENCE MEMBER: Hi.  My name is Azuna, from Nigeria, and I want to speak as someone from a developing country.  I think we need to create more awareness.  How do we do this?  We need to educate the parents and the children and, most importantly, communicate the punitive measures.  If we focus only on the children and the parents, the predators will keep improving their antics.

So I think we need to put more emphasis on the punitive measures and create more awareness, because where some of us come from, we are grappling with so many things.  I also think we need to transcend from policy to action and implementation.  And lastly, we recognize special days for some things; online safety for children is a universal issue.

So I think we need to create more awareness ‑‑ find a day when we will talk more about it and continue the discussion, whether at the IGF or in our various smaller communities.  Thank you.

>> LEVY SYANSEKE: All right.  Thank you so much, and thanks to everyone for participating and taking the time to be part of this conversation.

So in closing, three things have come up.  Number one, let's all be sharing and creating awareness.  Secondly, let's ensure we push for implementation of the things we are advocating for, because without implementation, we get back to the conversation of what needs to be done without necessarily achieving it.

Lastly, we still need to take action from all ends, regardless of the sector we are coming from, and we seriously need to ensure that.

As a way of concluding, I want to say thank you once again for being part of the conversation, I wish you a great interaction at the Internet Governance Forum for the rest of the sessions left.  Enjoy the rest of the evening.

I forgot one more thing, sorry.

Apologies.  I would like to say thanks to the team that coordinated this.  Special thanks to Stella, Dalili and Daria for being the background people online.  Special thanks to the participants and the speakers, as well as the team that did it this year.  Enjoy the rest of the evening.