The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
I will start with you, Keith, introduce yourself and then we will go around.
>> KEITH: Thank you so much, Millenium. I am hearing myself, so I will remove the mic and hope I am still audible. My name is Keith. I have previously served as the coordinator for the African Youth IGF.
I am also pleased to be part of the coordination team for the Youth IGF that has put together today's summit. Happy to be here, and I look forward to a great discussion.
>> MILLENIUM ANTHONY: Thank you so much, Keith.
>> NIKKI: I'm VP of public policy at Roblox, which is a gaming company, but we have many different ages that play Roblox. I am based in the U.S., in the San Francisco Bay Area, but I'm pleased to be joined by colleagues here from the UK and also the EU.
I had the pleasure of being at the IGF in Ethiopia two years ago, and I remember the youth council there. This is my third consecutive IGF, and I'm thrilled to be here, so thank you.
>> MILLENIUM ANTHONY: Yes.
>> ANTONIA NIRVANA GREGORIO LIMA: Hi, everyone. I'm sorry I'm kind of late; I lost my badge and I needed to get another. My name is Nirvana Lima. I'm a researcher on digital cultures, especially among kids, teens and youth.
And I'm here to speak with you about my research and my work in Brazil. I have a master's degree in communication and solid experience in this topic, so thank you so much, Millenium, for the invitation. It's a pleasure being here with you today.
>> MILLENIUM ANTHONY: Thank you so much, my dear panelists. Today we are also joined by an online moderator, Selma, who will help moderate the online participants. In this session we are mainly going to be discussing online safety for children and youth. We have seen children getting more exposed to online spaces, and those online spaces are no longer safe for them. In the previous couple of months we have seen parents suing the big tech companies because their children are no longer safe. So there is a challenge there, and we need strategies on how we can help these young people stay safe online. We have policy questions that are going to guide our discussion today, and I will mention them quickly, but as I go through my panelists I will ask them individual questions. The first question is: how can stakeholders collaborate effectively to empower parents and children in ensuring online safety, and what strategies are most effective in securing the digital space? The second one is: in this dynamic environment, what key indicators should be considered when designing online safety programs to ensure their relevance and effectiveness in addressing emerging risks faced by children and youth? And the last policy question is: considering the diverse cultural contexts and legal frameworks globally, what innovative approaches can be adopted to reconcile differences and establish versatile standards for online safety interventions, particularly in regions with varying levels of access and digital literacy? So I'll now go to my speakers, and I will start with you, Nikki.
From your experience with platforms like Instagram and Roblox, what innovative safety tools and policies have proven effective in empowering parents and children to ensure online safety?
And how can these approaches be adapted to different cultural contexts?
>> NIKKI: Yeah, I wish I had a perfect answer, but I will tell you what I think, and I will speak mostly from the tech perspective since I work for a private company. I think we are in a moment, and it's longer than a moment, where there is acknowledgment that tech companies need to do more, and a lot of the practices that used to kind of feel good ‑‑ can you hear me? I just lost sound. Okay ‑‑ have to be updated for where we are in 2024 going into 2025. I remember a time when we talked a lot about parental controls and empowering parents to make decisions, and there's no doubt in my mind that that is critical, and that all companies, Roblox included, need to give parents these tools and need to put them in a place where they are empowered.
At the same time, I am also a parent, and I look at my phone. My kids don't have smartphones, but parents have tens and tens of different apps on their phones. Even the most curious and technologically savvy parents cannot possibly navigate to each individual app and set parental controls; I think it's not realistic. So of course those controls need to be there, and we need to be providing these tools so that parents have them, and they need to be dynamic and they need to work. But I think that where we are now is really looking at defaults, looking at what the initial settings on these platforms are, including Roblox, and acknowledging that we need to provide that safety net in addition to the parental controls. That is a responsibility that tech companies have. Defaults will look different from platform to platform, but it can't just be about parental controls, because parents are overwhelmed; we actually have to establish these tools on the tech side. To me, those defaults are important.
>> MILLENIUM ANTHONY: Thank you very much, Nikki. I want to come to you, Keith, following what Nikki has said. We have seen that the safety of children and youth online is now endangered.
From your perspective, do you think it's right to just stop these kids from accessing online platforms and the internet, or to just put in some measures to control them? What is your role in that?
>> SABA TIKU BEYENE: I will speak on two fronts: one as a general user, and secondly as somebody who is coming from the global south.
I think right now we are moving everything online, be it government services, be it education. Life post‑pandemic has shifted and has necessitated technological use to an extent that wasn't there before. For us as adults wanting to engage in sessions such as the IGF, for example, we have colleagues and friends who are connected online, and it eases a lot of pressure on how we can engage here. If for one reason or another someone is not here, they are still able to connect and follow this session. Rightfully, like you said, we now have an online moderator, something that perhaps before COVID we would not even have thought of ‑‑ can we even have online engagement, for example? So post‑pandemic, we can't go back to the pre‑pandemic era. It is a contradiction for us to want government services online, and to embrace working‑from‑home concepts, which before the pandemic meant freelancers were seen as people who are not serious, and yet want to deter children from utilizing online applications. I see that as a zero‑sum game, and it is not just about apps; when you are talking about the internet, there are certain apps that are not open, free, accessible, or secure. These kids are already going online. How do we deter them, when we ourselves are going online, and in the next two or three years they will become adults? So I see that we as stakeholders, in whatever form we are in, whether civil society, government, or youth, need to start thinking by design, and one of the things is having the kids here, so that we are not speaking for them but are also listening to what they are saying. We have seen many youth, for example in Nigeria, but I long to see a children's component within the IGF space, so that these young kids can come and speak about what their challenges are, rather than having me, somebody in the larger bracket of youth, speak about children when I can't fully identify with them. So, to sum up the question: we really need to protect them, but we also need to have them come and speak. That, for me, is something that I would like to see going forward: more children in IGF spaces. Thank you.
>> MILLENIUM ANTHONY: Thank you so much, Tiku. The discussion is on how we can protect kids and youth online. Nirvana, I want you to tell us ‑‑ I think you did research on digital cultures in childhood, right?
What are the most pressing challenges that kids and youth face online?
>> ANTONIA NIRVANA GREGORIO LIMA: Okay, I'm going to bring the Brazilian perspective to the field. First of all, I would like to thank you for the opportunity to speak to you all today, and I want to express my gratitude one more time, also to my youth members ‑‑ please raise your hands ‑‑ whom I had the pleasure of facilitating in 2024. I'll begin my answer by saying that this question is quite complex, but so is the phenomenon of the kid influencer. The term digital influencer has undergone a discursive shift in media and research, even academic research, especially since 2015. This shift is linked to the entry of new platforms into the content production landscape, and because the concept is still evolving, its definitions change fast. However, one thing is undeniable: young creators are driving discussions around celebrity culture and consumerism while also remaining as vulnerable as their audience is, considering the content. Kids and teens today are prosumers, just like me, just like you, just like all of you. But do you know the meaning? The term prosumer was coined by Alvin Toffler in the early '80s, when he predicted that the roles of producer and consumer would increasingly blur.
Nowadays even very young children can play a role in viral content with either commercial or entertainment purposes. According to the latest Brazilian survey, 88% of children and adolescents aged 9‑17 have social media profiles. This is a milestone reflecting the growing digitalization of Brazilian society. On the other hand, we cannot ignore the risks that come with this online presence.
This exposes young people to potential harm to their personal safety, reputation, or security. The truth is that kids and teens are not digital natives, despite what some may claim. This generation learns how to navigate online through trial and error, just like everyone else. They are vulnerable to exploitation, not only of their personal data, but also within the influencer economy, which includes advertising agencies and talent agents who work with child celebrities. The data they generate can be used for commercial exploitation, for targeted ads, or even to manipulate their emotions, beliefs and opinions. For an entire generation, the internet is a double‑edged sword, as we used to say in Brazil.
While it offers incredible opportunities for learning and social interaction, it also exposes children and adolescents to significant risks, such as violence, pornography, cyberbullying and misinformation. I firmly believe that initiatives aimed at educating children and teens, but also parents and educators, are indispensable, because they will help them become aware and responsible online users. This is an important path forward. It's a responsibility that we, as a larger community, must share in ensuring their safety online.
>> MILLENIUM ANTHONY: Thank you so much, Nirvana. From the discussion that we have had here, I am getting that online safety, and the ways that we can protect these children and youth so that they are safe online, is not a one‑person job; it's a multistakeholder thing, right? Now, Nikki, please help us understand: what key indicators should stakeholders prioritize when designing online safety programs to address emerging risks, for example the different risks that Nirvana has mentioned that children face online?
>> NIKKI: Yes. Okay. So I think first there is a question of technical limitations. You asked whether bans work: if we set a ban on technology, does that help children? I think there are lots of different opinions on that, and in Australia we're seeing a social media ban for under‑16s, and there's discourse about what that means and whether that's the right course for children. So one question is, do technical implementations actually work? Do we actually have the ability to ban children? Very often, I think what we find is no, and for that reason we're seeing nine‑year‑olds on social media when the minimum age for a lot of these sites is 13. So I think there is a valid argument that outright bans don't work. In terms of indicators, I think what you were saying before about safety by design ‑‑ working with multiple stakeholders to bake safety in at the product conception phase, so that we're not retrofitting to keep kids safe but actually designing things with them in mind ‑‑ is the right path forward. And much of what we build, instead of being super prescriptive, saying children can or can't do this, or 14‑year‑olds can or can't see this, or girls should or should not see that ‑‑ it's so dependent on the child, on where they live and on what access they have to technology, that it's much better to be principle‑based and say that the policies we write should be created in the best interest of the child. I think best practice would say let's be principle‑based rather than super prescriptive, because I don't think that works as technology changes and as innovation occurs.
>> MILLENIUM ANTHONY: I really like that point Tiku mentioned about the design thinking approach: you bring the stakeholders to the table and use it to understand the needs that they have, but also to innovate creative ways of solving problems. So I think that could be a really nice approach ‑‑ to have the kids in the space and understand their needs and the challenges that they face, rather than just coming with solutions that we think are right, like maybe we just need to ban the internet for them, while maybe there could be another way or another solution that we could use. So now I want to turn back to the floor. We just had a discussion here about the importance of collaboration. What do you think are the biggest challenges in ensuring online safety for children and youth, from your experiences in your specific regions or countries? Anyone? If you're ready, just raise your hand and we'll pass the mic. We have a mic over there. I think I saw three hands. Oh, yes, you can start. Just turn it on.
>> AUDIENCE: Is this one working? Hello? Hello? Am I on now? Okay, yeah, sorry, I came late, but this subject is close to my work. I come from India, and I work in rural communities. We have a lot of first‑time internet users, and especially the targets are the youth, who are not aware of how to use the internet in the first place. When they get the internet, they are so attracted to everything that is there, but they don't understand what is there. We had an incident with a girl who was abused online, and she didn't know what to do about it. She couldn't tell her family or any friends, only one very close person who she thought could help, and when they approached the cybercrime unit, they said you need to bring more proof. So the girl still doesn't know what to do with the content that is out there, after she fell for the other person who faked being her friend. This is still a challenge. We still don't know in India, especially in rural sectors, how to address these problems. And when we asked, how safe have you kept your accounts, and checked with other girls in the community, and not just the girls but the boys also, they said, we don't care. So somebody could actually hack your account and use your name to target somebody else. So how do we talk about these things, these standards, this awareness? Why didn't the girl go to her family and say, I have been subjected to this problem? How do we address that? There are many challenges in India, especially in rural areas. Nobody wants to talk about it; it's only to the peers ‑‑ I told my friend, that's all, it's done. Even the boys need to understand how to keep their accounts safe and to talk about it with their friends and families, that it's okay to tell, that this is happening. So this is one of the incidents, and we still don't have proper support, proper standards, or a proper awareness system. We still don't have a model for this.
>> MILLENIUM ANTHONY: Wow, thank you so much for your contribution. I think we have a lot of work to do in training these young people and building capacities when it comes to online safety: how do you use the internet safely, how do you put yourself out there so you don't attract people who abuse others online with bad comments and all that? I think it's really something that we must invest in to train our youth and children. Yes, please. After her I will take one more contribution, and then we'll see if we have a contribution online.
>> AUDIENCE: I am also from India. Basically, to add to that point: we need to look at boys as well, because there's an element of sharing and resharing of any content that is online. Also, there is a huge generational digital divide. We talk about parents, we talk about educators, and we also talk about educating children, but with this divide there is the challenge of parents still catching up with technology while children move at a fast pace, and that is the reason why we are unable to bridge that gap. That can only happen once we do awareness programs, but then what happens is we do one awareness program about the use of gadgets, technology advances, and we need to come back and talk about something new that has come up. It's a vicious circle, but yes, the lack of support is of course there, which kind of puts things at a standstill, or at a low pace, I would say. The other point is about shared devices. We talk about children probably having their own devices, but in many parts of the community children don't have devices.
So they share with their parents, and when they start exploring their parents' devices they come across content that is not appropriate for them. That is one of the challenges we face. Also, in low‑income groups especially, or in marginalized communities, one phone in a family is a big deal, as is having internet on that phone, and parents may be watching content that is appropriate for them but not for the kids; through advertisements and other channels, children come across content which raises questions in their minds, but they are unable to ask because of the sensitivity that we have in our cultures. To the point of having multistakeholder interactions ‑‑ sorry for taking too long ‑‑ I think it is about both knowledge and experience. The ones who have expertise have the knowledge, but the point you made about involving children is about experience, so we need to have knowledge and experience both in the same room to talk about how the policies need to be shaped and adopted. Thank you.
>> MILLENIUM ANTHONY: Thank you very much. I will take one more from here and then I will move online and get back to my panelists and we can get back to the floor again.
>> AUDIENCE: I can talk?
>> MILLENIUM ANTHONY: Yes, I'll come back to you.
>> AUDIENCE: Yep, thank you. My name is Joshua, from Uganda, from the Internet Society Uganda chapter, and I'm also in the business of building systems. In that sector, to your point, I think one of the stakeholders we often forget is the person who actually builds the system.
We have someone from Roblox.
My kids are fans.
>> NIKKI: I don't build that system though. You don't want me building that system.
>> AUDIENCE: That's a good point too. Well, the people who build the systems often get forgotten in these discussions, and when you're building such a system, you are just looking at a target to get the software out by this and this day. So most of the time we're saying the system should do this, should do that, two‑factor authentication and all these other things, but if I'm working on a deadline, trust me, I'm going to leave all that stuff out. I think one solution is to involve the open source community in these discussions, because that is one of the ways you can support the development team, so that all these measures we are talking about, these things that should be there by default, are somewhere anyone can just pick up and use. Then we have applications across the board that are safe for all our users. Those are the people I think we need to involve in the IGF: the open source communities. Thank you.
>> MILLENIUM ANTHONY: Wow, thank you so much, Joshua. Coming to you, Tiku: we have global regulations that are set, but we also have different legal and cultural frameworks in our specific regions and countries.
How can we balance the global standards of online safety with our own cultures and our own legal frameworks in our specific countries? Is there a way that we can balance this?
>> SABA TIKU BEYENE: That's a very interesting question, in the sense that even when we have global standards, I think for children's safety we might not have a universally accepted global standard. Why do I say this? Kids from my village somewhere deep in Kenya are not as privileged as, maybe, kids from India. I use India in the sense that the parents are already technologically aware, and I use the word aware loosely, because it is not to say that all parents in India, for example, have access to the technology. But the parents in India are more aware of the context in which they can adopt technology, and we've seen India as a success story in how they deploy technology in all spheres of life. The same is true if you want to compare the global north and the global south. Africa is also very unique, if I were to speak about Africa, because of the legal frameworks that we do and do not have; you find different African countries struggling with even basic frameworks such as data protection, computer misuse, or cybercrime laws. So I think we cannot adopt a universally accepted kind of policy, but I can share a few pointers that can perhaps guide us towards the response that you are looking for. One is: are we able to develop region‑specific policies? If we are looking at Africa as a region, in the context of Africa, and we do that from an African Union point of view, then we can scale it down, and that becomes a broader framework through which some cross‑border crimes and issues can be addressed. But we really need to break it down further ‑‑ West Africa, East Africa, southern Africa ‑‑ because the needs and the cultures of West Africans and southern Africans are different. Perhaps we also need to look at cross‑border collaboration. How do we go beyond the legal framework that Kenya has, for example, look at other countries, and harmonize these frameworks so that whatever is illegal in Kenya is also illegal in India, so that if somebody is trying to perpetrate a crime from India, it is treated no differently than in Kenya? Then we can use this kind of harmonized framework to support cross‑border collaboration and enforcement as well. Again, on the issue of legal frameworks: how do we build the capacity to develop them and ensure that we have comprehensive cybersecurity and data protection laws that also consider local context? If you look at Meta, I think one of the key things they grapple with is the local and cultural context of issues. What is hate speech in Kenya is not hate speech in another country; even the people doing content moderation, when you flag something as hate speech, may not see it as hate speech because the context is different. I think we also need digital education and digital literacy, which means promoting accessible digital education in schools, but not just for kids ‑‑ how do we look at parents as well for digital education? I think disparity is also a big issue. There are some countries that are rich and some countries that are not. We find different countries grappling and struggling with catastrophes such as floods and drought. How do you ask them to put money into this when they are just trying to keep their people safe?
So I think one of the ways is also to address the resource issue, to the extent that we can support countries with limited financing and limited technical resources. You come to Kenya and almost everybody is a geek, but then you find a country like Madagascar or Mozambique, where the number of cybersecurity professionals is very small.
If you look at different reports, they say Africa has a shortage of cybersecurity professionals, to the extent that up to 20,000 professionals a year are needed. What does that mean if you are contextualizing that in terms of the resources you can develop to support these kinds of issues and ensure that we are building a standard that is also localized? This expertise is something that we should export. Joshua here, with all his technical expertise, can be supported to support other people, because if he sits in just Uganda or Tanzania, I don't know, how do we take advantage of such expertise? I'll stop at that. Thank you.
>> MILLENIUM ANTHONY: Wow, thank you so much. Do we have any contributions from online? Okay, I will get to the online contributions. Back to you, Nirvana: you have worked with kids, and you have worked on different projects on how to protect children and youth online. Can you share with us any strategies that you have found effective in promoting the active participation of children and parents in securing a safe digital environment?
>> ANTONIA NIRVANA GREGORIO LIMA: Over the past years I have been conducting research and working on issues related to internet governance with young people aged 8‑25 who are part of the program, and I have had the opportunity to teach workshops on responsible internet use to young people aged 16‑18. However, I must admit that, unfortunately, I haven't yet worked directly with children, though I believe in the importance of doing so as soon as possible. This is not just my responsibility as a researcher and popular educator, but also one for civil society and government organisations. Children and teenagers are facing serious challenges, and as adults we need to address them. Since 2018, the World Health Organization has recognized digital addiction as a disorder, sounding the alarm about the excessive time children spend on screens. It's more than clear that we need to develop and implement effective methodologies focused on media literacy for children, adults, parents and educators. In Brazil, along with other countries in the global south, we must begin integrating media education into school curricula, from primary to high school; I think this is a great start. Being connected to the internet is a reality, as all of us know, and for this and future generations we must ensure that they are equipped to use the internet in ways that serve their best interests. So we are all responsible for them.
>> MILLENIUM ANTHONY: All right. Thank you so much. I now welcome questions from the floor. If you have any questions for any of our panelists ‑‑ yes, I will take you, and I think you were first, and then you, and then we'll do one from online. So one, two, three, four, five. It's a contribution; allow me to take the contribution from online first.
>> AUDIENCE: (Audio distorted). One of the main challenges is the difference in enforcement between the different regions in Brazil; in the north, digital enforcement for children, for everything, is very different from enforcement in the south, as we (audio distorted) the Amazon and the south. But my question, which any of you can respond to, is: what is the main challenge for stakeholders when designing a tool for children, considering the different legal frameworks around the world? Is there some main convergence despite the differences between countries? Do you think there is something in common between Brazil and the United States, or do you think we don't have that point in common between all the laws focused on children and youth? That is the question.
>> MILLENIUM ANTHONY: Are any of my panelists ready to respond to that in one minute?
>> NIKKI: So I'm not a lawyer, so I can't speak to commonalities in the actual law. But I think often what companies try to do is synthesize much of what is illegal in many places into what is often called their community standards or community guidelines, and that is kind of the governing document for a service. In that document you'll usually find that you can't perpetrate illegal acts, and that's defined broadly to cover different geographies; you can't use hate speech, you can't commit fraud, these kinds of things, which tend to be common across geographies. What I think is much harder, which Tiku touched on, is the speech issue. What may be illegal to say differs: in France, by law, you cannot deny that the Holocaust happened; in the U.S., in theory, you can. We would all agree it's hateful, but you actually can do that. And I think it's much harder to moderate around the speech issues, which doesn't mean that the legal stuff isn't difficult, but the moderation piece is very, very difficult. A lot of times the governing documents, like the community standards or community guidelines, try to find those places in common in the law.
>> MILLENIUM ANTHONY: Thank you so much, Nikki.
>> SABA TIKU BEYENE: In addition to what my colleague has said, it's also important to note, or to remember, that we cannot regulate online what we are unable to regulate offline. What applies offline applies online. If we have freedom of speech here, to the extent that we can generally say many things which we let pass, how then do I curtail you from typing the same thing? If I can speak to you and say certain things, it becomes difficult to curtail it or put a standard on it just because it's written.
So, in my submission, I think it starts with applying these standards offline so that they are then applicable online. Otherwise, if we look at it from just an online lens, it will be difficult to enforce, because that is not what happens offline.
>> MILLENIUM ANTHONY: Perfect, thank you, please can we take a contribution from online and then come back onsite? Less than a minute if you can.
>> AUDIENCE: Yeah, yeah, thank you very much. I am speaking from an NRI, and one of the main things I want to contribute to this session is having countries implement an online safety act, as the United Kingdom did in 2023. That online safety act protects not only children but adults. When you have an online safety act, most of these problems we have in navigating online safety for children and young people, which happen on most social media platforms, mean the social media companies are now held more responsible. And yes, it might be difficult and it might take time, because everybody is doing different things, but I think putting in an act within the context of each country is the way to go about it. In most of the global south, where most children really have internet access is in their schools, whether public or private, so you can also introduce forms of education on online safety and task the schools with that. If you look at the UK example, on UK.gov you'll discover that it's very strong in protecting children, and I think that is the way we should go forward. Thank you.
>> MILLENIUM ANTHONY: Thank you so much for the contribution. Please allow me to take the second question back there on site, the second and then the third. Please let's try to use less than a minute to summarize our questions. What?
>> AUDIENCE: My name's William, I'm from South Africa, and we do a lot of work with children. In fact, what we've been doing, coordinated by UNICEF, is bringing together a group of all the entities that work in online safety to try to build common approaches, and just last week we presented South Africa's guidelines for dealing with online safety. There's an absence from that group, and it's Roblox, and Roblox internationally has come under fire quite a lot because children find it very easy to get around the measures that you've currently got in place to protect them.
So that comes to a point of political will, and of course the other big one is age verification, which is what the Australian government is saying: until and unless you can demonstrate these systems, we're going to say not under 16. So I'd love to get your feedback on that, thanks very much.
>> MILLENIUM ANTHONY: You want to take that, Nikki?
>> NIKKI: Sure. So, I mean, do you want me to respond to your question about Roblox or in terms of Australia? Yeah, so let me start by saying that I think we are not perfect; Roblox is not perfect.
And I don't think any tech company is. But I do believe that we are safety first in trying to get the right outcomes and putting children at the center of what we do, not just because it's the right thing to do, although it is the right thing to do; it doesn't even make sense from a technological or business perspective not to protect children, because our service would cease to exist if we didn't do that. With that said, I think there is a lot of fair ‑‑ I don't know if it's criticism or sensitivity ‑‑ about the role of tech companies in society and what they need to come to the table to do, and I think that has probably been building up for quite a long time. Companies very often will be covered in the press, but the reality is that we actually just announced a whole set of parental tools so that children can't talk one‑on‑one outside of games, to prevent grooming, and that's a default, like what I talked about before, and I think we're constantly trying to update that. But we're also a company of 2,000 people, so if you compare us to a huge social media company, we still need to grow in scale. So I think the answer is that we are not perfect, but I do think that we are consistently focused on the right outcomes. I think the question about age verification is an important one. If there were a technology that could perfectly verify age, it would make sense for companies to use it to the extent that they could. The problem is that those technologies are either easily circumvented, particularly by children who are tech savvy, or they might work very well but collect a ton of biometric data about the child or the person in question, and so then it's a question of how you weigh that. Is it more important to keep them off a service and prevent access totally, but collect a lot of their private biometric information in the process? I think that is the central question that policymakers are struggling with. My hope is that over time these tech solutions get better, but I don't think we have a great answer right now; I think age verification continues to be the question that really troubles people across tech and policy circles and in the public sector.
>> MILLENIUM ANTHONY: Thank you so much, Nikki.
>> AUDIENCE: Hello, can you hear me? My name is Leanne, I'm the executive director of the 5Rights Foundation and we are a global NGO working on children's rights with the mission to build the digital world that children and young people deserve.
And we represent and work with children around the world. I have a couple of points to make. First of all, to what people have said in the room: it has to be clear that this is a global problem, and the issues cited in India could just as well be in Uganda or in the U.S., and they certainly are. The reason that children are having very, very similar experiences online around the world, and facing the same risks and the same harms, is because they are using exactly the same products. This gets us to the point of whether there is a global solution and whether we need global standards, and the answer is very, very clearly yes. There are companies which represent 25% of local GDP, and there is a massive power differential between those companies and the children that we are talking about. And indeed, Nikki, I actually agree with what you have said: this is not for the parents to deal with, this is not about digital literacy, this is not about children educating children to navigate an environment controlled by a number of companies; it really is about safety by design and about corporate responsibility. This is exactly where we can have global standards, and to Joshua's point, this is where designing with certain basic principles and standards in mind is very, very possible, and we shouldn't overcomplicate things. Indeed, the same applies offline as online: these are products, and we do have product safety regimes, we do have privacy regimes, and these should apply online. Much of that is already there. There is the Convention on the Rights of the Child, and there is General Comment 25, which sets out how those rights apply. We have the Global Digital Compact. There are regulations: a colleague online mentioned the online safety act, and there are the age‑appropriate codes, which Roblox has endorsed, being one of the first endorsers of this code.
They set out systems for companies to follow that are indeed not prescriptive and are tech neutral. So that is there; we must apply it. The big problem that remains, I think, is that colleagues who work on policy and who say the right things are very distinct from the business interests and the people at the top who are making the business decisions, which are not focused around child rights and safety and are not fed back to the designers who need to implement them. Thank you.
>> MILLENIUM ANTHONY: Thank you. Saba, can you read for us the question from online? We have less than one minute, so if you have any remaining questions from the audience, you can ask them at the end.
>> AUDIENCE: So Omar is asking: what do you think are the most important steps we can take to ensure kids feel safe while navigating the digital world, and what are the measures?
>> MILLENIUM ANTHONY: Anyone who can summarize in less than a minute?
>> There must be an element of trust, and trust comes from a point of confidence, and feeling is not something that we can point at. So I see that if they are confident, then the aspect of feeling safe comes in. And confidence is not just in using the technology. If I give my phone to a three‑year‑old today, even if they have never used an iPhone, in no time they will figure out the App Store, download a game, teach themselves to play it and return to me saying, I can't get past this level, and I say, what even is this? That comes with literacy ‑‑ literacy not just in using, but literacy in security, literacy in knowing that something is a potential threat. I have seen kids playing a lot of online games, and sometimes the person on the other side may be an adult trying to push these kids in a certain direction: come and do this, or if you don't do this, I will give you this. So how do we make them feel safe if they are not aware, from a point of literacy? It's something that might not be achieved if we don't look at literacy, and security literacy: cybersecurity, cyber hygiene.
For me it's easy to know what to do when I see someone trying to spam me, but my twelve‑year‑old daughter may not, unless I have told her: look, watch out for this, and when this happens, this is what you do. That's what I think.
>> MILLENIUM ANTHONY: Wow, thank you so much, Keith. Thank you so much to my panelists. Thank you for your contributions. Thank you to the audience. Thank you for being very interactive. We are out of time so thank you very much. Please connect to our panelists. They're going to be here so if you have a question you can connect with them after. Thank you.