IGF 2025 - Day 3 - Plenary Hall - High Level Session 4 - Securing Child Safety In The Age Of The Algorithms

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> ANNOUNCER: Please welcome to the stage, the moderator, Ms. Shivanee Thapa, Senior News Editor, Nepal Television.

>> SHIVANEE THAPA: A very, very good morning to all the, I mean, distinguished delegates, colleagues, participants present here.  Thank you, thank you so, so, so much for making it here this morning, despite the musical blast we had last night.  Thanks to the Norwegian Government and all the people who put these good things together, and for the very, very wonderful arrangements here.

With that said, I'm Shivanee Thapa, ladies and gentlemen.  I am a senior journalist working with the state media, Nepal Television, and feeling so, so profoundly privileged to be a part of this very important session of the 20th IGF here in Lillestrøm.

To our online participants as well, a very warm welcome.  Just a quick mention that we are at this session discussing one of the most pertinent issues of our times, an issue which no longer is an emerging risk.  It is urgent.  It is complex.  And it is deeply, deeply personal.  That is ensuring child security in the age of algorithms, ladies and gentlemen.

And we all know algorithms are not passive tools.  We know that quite well, right?  They are, in fact, very, very active architects of children's digital experiences, shaping not only what children see, but what we all see, how long we stay on our screens, and increasingly how we feel.  And today, as we discuss child security and safety, it is what these children see that concerns us.

As a practicing journalist, I am very much committed to covering issues of public interest.  And as a mother of a teen, my 14-year-old is seated somewhere amongst you.  And both professionally and personally, I see so clearly that what's at stake at the moment is not just screen time.  It is childhood itself.

Ladies and gentlemen, therefore, this session today that we have tailored for you will unfold in three key arcs.  We will first hear from experts on the emerging risks linked to algorithmic curation.

We will then reimagine what a child-centric digital ecosystem could look like, from our perspectives and from their expert perspectives.

And finally, we will make an attempt to explore what accountability must mean in this particular space, from governments, from platforms, and from all of us as a community of practice.

I will be joined in this conversation by very, very distinguished personalities and stalwarts who have been at the reins in getting things rolling in this particular domain.

At this panel today, I will be joined by the Honorable Minister of Digitalization and Public Governance of Norway, Ms. Karianne Tung.

The Minister of STI Sierra Leone, Ms. Salima Bah.

Ms. Leanda Barrington-Leach, Executive Director, 5Rights Foundation.

Head of Government Relations and Public Policy, Europe, TikTok, Ms. Christine Grahn.

Director for Future Networks, DG Connect, European Commission, Mr. Thibaut Kleiner.

Senior Director, Policy Development, Roblox, Ms. Emily Yu.

And last, but not least, Director of Innovation, UNICEF, Mr. Thomas Davin.

Ladies and gentlemen, let's give a round of applause as I welcome on the stage my very distinguished members of the panel.

Such a pleasure it is, Madam Minister.  A warm welcome.  A very warm welcome.  Please take your seats.  Such a privilege it is for me to be placed across from you as we discuss a very, very pertinent question of our times and, of course, for the future.

And with that, our distinguished panelists have joined us here, all set, all poised to engage in one of the most defining challenges of our times, ladies and gentlemen.

First and foremost, to anchor this session with clarity and purpose, I have the pleasure of inviting Ms. Leanda Barrington-Leach, the Executive Director of the 5Rights Foundation, to open the conversation and get it rolling through this opening presentation.  You have five minutes.

>> LEANDA BARRINGTON-LEACH: Good morning.  At 5Rights Foundation we speak on behalf of children around the world.  Thank you for inviting us and thank you for being here.

Speaking of the digital world, a child not long ago asked me, why won't adults stand up for children?  You watch everything we do online.  You nag us to get off our devices, even though you stay firmly glued to yours, and now you want to outright ban us.  When are you going to stop making out that we are the problem instead of the system?  Why don't you stand up for us?  She had a point.  Digital devices are today one of the leading causes of family disputes.

Google's head of families recently said that parents are spending upwards of 4 to 12 hours a week trying to manage their children's online usage.  Day after day, they are fighting and losing.

Children also are fighting and losing.  They are losing their control, their sleep, their ability to make connections, to pay attention, and to think critically.  They are losing their health, sometimes even their lives.

I am not exaggerating.  I am not generalizing.  Consistently around half of children surveyed say they feel addicted to the Internet.  Nearly two thirds say they often or sometimes feel unsafe online.  More than three quarters say they encounter content they find disturbing, sexual content, violence, hate.  A quarter to a third are bullied online.  Half experience sexual harms, a quarter sextortion.  Eating disorders, child sexual abuse and suicide are going through the roof.  The acceleration of AI is now set to supercharge these risks and harms.

But it does not have to be this way.  Children's digital experience is not a result of the technology itself, but it does reflect the priorities of those who own, build and deploy it, including AI.

To change children's experience, those priorities must change.  Today most of the services where children spend most of their time are designed with three primary purposes, all geared towards revenue generation.  Maximize time spent, maximize reach, and maximize activity.

Typical design features used to reach these objectives include push notifications, infinite scroll, likes and in-game purchases, random rewards, and making it easy to share, to connect with friends, or to follow suggestions.

5Rights' Pathways research showed that social media accounts registered as children were all subject to messaging from strangers and illegal or harmful content within hours of being set up.  While these accounts were targeted with advertisements for games, sweets and suchlike specifically for kids, they were at the same time also recommended harmful content, from sexualized to pro-suicide material, by algorithms that weight negative or extreme content five times higher than neutral or positive content.

Children go from a simple search for slime to porn in just a single click, or from trampolining to pro-anorexia in three clicks, and are nudged to self-harm in 15 clicks.

It is clear that the problem is a feature, not a bug, of the system.  Indeed, whistleblower reports and leaked internal documents show how time and again tech companies are aware of the harm they are causing children and choosing to do it anyway.  One platform that I shall not name, as it is no particular outlier, sees children as the golden demographic and looks to hook them young.  Internal research notes that it takes the equivalent of 35 seconds -- 35 minutes, sorry, on the app to form a habit, and concludes, I quote, compulsive usage correlates with a slew of negative mental health effects like the loss of analytical skills, memory formation, contextual thinking, increased anxiety, as well as interfering with personal responsibilities like sufficient sleep, work or school responsibilities and connecting with loved ones.

It does not need to be this way.  The digital world is 100% human engineered.  It can be optimized for good, just as easily as it can for bad.  It should be optimized for children by design and default.

Children's rights as set out in the most ratified human rights treaty in history must be respected including by the tech sector.  Children have the right to safety, to privacy, to freedom from commercial exploitation, to play, to participation, and meaningful agency.  The best interests of the child must come first.

Tech exceptionalism has to end and, like all industries, the sector must abide by established laws, reflect societal norms, and be held accountable for its impact through democratic oversight.  How to do this is already well established.  General Comment No. 25 to the Convention sets out what states and businesses need to do.  The age-appropriate design code principles, embedded in law from the UK to Indonesia, the EU to California, set out enforceable regulatory requirements.

Technical standards and protocols provide detailed practical guidelines for innovators.  Rights-respecting, responsible innovation for children is perfectly feasible.  But it means ending some awfully profitable practices.  Resistance is, therefore, unsurprisingly fierce.  Perhaps you have witnessed some of it on this stage over the last few days.  The arguments range up to claiming that AI moves too fast to be regulated.  The tactics range from never-ending legal battles to threats of shutdown.

The unavoidable question, therefore, is: who will set the rules and decide what good looks like?  Will we continue to allow private for-profit tech corporations to hold increasing power over all areas of public and private life, unanswerable to governments and citizens?

Children are asking you to stand up for them, for their rights and a better, fairer future.  So, I ask you today, innovators, will you stand up for children?  Policymakers and regulators, will you stand up for children?  Citizens, fellow human beings, will you stand up for children?

If we stand together with and for children, we can and will build the digital world they deserve.  Thank you.

(Applause)

>> SHIVANEE THAPA: Thank you, thank you so much, Leanda.  Thank you for grounding us so, so powerfully and with this we believe the tone for the conversation has been set so, so right.  Thank you.

Let us now open the conversation to the full panel.  We have just heard from Leanda, in this very opening presentation, how algorithmic curation designed to personalize can sometimes unintentionally shape how children think, feel and interact.

Now, based on the current evidence and lived realities, what sort of emerging risks should draw our collective attention, from regulators, tech labs and families alike?  That, I believe, is the very first question we want to contemplate during this session.

I want to begin with Madam Minister.  Madam Minister from Norway's policy vantage point, what do you see as the most pressing risk children face in algorithm governed digital spaces today?

>> KARIANNE TUNG: Thank you, moderator, and I also would like to take the opportunity to say thank you to Leanda for bringing the children's voice to the stage.  That is really important.  And thank you also for addressing this really important question and one of the most pressing issues of our time.  That's child safety in the age of the algorithms.

As technology continues to shape children's lives, from the videos they watch, to the games they play, and the information they consume, we must really ask ourselves now, are we doing enough to protect them?  And I don't think we are doing enough.  Because algorithms, they are powerful tools for personalization and engagement, but they do also expose children to harmful content, to bias, and to manipulation.  They can shape behavior.  They can influence choices, and they do serious damage when it comes to mental health issues and body image issues, as also mentioned by Leanda here.

So, just let me be clear on one thing.  Protecting children online is not about limiting their freedom.  It is about empowering them to navigate the digital world safely, confidently and with dignity.  And it is about ensuring that technology serves their personal growth and not the other way around.

So, in my opinion, the platforms need to take more responsibility for taking down content that is damaging and prohibited, and they also need to secure appropriate age verification, and that's something we are working on in Norway right now so that we can protect our children better.

>> SHIVANEE THAPA: Well, thank you, Madam Minister.  May I turn to Minister Bah, from Sierra Leone's context, Madam Minister, and from the broader global reflections which have been reverberating across the forums of this IGF.

What risks concern you most as we consider the intersection of technology, childhood and regulation?

>> SALIMA BAH: Thank you so much, and may I use this opportunity again to thank the Government of Norway for hosting us?  They have been fantastic hosts.  And IGF for putting on this session.  And also to all the panelists.

Really, from the Government of Sierra Leone's perspective, this is such a critical area when we think about the engagement of children online, how the algorithm recommendations are set, and the impacts that they have.  What's been mentioned across the board is also a great concern to us: when you think about addictive algorithms and the potential linkages we see to self-harm and where that leads with young people; when we think about manipulative recommendations; and also -- we think children get desensitized, because as you are scrolling through, one minute you see violence, the next minute you are exposed to sexual content.

So, with the repetition of that and the almost normalization of that, as a government, from our region, we are concerned about how sometimes those are linked to the increase in violence that we see with young people as well, and we are definitely concerned about how we protect against those.

But I would say, maybe slightly unique to us, though it applies across other regions as well: one of the most concerning things for us is the potential for cultural erasure with the algorithm recommendations, because we understand that these algorithms are trained on datasets that potentially don't reflect our diversity, or the diversity of our societies or our realities.  And that means children in Africa, from our regions, are growing up in an environment where they are exposed to languages, to expression, and just identity that don't reflect them.  It's not only the things you don't want them to see; it's the fact that they might not be seeing something that reflects our culture and our societies and our identity and our values as well.

So, with that, there's a potential for cultural erasure and the adoption of other cultures, and sometimes the underrepresentation of our regions as well.

So, as a society, from our region, that's something we are also really concerned about: as much as we know the Internet and social media are good for exposing young people to different other societies, we also want to make sure that they can see their own society reflected on these platforms as well.

>> SHIVANEE THAPA: Right.  Let me quickly move to Ms. Christine Grahn now.  With your role at TikTok, how does the platform interpret and respond to these evolving risks posed by algorithmic systems for young users?

>> CHRISTINE GRAHN: Thank you, first of all, to Minister Tung and the Government of Norway for hosting us, and to IGF for inviting us to this very, very important conversation.

We as a platform think it's really crucial that we show up and that we engage in these conversations.  And if you will allow me, I would like to just take one step back and mention that TikTok is a place that people come to because it's fun, because you can figure out what book to read over summer, where you can plan your next trip or where you can discover new artists.  And also a place where our community come to express themselves and can be creative.  And this really wouldn't happen if it weren't also a safe and inclusive place.

So, all of the work that we do in terms of keeping the platform safe, we do for our community and to be able to meet our mission, which is to inspire creativity and bring joy.

And coming to your question, I mean, I think we all agree that this is a highly complex area.  It's very, very fast evolving.  It's difficult to be a parent, adolescence can be hard.  Every child is different.  And I think the only way we can address this is with a fact-based and constructive debate.

And to address the risks properly, we need to define what the risks are.  And we do that mainly via research.  And research does support, to the Minister's point, that there are a lot of very positive aspects with spending time online, also for young people.

Also, there is research that confirms that those who are at the biggest risks are ones that are already exposed to risks in their everyday lives.  So, if you belong to the most vulnerable part of the community or society, you are also more exposed to risk online.

So, platforms like ours, we need to -- I mean, we have a huge responsibility in general to keep our platform -- our community safe, but in particular when it comes to these most vulnerable groups.  And we do this in many different ways.  We do it by design.  So, when you create a TikTok account as a 13-year-old, you will have a phased introduction to the platform.  So, we don't give access to direct messages, for example, until you reach the age of 16.  When you are under 16, your account is set to private by default.  And we have screen time limits in place for everyone who is under 18.  We have content levels to limit exposure to content that might not be suitable for younger audiences.  And we interrupt disruptive patterns.

But I also think that partnerships are key here.  We have a huge network around the world, and I really, really agree with the Minister on the importance of being a global platform for our global community, where we can also support civil society organizations to be able to reach these vulnerable communities on our platform and make sure that they speak in a voice that this community truly can relate to.

And we also support the teens.  When they are in need of help, it should be frictionless and easy to reach out to experts to get that support that they might be in need of.

There's always more to do.  As I said, we are showing up and here to always do better.  But I think an important starting point is that we keep the conversation truly fact based and work to find solutions together.

>> SHIVANEE THAPA: Right.  Thank you. 

Let me turn to Mr. Kleiner.  What would be your view from a regulatory and policy perspective at the EU level?

>> THIBAUT KLEINER: Thank you for organizing this very important panel, and I think hearing the voice of children is very important here.  And I would say that maybe we should stop using the term digital natives when we speak about children.  Because sometimes you get this idea that you can leave the children with the technology and they are very savvy and they can find their way out, and we don't understand the technology so well whereas they do.

Actually, the studies we conducted in the EU show that there is a very superficial understanding of the technology among children.  They can use the apps but they don't understand what is underneath.  They can be tricked.  And what we see is really that it is not something you can totally brush aside.  We have to take responsibility.  And I think that in the EU, this is precisely what we are trying to do.  We recognize the benefits and we try to empower children.  But we want to make sure that the technology providers take their responsibility, and not just leave this as a problem for society and for parents.  Because if you look at algorithms, they are doing serious harm.  And we have introduced in the EU the Digital Services Act, so it's hard regulation that precisely aims to give this responsibility to platforms, in partnership, because we want them to develop better ways to know what age children have online.  You know, this is something that today you can infer.  But actually we want to do more.  We want to have age verification through various mechanisms, and we will introduce, actually, an app in the coming months so that you can really verify the age of minors online.

Secondly, we are really taking this very seriously.  And we don't hesitate to actually open proceedings when we find that the platforms are not delivering on their responsibilities under the Digital Services Act.  We have already eight open cases, four with adult content platforms that are not sufficiently protecting children, but also with the likes of Meta, of TikTok, and Facebook, and I think this is important, because we don't want just to have rules that are not implemented.

So, at the end of the day, I think that today is an important moment, because we have the opportunity to keep the good parts of what is on offer, but to also collectively act on the risks.  And in that context, we are launching now a very important effort on guidelines, also for the platforms; the consultation just closed.  It will be published hopefully by summer.  As well as an inquiry, to precisely identify these risks and make sure that we take measures to address them.

>> SHIVANEE THAPA: Right.  I think this is the right moment for me to turn to Ms. Emily Yu.  Roblox, with its interactive environment -- how does it assess the unique vulnerabilities that algorithms might introduce for children on your platform?

>> EMILY YU: Safety is at the heart of pretty much everything we do at Roblox, and we evaluate every upcoming product feature with a safety-by-design perspective, making safety a fundamental requirement for everything that we do.

We, for example, launched last November a set of pretty robust parental controls that include screen time limitations that parents can set.  We have also introduced content ratings, or content labeling, within our system so that parents have awareness as to what an experience holds, and they can, obviously, permit or not permit their child to enter that experience.

We have parental control permissions in place so that if a child decides, I am interested in participating in this experience, they receive direct permission from their parents to do so.

And with regards to algorithms used for recommendation systems on the platform, our focus there is more on discoverability rather than limiting the content that is seen by the child based on personalization.  There are millions of experiences on the platform, and what we prefer to surface are higher quality experiences and newer emerging experiences that our audience may be interested in.

>> SHIVANEE THAPA: Thank you.  Mr. Thomas Davin, UNICEF has a global overview of both systemic and child specific impacts.  What does the current research suggest we should be most alert to?

>> THOMAS DAVIN: I'm afraid Leanda has stolen my thunder.  She covered the risks so perfectly.  And I want to maybe start by saying, from a UNICEF lens, we do look at technology and AI also very positively.  There's so much we can do on learning outcomes, on health outcomes, on climate outcomes, and we are bullish about trying to understand how we leverage that technology for good.  But there are a number of risks.  Maybe I will touch on what Leanda touched on, to re-emphasize some of the longer-term issues that are, in our view, societal issues that some of the technologies are underpinning today.  So one is mental health, clearly, with so many areas of mental health around nutrition, around self-harm, around so many of these issues.

Addiction is a part of that, with significant, explosive growth of addiction.  I would say, in a somewhat gendered manner, you have boys that largely go into gaming addiction, and girls tend more towards social media addiction.  So, it's not exactly the same resolution mechanism either.

The other one connected to that is social isolation, with, again, significant potential societal cost and consequences to that.

The third one I would mention is maybe a little bit philosophical, but it is that we are at risk of losing the notion, the concept, of truth.  As those algorithms bring those children into more and more of the things they believe to be true, they are more and more certain of that truth and of holding that truth, and they are more and more reluctant to connect or open up to others who may say, well, that is not my truth.  And so we are, to some extent, jeopardizing that whole concept of what is truth, what is fact, because everybody, including adults, is being fed what we are really connected to or resonating with.

Maybe one issue that Leanda didn't touch on, but which will be interesting and important for us as a society to really dig deep into, is the impact on neuroplasticity.  What is going to be the impact of all that screen time and fast-paced connection on children's brains and their abilities in different ways?  We don't really fully understand that yet.

And we believe it should be a priority in terms of research and longitudinal studies to understand.  Maybe there are some gains, we are not certain.  But there are probably also some things that we are losing that play out in adults.  I don't know about you, but I read less and less because my brain is less and less patient about longer-term focus, and that certainly is true for my own children.  I have two teenagers.

The last bit I would talk about is maybe agency: the risk of children feeling less and less able to have voice and agency over how those technologies affect them, impact them, and maybe direct some of what they have access to or what they can say.

So overall, I think we need to treat this as a public health issue.  This is an area that we don't fully understand.  This is an area that will have societal consequences and very likely significant economic costs if we don't manage it appropriately.

>> SHIVANEE THAPA: Thank you so much.

So, what we are hearing across the sectors represented in this panel is, of course, a lot of optimism and commitment in your actions and in your thoughts.  And also a very prominent shared concern that opaque algorithmic systems are certainly influencing children's mental health, social well-being, development and so many other aspects, in ways that certainly demand more coordinated attention and action.  We believe that is what we heard in this round of conversation.

Now, let us pivot from identifying the risks to imagining solutions.

If we were to re-engineer or if we were to redesign the social media environment with children's well-being at its core, centrally placed, not as an afterthought, but as a foundational principle, what would that look like; more importantly, how can we ensure that young people are not just consulted, but meaningfully involved in shaping systems that govern their digital lives?  May I begin from Ms. Leanda.

>> LEANDA BARRINGTON-LEACH: Thank you so much.  Yes, the first thing I would say to your second question, how can we ensure that young people are meaningfully included, children need to be at the policy table.  There are no children in this room today, and generally they are completely left out.

I would also say that I am talking about children, and we often replace children with young people, and often that can be like 25- to 35-year-olds.  Of course, it's useful to hear their voice, too, but that doesn't mean that we can ignore the youngest.

The second question is about what are the right principles to put at the centre.  I think I mentioned a few.  And so have others.  So, the first is really privacy and safety by design and default.  What that means, a lot of the time, is turning things off, making sure that children's experience stays private and they have real agency and choice.

A lot of this is about process.  I left at the entrance some copies of the Children & AI Design Code.  What you will see is it's a technical protocol, a step-by-step guide to asking the right questions, consulting the right experts, and then testing and documenting who decided, based on what criteria.

A lot of the times asking the right questions is the most important thing, rather than other people outside prescribing potentially what is the right answer.  And this is a process which leads to risk assessment and risk mitigation, as I say, by design and default.

A few just guiding principles: don't shut children out.  And don't, please, put the burden back on parents via parental controls, for example.  They are not working.  We know they are not working.  Or onto children.  We heard they are not digital natives; digital literacy cannot solve this.  So age verification, parental controls, controlling content and digital literacy, these are not the solution.

>> SHIVANEE THAPA: Let me turn to Christine.  Christine, given TikTok's massive engagement among youth audiences, the question of embedding child-centric values into your algorithmic infrastructures certainly is not theoretical, it's operational.  What would be your inputs to this question, or this thought, of reimagining?

>> CHRISTINE GRAHN: Well, we very much agree with the concept of safety by design and, on this point, turning things off as a default for the younger segments of users.  So, I mentioned a few examples, as an answer to your previous question, around not allowing access to direct messages, keeping the younger teens' accounts private by default, and having these settings off as a starting point, and then over time, as they grow older, introducing them to more and more features on the platform.

So, we very much agree with that kind of base concept.  And we also agree with the importance of listening to the community and also the younger users on our platform.  And, actually, last year, I think as the first platform, we introduced a Global Youth Council.  We have representatives from 15 countries around the world, Brazil, Nigeria, Poland, Indonesia, just to mention a few.  And it's a forum where they can, in a setting created for them, share their views with us directly.  But also, and maybe even more importantly here in this conversation, it's a way for our most senior leaders to also hear from the youth community directly. 

So, at our first Global Youth Council, our CEO was present.  I am going to be present at one that we have in a couple of weeks.  And I think this is a really, really important platform for that conversation.

We also work with researchers to indirectly hear from younger people via research.  We don't just listen.  We also change, and I think that's an important next step, because otherwise, there's no point in these listening exercises, right?

To give one concrete example, we worked with a British NGO, Internet Matters, that spoke with teens and parents, who told us that authenticity and belonging are very, very core parts of their online experience.

And in response, we made some global changes to age restrict certain beauty filters that would alter their appearances so that they could feel that authenticity on the platform.  So, that's kind of a very concrete example of what we do, what it is that we hear.

We also find other ways to listen to our community.  We know that teens come to TikTok to learn.  And we want to, obviously, encourage that.  We actually see reading go up in polling in these segments of society.  So, we have huge projects around BookTok to really encourage that.  We also make product choices, conscious product choices, to capture this educational interest.  So, we have rolled out a STEM feed, which captures science, technology, engineering and math content, fact-checked so it's kind of pre-vetted, and it's on by default for everyone who is under 18, so it's sitting next to your news feed.  And we see in the numbers that this is very appreciated.

So, we really do, and we have every interest, if you think of it, to listen to the community and really adapt.

>> SHIVANEE THAPA: Right.  That's incredible.

Let me turn to Minister Tung now.  From a policy and governance standpoint, Madam Minister, building child-centric systems perhaps requires not just principles, but mechanisms of inclusion, right?  What frameworks do you think can guarantee their voices are heard in the process?

>> KARIANNE TUNG: Thank you.  I think it's necessary to change the logic behind the platforms.  We need to get away from addictive designs, and we have to implement models that really protect the children, I believe.

In Norway, this might sound easy or natural or whatever, but in Norway, we had a white paper where we said that children aren't goods or commodities, because I think that's the main issue: that children are seen as goods on these platforms, and we need to get away from that point of view.  And we need to be sure that there are some principles that are the foundations for the algorithms on the different platforms.  That is about making sure that we are standing on the UN Convention on the Rights of the Child.

We need to be sure that we have openness and that the platforms are transparent and understandable for everyone.  You have to understand how you can choose your content, both as a parent, but also as a child.  And we need to have age-appropriate design, and also to ban behavioral advertising on the platforms as well.  And we need to stop the profiling of children on the platforms, because they are not mature enough to take good decisions on their own behalf.  And the children's well-being has to be the thing that we put first when we are letting our children live their lives on the social media platforms.  Because as I said in the beginning, I really believe the platforms can also empower the children.  And then we need to hear the children's voice, and the children are screaming out, "Please protect us!"  We have to listen to the children, people.

>> SHIVANEE THAPA: This certainly shows there's a very, very clear consensus among us that child-centric design certainly begins with intention.  But it must also be followed by inclusive design processes, in fact, that reflect the lived realities and voices of young users themselves, right?

So, with this, let me hop on to the next question.  As regulation gains momentum globally, many governments are now exploring or implementing regulations aimed at protecting children online.  We just heard some very, very great concerns reflected during the course of this discussion.

The challenge right now, I believe, lies in translating policy into platform-level action.  So, we ask, what are the promising policy approaches, and what role should companies play in proactively aligning their design choices with children's rights?

May I turn to Madam Minister, Ms. Salima Bah, from Sierra Leone: how do you see this from your regional and national vantage point?

>> SALIMA BAH: Thank you.  Definitely, I think there is a clamor for regulations when it comes to just the online space, even for adults, to be honest, and definitely for children.  I think there is a mad scramble going on among governments and policymakers; everybody is looking across to see what's worked in other places, what can I adopt.

And I think it's a testament to just the ever-evolving nature of technology.  Even if you have one solution -- I think at some point everybody thought that parental controls were the greatest thing, and now we see that changing with behavior.  So, I think absolutely we see that, and that's definitely true for our region as well.

Even though I think our region actually has been a bit slow to this party, if I could call it that, this child protection party.  And I think it might be a result of us being slower to the Internet age.  I think we were grappling with issues of connectivity, we were grappling with issues of affordability, we were grappling with issues of inaccessibility of devices.  But, obviously, now we see there's a growing expansion of the Internet in our region.  There's growing access to devices.  And also our youth bulge is growing.  So now we are also feeling the effects, as more African children now readily have access to these platforms and to these devices.  We are seeing the impact in terms of a rise in cyberbullying and so many other factors.  We take the good parts, of course, and we are not saying it is all bad, but we are also seeing some of the negative effects starting to impact us, and the conversations around what do we do to ensure safety.

In Sierra Leone as well, one of the things we are looking at is online safety legislation, specifically looking at children and how we ensure that, especially when we look at some of the negative impacts that are not really within our region yet, we try to see how we regulate in anticipation of them, because we have seen how they have impacted other areas specifically.

So definitely, as we are doing these regulations, we are looking at potential best examples, to see what other countries have done; the GDPR, we look at that.  We look at so many other regions, just to see what works, what hasn't worked, and how we can adapt some of these for our own specific needs as well.

But I think definitely one of the key aspects of it as well is that we understand that regulations alone, policies standing alone, really won't get you there.  I really think you need to work with the companies, and not just the big tech companies, by the way.  We also need to work with the startups as well, right, because I think maybe with the big techs it's one of those things where we are a bit more reactive now.  The platforms were already designed, and now we are reacting to see how they can introduce these safety measures in. 

But I think with the tech startups coming up, it's about how we work with them now, at this stage, so we ensure, as everybody has been saying, that it is in the initial design and it's not an afterthought, something they think of later.

And we are seeing good uptake, to be honest, within our region.  And I will highlight an example.  There's a group of young people in Sierra Leone who are developing ed tech solutions with an AI component, and to my surprise the other day, I was having a conversation with them and they were telling me how, in the solution they built, first of all, the AI is a bit home grown, so they don't tap into the global AI datasets, because they wanted to make sure the children using it are getting information from our society and our culture as well, and our education system.  So, I thought that was great.

But one of the really interesting things: so the platform is like a learning management tool, and it has this feature called a learning buddy, where children can go into a chat box and have a conversation with each other and learn from each other.  And what they have done is they introduced an AI which makes sure that the only thing you can talk about is education-related.  So, we see some of those solutions already starting.  And I think it's about how, as government and policymakers, we work with companies to make sure that in the design of these solutions, we are addressing the issue from there.

>> SHIVANEE THAPA: Thank you, Madam Minister.

Let me turn to Mr. Thomas Davin.  How can regulatory and corporate efforts be better aligned to ensure that digital products respect and reflect children's rights?

>> THOMAS DAVIN: We see quite a lot of progress, depending on the nation, on some of these regulatory systems.  I guess many of them have started from looking at and tackling the most egregious issues for children, right, so child prostitution, grooming online, et cetera.  The transition to understanding how other aspects of technology can be harmful is a little bit slower, and quite often many nations are regulating technology in general without actually a specific angle on children.  Some are doing that faster than others.  Part of what we see is really effective is an alignment of the regulatory approach and, to some extent, the monitoring approach: is this working, is this implemented, what are we doing if it's not implemented?  That quite often remains a little bit vague.  So, what happens if it's not designed appropriately for children, what happens if there is addictive behavior built into the system?  What do we then do?  It's not always clear from the regulatory systems.

So, trying to get to that stage and saying, this is what happens.  Having elements to also guide companies -- and Minister Bah just mentioned this -- as many of the companies are actually willing, but maybe they are uncertain.  So UNICEF developed a digital child rights impact assessment, which is one of the tools to understand what will happen, what can happen.  And that is a participatory process, so we bring in different elements of society, children themselves, adolescents, young people, to also speak to what they feel are right design approaches and wrong design approaches, and then, again, questioning what happens once that voice is heard, and Christine spoke to that.  That's quite important.  Because many of the children we speak to say, you ask us for views, and then nothing happens.

So, that's quite often where we feel we need to really get to that stage where once you engage children, it has to be meaningful.  If you want it to be meaningful, it means action needs to be taken and needs to be visibly taken.  So visibly monitored, transparency is quite important.

Another element that we see really powerful is when companies agree to kind of allow anybody to look under the hood.  In other words, to understand, this is how the algorithm functions.  And once we are going to do this, this is what will change.  And you can then monitor that together as a society, again, to try to understand whether there is a sense of progress on issues we together identified.

One of the interesting things -- I think Minister Tung didn't really go into it fully, but we think the white paper on digital upbringing in Norway is quite interesting.  I think, again, together with regulation, we need to look at this as a public health issue, which means everybody needs to be on it.  Parents need to understand it better.  Many parents, once you start having the conversation with your teenager -- if your teenager has had a screen for the last five years, it's a little bit late.  And indeed it means you are going to fight, probably, a losing battle.  Again, having two teenagers, I know what that means.

So, really guiding parents and trying to understand what are maybe the steps that you need to have in mind is going to be quite important.  On bringing children in, there is an interesting initiative in Scotland, where they have brought the Children's Parliament of Scotland to act as a mirror to the adult parliament on AI specifically.  With the institutes and the Scottish AI coalition, they are essentially going into various use cases of AI and bringing back to parliament issues and recommendations on legislative pathways to tackle that.

So, there's multiple areas of work that we can bring that together.  Part of what we are trying to do is kind of bring that knowledge back and offer that as a panel of options for different countries and societies to pick up.

>> SHIVANEE THAPA: Building on these very insights, may I turn to Ms. Emily Yu.  How is Roblox integrating child rights thinking into its design decisions, even ahead of regulation?

>> EMILY YU: Yes, again, we have a programme called Trust by Design, in which we, basically, take at the requirement level what the fundamental rights of children are and how we then incorporate them into the product features that we will later publish to the platform.

We also have recently launched a Teen Council, as of earlier this year, where we get a lot of feedback from teenagers throughout the world in terms of if we have additional or updated policies, what their feedback is on that, and find out from the teens themselves what they are interested in and what they want to move forward with.

I think what's really interesting about the Roblox platform as well is it's become a space in which children and teenagers are able to express themselves in areas where they maybe normally aren't able to do so in their real world experiences.  So, as an example, we have heard from a number of vulnerable groups within the Roblox platform that say that they have the ability to communicate with others that are like them that they wouldn't normally have in the real-world space.  So, we are really, really supportive of trying to foster and enable that form of communication and play, while at the same time maintaining privacy and protections in place to keep everyone safe.

>> SHIVANEE THAPA: Mr. Kleiner, let me turn to you.  Which regulatory tools or frameworks do you believe are moving that needle most effectively in holding platforms accountable to children's rights?

>> THIBAUT KLEINER: So, I would say that, first of all, regulation works.  We have had very concrete examples; just thinking about a recent case we opened against TikTok, where we found that in TikTok Lite there was some addictive behavior, and I think that we could have a positive result, because this was withdrawn from the features of the platform.

And generally speaking, this is very much our experience, that if you design legislation and you enforce it properly, this works.  You cannot just count on the good will of companies that are making profits to change their features unless they have really some pressure also coming from the regulators.

But we want to do that very effectively, and that's why we are going to publish these guidelines to improve child protection online.

I can tell you that there are several elements we want to focus on, like age assurance methods to restrict access to age-inappropriate content.  This is very much the case for adult content websites.  We have four open procedures against them in the EU.

But also elements like setting accounts as private by default, reducing the risk of unsolicited contact by strangers.  Making sure that their recommender systems also reduce the risks for children, especially these rabbit holes where you get in contact with harmful content.  Or elements linked with the possibility for children to take control, to block and mute users, and to ensure that they cannot be added to groups without their own agreement.

I think there are a series of measures we can take.  And through regulation, enforce them.

Another very important element is that we need to be serious about all this.  It's not enough to just pay lip service to the safety of children, and that's why we are about to introduce also mechanisms that are robust to identify the age of users online.  You need to know when somebody is a minor, because otherwise you are exposing them to potentially dangerous content.  And that's why I think it's great what Norway is doing with this white paper.  I think there are a lot of very, very positive elements.  But if I may, I would invite Norway to actually adopt the Digital Services Act, and actually invite more countries around the world to adopt similar provisions to the Digital Services Act.  Because this is today the only way to change the reality and to force, somehow, platforms and content providers to take this issue seriously.

And if I may, as a last point, we are here at the IGF, and somehow what is a bit sad is that we don't see enough of the emergence of services online, or digital products, that really address children as an audience.  And maybe that's the challenge for this community.  We need to really develop the right content and make it also a business model for some companies.  This is not at all the case today.  And I think this is also why we are failing, because we are just trying to fix something which is not made for children.

>> SHIVANEE THAPA: Thank you.  Thank you so much.

So, what emerges here certainly is the growing interplay between regulation and responsibility.  Policy can certainly set the floor.  But true impact certainly comes when companies integrate children's rights into their design ethos from the very outset, right?

So, building on this: ensuring that digital platforms serve the public interest, especially when it comes to children, cannot rest on only one actor.  It requires deliberate coordination, as I glean from the reflections here, across sectors, borders and, of course, mandates.

Now, what concrete steps can tech companies, governments, and international bodies take together to make this shared responsibility a reality?  This is a question with which I would first turn to Madam Minister Tung.

>> KARIANNE TUNG: Thank you.  I think that we need to act in a more coordinated way, because no one can solve this problem alone.  I think we all agree on that one.

So, first, I believe that international organizations need to be better coordinated.  I think we got great views here, both from UNICEF, the European Union, and the 5Rights Foundation.  I think we have been given a lot of important knowledge today, so we are able to be better internationally coordinated.  I think that would be step number one.

Step number two, of course, the governments need to have good regulations, good laws.  We are, as we are speaking, implementing the DSA in Norway now; we are sending out a law proposal, because I really believe in the DSA.  That is a good regulation for the European continent for keeping children safe.

And number three, the tech companies need also to take more responsibility to make their platforms safer for children.  So, if the governments do their part, international organizations do their part, tech companies do their part, I think we will take a huge step forward.  And I also want to give compliments to the tech companies for taking important steps since we started to discuss this topic of children.  But we need to do more.

>> SHIVANEE THAPA: I will come back to Minister Bah shortly.  Let me turn to Mr. Thomas Davin.  How can international agencies like UNICEF act as connectors across policy, platforms, and civil society to institutionalize this responsibility?

>> THOMAS DAVIN: So, it's not an easy act, because I think we face a reality, and several panel members talked to that, of companies really looking to the bottom line, and the bottom line is more money, and it means more people on the platform, and it means that, you know, part of what we talk about as addictive behavior is part of what brings and keeps people on the platform.  And so we are in the middle.  The international organizations do not really have the power over this.  Part of what we are trying to do is talk about an incentivized regulatory platform that enables these companies to feel: if I do it well, it's good for the bottom line as well.  And if I don't do it well, there are consequences, and the EU mentioned that.

So, really looking into this, but looking at it, again, from a win-win perspective inasmuch as we can.  I think part of what we are trying to dig a little bit deeper into, where we don't have enough data, is understanding also what is the cost of inaction.  We are very clear about the cost of inaction on smoking, on alcohol, on drugs, on sugar.  Not so much on technology.  What is going to be the economic cost societally?  And the GDP cost?  So we are trying to be better at also telling that story.

I think we are also trying to really bring best practices in ways that enable other countries to benefit from Norway's experience, to benefit from Scotland's experience, about what works, and trying to look as well at the education system, having children as actors of their own lives.  So, we did talk about digital literacy and the sense of digital natives.  We fully agree they need to be better equipped to understand how those technologies actually function, to also be able to make choices and make informed choices.  Even from quite a young age, once they start understanding all of those metrics, they are better at managing their own risks in many, many ways.  And, again, empowering parents to have that knowledge of how do I help my child, in the same way that you say, if you are on the playground, you do not go with an adult that you do not know, even if he or she says they are coming from your parents; you have a safe password or you have something.  Having parents be able to have those conversations with children and helping them understand: those are the risks of those platforms, these are the limits that we set for ourselves as a family, that you should set for yourself as an end user.  All of this is important.

But, again, it's going to take, I think, an acknowledgment that this is not just a new thing.  It is a public health issue, we need to treat it as such.

>> SHIVANEE THAPA: Thank you.  Let me turn to Ms. Yu, as I ask you to add your perspective to this central concern.  I certainly would be happy if you would add your thoughts on the partnership or co-regulatory approaches that have shown real potential in advancing child protection at the platform level at Roblox.

>> EMILY YU: One of the regulations I found to be effective was the age appropriate design code from the UK.  With regards especially to transparency for children and, basically, upholding children's rights, I think it really did a very good job of that.

With regards to trying to solve this problem, I think there are a number of things that a company could potentially do.  One is, of course, multistakeholder engagement, and getting involved in a number of industry working groups to solve problems.  For example, we are involved with the Tech Coalition and their Lantern programme, in which we are sharing signals amongst companies to take action against child sexual exploitation and abuse.  We are involved in a number of other working groups as well with child safety in mind.

Another potential solution is, of course, again, safety by design: having safety as a fundamental requirement for all product features that would eventually get published on the platform.  We feel it's very critical, given that our audience is predominantly under the age of 18, that we take care of them and have a responsibility for taking care of them on the platform.

Also engaging and participating, getting involved, or getting involvement from youth, and getting their opinions and feedback, and understanding what they are experiencing on the platform and how we can resolve any of the pain points or issues they may be dealing with.  I think that's also very critical as well.

And then finally, developing education to, basically, express not only what Roblox is to everyone, but to also provide education to both parents and children alike, to ensure their safety, protection, and privacy on the platform.

>> SHIVANEE THAPA: Ms. Leanda, what does meaningful collaboration look like from a child rights advocacy standpoint, and where are the current gaps?

>> LEANDA BARRINGTON-LEACH: Thank you.  Well, I think there are lots of gaps.  So, firstly, I will echo a few points already made.  One: indeed, governments have an absolutely critical role.  Self-regulation has not worked, and good regulation does work.  And I appreciate the endorsement also given to the age appropriate design code.  And I really applaud the European Commission for the work that the EU is doing here, and also the African Union has done some good work.

Governments need to regulate, implement, and most importantly they need to enforce, and this really, really does take resources and political will.  They also need to invest in broader awareness raising and capacity building, in particular of civil society.  At the moment, the traditional groups who support communities and ensure oversight are not aware, and do not have the knowledge and capacity to work in this area.  So, capacity building of civil society is something that urgently requires resources and support.

International bodies need to continue to promote coherence.  This is a global problem; we need global solutions and global standards.  And no one must be left behind.

Industry and technical bodies have a really important role to play, not only in developing technical standards and certification mechanisms, but also in investing in educating industry professionals, in particular engineers.  Like civil society, they really need support in this area to understand children's rights and needs.

And finally, tech companies need to play ball.  I mean, really meaningfully play ball.  They need to get their act together.  And they need to do what they do best: innovate in response to demand, understanding that in this case the demand is for products and services that will genuinely benefit children.

>> SHIVANEE THAPA: Let me get back to Mr. Kleiner.  How, Mr. Kleiner, can regulatory frameworks be designed to encourage, and not just enforce, joint responsibility among these stakeholders?

>> THIBAUT KLEINER: So, as I was saying, we believe regulation works if you implement it properly.  And this is where we also want to have a constant dialogue between the European Commission and the platforms that are subject to the Digital Services Act, but also to the Audiovisual Media Services Directive, which likewise contains provisions to protect children.

We also want to have a way to measure the age of users online.  We think this is really essential to be realistic about what is happening.  But generally speaking, this requires constant discussion and debate.  That's why we have opened this consultation on the guidelines under Article 28 of the Digital Services Act, which we intend to publish by the end of this summer.

We are also now conducting an inquiry specifically on mental health, and we are preparing some initiatives around cyberbullying to be specific about this issue.

But more generally, what I would say is that this is not a one-off.  It is a constant effort, because the technology is evolving all the time and you have new issues, new problems.  One year is a long time when you are 12 years old, which also means that you need to be very fast and very agile in reacting to new developments.  And this is why we also have in the EU the so-called Safer Internet Centres.  They are locally established, and we have had more than 35 million individual visitors to these websites, which shows the magnitude of the issue we are talking about.  Families and children want to know more; they want not only to be educated, but also to have support to address very concrete issues.  Conversations in families are difficult around this, you know.  Try to block access to games or social media with your children; I can tell you it's not easy.  And that's where I think we collectively need to provide these resources.

So, again, these Safer Internet Centres are one of the success stories of our efforts to make a better Internet for kids, as we call it.  But I think, for me, the end message is that this cannot be something we just leave to parents, or to platforms, or to governments.  We have to work together.

>> SHIVANEE THAPA: Turning to Ms. Christine, from TikTok's experience, what shared standards or governance models have helped bridge the public-private divide in protecting young users?

>> CHRISTINE GRAHN: As a platform subject to, amongst other regulations, the Digital Services Act, we actually appreciate it, because it provides a level playing field.  It also provides a forum for this ongoing conversation, not just with the regulator, but also with civil society and other actors.

And I truly, truly believe that we have a better chance of being successful if we do work together.  And it's not about shying away from the responsibility or passing it on to someone else.  It's about efficiency.

So, let me just illustrate that with an example.  I am from Sweden, and sadly, we have seen a development over the last few years where teenagers, even young teenagers, are pulled into criminality by gangs.  As an isolated player, as a platform in this instance, we can make sure to have policies in place.  We can make sure to enforce those policies.  And we can have channels with law enforcement authorities.

But this is also not going to address the root cause.  This is at best going to be a mitigation once something has already gone wrong.

So, when Bris, a child safety NGO in Sweden, was tasked by the Swedish government to increase the support available to teenagers generally, and in particular those at risk of being pulled into this environment, we decided to partner with them, and we did so with full force.  We found creators they were not previously in touch with, who could be the voice that speaks to those they wanted to reach.

We found creative agencies that could help them with the expression that would really speak to these teenagers.  We helped with the strategic campaigns.  And, unfortunately, given the topic we are talking about, it was a success; there was a need for this, right?  So we saw, or rather they saw, a 60% increase in calls to their helpline.  The campaign itself had 26 million video views and 2.7 million unique viewers.  And this is in a country with around 10 million inhabitants and about 3.2 million users on TikTok.  So, there was clearly a need, and we could really help them reach out to this group.

We also did that in a very close dialogue led by the Minister of Justice in Sweden, which gathered public authorities, civil society organizations, and some of the platforms, so that we have a forum to continue to build and iterate.  It has now grown into a Nordic initiative, so it's also something we are trying to address collectively at a Nordic level.

I also think there is a need for mutual understanding and transparency when we talk about these issues.  As a government, if you will allow me, I think it's actually quite important to use platforms like TikTok to truly understand what it is that you are trying to regulate, and how you can facilitate collaboration.

On our end, we are highly committed to transparency.  I invite everyone to spend some time in our transparency centre, where we talk very openly about the number of accounts we removed because we suspect they are underage.  We talk about the successful enforcement of our policies.  We talk about how the algorithm works, and we talk about our products in more detail, to really go a bit further in creating that understanding.

And I think that's really important when it comes to building that kind of trusting relationship, not just with our community, but also with the society around us, of which we are a very integrated part at this point.

>> SHIVANEE THAPA: As we engage in this conversation, I think it's important, even at this moment, that we pay heed to the fact that there is a huge divide between the worlds: a digital divide, and a divide in the apparatus surrounding it, which is so central to empowering people in the digital world.  I think that is one of the biggest challenges for governments, for companies, and for all the players to deal with.

So, as I make this statement, let me turn to Minister Bah.  How can countries from the Global South be meaningfully included in global coordination efforts ensuring that solutions are both inclusive and equitable?

>> SALIMA BAH: Thank you, thank you.  I would really like to re-echo a lot of what my colleagues have talked about: the critical need to collaborate with international organizations, with civil society, and with the tech companies.  It really is a whole-of-stakeholder approach.  It can't be just the job of government, or the job of the companies themselves; it really takes all the stakeholders to come on board and work together.

And maybe one of the biggest points to make is that we do believe these digital platforms are inherently public interest goods and play a significant role in the advancement of society.  When we talk about bridging that digital divide and making sure everybody has access to the same opportunities, it is because of these platforms that we see young people now aspiring beyond what is in their immediate societies and realities.  And I think we have to acknowledge that.  As a government, we use these platforms ourselves: the whole public sphere, which used to be controlled by traditional media, is now open.  As governments, all of our government sectors have social media accounts, and that's how we communicate directly with people, including children.

I was saying this to somebody the other day: a surprising, surprising stat is that 15% of the traffic in Sierra Leone goes through TikTok.  So, we think that's huge in terms of the impact it has.

And we are already working with specific platforms.  For example, with TikTok, we are working with their African office to look at how we ensure these platforms are safe, because whilst we know they serve a huge public good, we also have to ensure, as we have been discussing, that they are safe.  We are rolling out trainings, capacity-building initiatives, and platforms for more efficiently flagging some of the harmful content that we see.  And maybe for us within the Global South, one of the problems we have faced is that it felt like a long, drawn-out process, clouded a bit in terms of how we report some of this content, because within our region we sometimes understand the context better than somebody from outside would.  So, we are working with them to address that.

But maybe just a final point: even within government, I think there's a huge need for collaboration among ourselves, in terms of understanding what we are each doing.  For example, in this space you are dealing with ministers such as myself, responsible for the digital economy, but then, obviously, there's a Ministry of Gender and Children's Affairs specifically, so we have to make sure things are aligned.  You are also talking about the Ministry of Education: how we ensure that, within the education system, the digital platforms that are available and can improve learning outcomes are doing that, for example with the advent of the ChatGPTs and AIs of the world.  We have the Ministry of Information; we are working with them on issues such as deepfakes.  Deepfakes are now a significant thing, and we have to ensure that people recognize them, especially in a society that might not be as digitally literate, where some people don't even understand there is such a thing as a deepfake.  Sometimes something goes out on the social media platforms and people believe it is true.  And our government has to be flexible enough to respond quickly and to alleviate fears.

You also have parliament, which you have to get on board.  So I think the underlying message is that there's a lot of cross-collaboration that needs to happen, right across the board.

>> SHIVANEE THAPA: Thank you, Madam Minister.  Now, as we approach the close of this very important conversation: I'm sure 90 minutes cannot do justice to the huge topic we have come here for, but I'm sure it adds a stone, or a pebble, to a very silent pond; that's how I would like to put it.

Now, as we come to the concluding moments of this session, I would like to invite each of our panelists, given the time, to take less than 60 seconds each to share your final takeaway or your call to action.

So, can we begin with Mr. Thomas Davin.

>> THOMAS DAVIN: I was afraid you would start with me; I would have preferred to go last, as I'm not quite ready.  The potential of technology is immense when we look, again, at outcomes for children.  Through these platforms we have talked about, all the gaming, children learn so much.  They get exposed to so much that they might otherwise never be exposed to.  The potential is fantastic.  The risk is very genuine.  And I think we need to treat this with an urgency that we have not yet completely seized across the globe; maybe the EU is slightly ahead.  But I think that is quite important.

And in doing that, we need to build on the leaders, those who are leaning in, whether it is TikTok and the examples we have heard, or what Roblox is doing on best practices.  So that, again, we push forward in building the world we want, with children at the heart and with their voices in meaningful collaboration, influencing decisions on designs, and shifts in designs, as they keep voicing what works for them and what may not.

>> SHIVANEE THAPA: Thank you.

Ms. Emily Yu.

>> EMILY YU: I would definitely say that we want to keep children at the heart of what we do and take their viewpoints and opinions into consideration.  In addition, from a historical perspective there has perhaps been some tension between safety and innovation, and we find that we can in fact innovate while keeping safety fundamental.  We are, basically, trying our best not only to entertain and allow children around the world to connect with one another, but also to keep them safe and secure and their information private.

>> SHIVANEE THAPA: Thank you.

Mr. Kleiner.

>> THIBAUT KLEINER: I would really like to also thank the Government of Norway for hosting the IGF here.  It has already been a fantastic conference.  And I think this panel was very rich and full of very important conclusions.

So, what I really want to do, as a close, is to make a call for action, because it's not enough to understand the issues.  Now we really need to take this into our hands collectively, which means: look at the regulation the EU has introduced and expand it around the globe; put real measures for age verification in place that work; and make sure we really make the Internet a better place for children, with age-appropriate content, but also services that really target children, because technology is a wonderful opportunity and our kids are great.

>> SHIVANEE THAPA: Thank you.  Ms. Christine Grahn.

>> CHRISTINE GRAHN: Yeah, thank you.  So, we want to continue to be a place that people come to in order to have fun, explore, and learn, right?  And as I have underlined throughout this conversation, safety is a prerequisite for that.  We don't always get everything right, but we put in a huge amount of work to keep our platform safe, and we are industry-leading in many ways in this space.

But we want to continue to listen, learn, iterate, and rise to the challenge.  I'm sure that all of you who have taken the time to listen to us today have a lot of very good input as well.  So, I would like to invite all of you to come to our booth and continue this conversation, so that we can engage more.  Thank you.

>> SHIVANEE THAPA: Ms. Leanda.

>> LEANDA BARRINGTON-LEACH: So, I would reiterate what some have said: there is huge potential.  We know what needs to be done, and as Emily also said, we can do it.  But just a word of caution: do not be naive.  Big change needs to happen.  We cannot allow children's futures to be at the mercy of commercial imperatives, which at the moment they are.  And this is not going to be easy to wrestle back.

So, please keep up the good work, please implement, please enforce, please regulate.  We can help.  And please listen to children.  And please stand up for them.

>> SHIVANEE THAPA: Thank you so much.  Minister Bah.

>> SALIMA BAH: I would definitely recall a lot of what has been said about the huge potential that these technologies and platforms bring.  And specifically for regions such as ours, there is actually research on the potential of a young person who grows up with access to these platforms versus one who grows up without, and the outcomes for each.  So, we understand those.

And maybe another topic, which we haven't really talked about today but which is also a huge topic area, is the potential economic benefits accessible to young people if they can fully participate on these platforms.  But, obviously, we also have to ensure that they are safe whilst they are using these platforms.  And maybe another point, as a call to action, is really the responsible use of these platforms as well; that might be another conversation to have at another time.

So, really, the call to action, again to all stakeholders and specifically the companies, is to come on board with us and treat this issue as critical.  I know sometimes, when we say Africa, there's a tendency to have maybe one or two representatives.  But it's really a diverse region, and we hope to see that reflected in the engagement across the board.

>> SHIVANEE THAPA: And finally, Minister Tung.

>> KARIANNE TUNG: Thank you.  First, I want to say thank you to the panelists, because I think this session shows the importance of the IGF and the multistakeholder model.  It brings us together on the same floor, able to have tough conversations and to really move forward.

We need to protect children better than we are doing today.  And as a Minister of Digitalization, I'm a tech optimist.  I really believe that technology can help us solve huge societal challenges, help us close the digital gap, and create new opportunities.  But we have to put the ethical perspective first.  We have to put children first.  We have to put citizens first, because it's also about parents and other adults as well.

So, in Norway, we are doing a lot of things.  We have an ongoing white paper on safe digitalization.  We are now working to implement an age limit of 15 for social media platforms.  We work together with Google and Microsoft to put personal privacy first when children use these platforms in school.  And we have banned mobile phones from classrooms.  So we do a lot of things, and we do them together with the companies.  I want to continue to work together, because we can't do this alone.

For me, it is a question of looking back, 10 or 20 years from now, and asking myself: what side of history do I want to stand on?  I want to stand on the children's side.  And I really invite all of you to be on the same side.

>> SHIVANEE THAPA: Wow.

(Applause)

>> SHIVANEE THAPA: We couldn't have ended on a better note than this.  I reiterate: we were engaged in a discussion on a topic which is no longer an emerging risk.  It is too urgent, too complex, and, I believe, too personal to each one of us.  And what we have heard today from this very distinguished panel is very clear: protecting children in this digital age, the age of algorithms, is not just a technical challenge for us.  It is certainly a moral imperative.  And the future of digital governance, as I could glean from the essence of this distinguished panel, must be built with, and not just for, our young people.

Thank you so much to our distinguished panelists for the great leadership and undertakings in your respective fields, and for the value you have accorded to this panel with your very gracious presence.  A special thanks to the Government of Norway and to Madam Minister for this great initiative and for the great takeaways we are all carrying home from the Lillestrøm conference.

I also thank the members of the audience for your presence and your engagement.  With this, I rest my microphone and close the session, as I invite my distinguished panelists to kindly step forward for a group photo.  Thank you.

(Applause)