IGF 2024- Day 2- Plenary- HL 5 The Key to Unlocking Sustainable Energy- Digital Innovations -- RAW

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MODERATOR:  Good morning.  We promise to have a riveting discussion about a very important topic.  My name is Marleni Cuellar, and I'm the moderator for today. 

Today's session is all about protecting children's rights in the digital world.  Now, whether we want to admit it or not, for many of us childhood was distinctly different.  Communication meant landlines.  You couldn't hide any communication you were having, and dial-up Internet would announce to everybody that you were logging on.  So protection mechanisms and safety were very different when we were growing up.  But today's landscape is dramatically different, and the protection of children's rights becomes even more complicated when we are talking about children growing up in a digital sphere.

So I will be joined by an esteemed panel to share thoughts about the current situation, the plans for developing safety mechanisms and, of course, their own interpretation of what needs to be done and by whom.  Some of the things we hope to cover: identifying the main challenges and risks in protecting children's rights in the digital world, exploring technological solutions that can safeguard children's digital rights and the role of education, examining regulatory frameworks and policies, and promoting a comprehensive and collaborative approach among all stakeholders.

So let me introduce you to our esteemed panel for today.

His Excellency Mr. Sofiene Hemissi, Minister of Communication Technologies, Tunisia; Dr. Sarah Alfaisal, member of the Human Rights Commission, Kingdom of Saudi Arabia; Mr. Eugene Kaspersky, CEO, Kaspersky; Mr. Syed Munir Khasru, Chairman of the Institute for Policy, Advocacy and Governance; Ms. Deepali Liberhan, Global Director of Safety Policy, Meta; Professor Muhammad Khurram Khan, King Saud University, Kingdom of Saudi Arabia; His Excellency Mr. Andrey Zarenin, Deputy Minister of Digital Development, Communications and Mass Media, Russia.

Yes, welcome.

So as everyone gets settled, I want to be able to say welcome once again to all of you who are joining us live and all those who are tuned in online.

Child protection is a very important topic to many, many of you who are parents and, of course, those who are simply concerned about the future for our children.

So everyone is settled in.  Let's get started.  One of the first things to establish in this conversation, from both your experience and the work that you do: what is the current digital landscape for children, what does it look like, and what are some of the most urgent threats that they face?  Let's start off with Mr. Eugene Kaspersky.

>> EUGENE KASPERSKY: Good morning, everyone.  Speaking about the threat landscape and what's wrong with kids in cyberspace, and what's different about kids right now: they spend too much time with mobile phones.  Too much time.

And the main difference is that they play too many games, and these games are different.  But from my point of view, well, I have five kids, so when I see them playing these games, I recognize that they are training themselves in different skills.

So when we were kids, we played different games.  Now they play online games.  It's different, but I don't think it's really bad, because it's different skills.  Our parents played different games than we did.  Our grandparents also played different games.  So right now I don't see the games as a major problem; just limit the time kids spend there.

The second thing is they consume too much information.  Is it good or bad?  Again, they train different skills; they train themselves to consume different kinds of information.  The possible problem is that it doesn't get deep into the mind.  It stays on the top and disappears.

And thinking about myself when I was a kid, and about my parents, it was also different.  My parents consumed a different kind of information.  We were kids.  It was all different.  We survived.

So it's a good question, is it good or bad?  Is it positive or negative?

And the definite negative is that the content they consume is random.  It could be dangerous.  The people they contact online, random people, could be dangerous, especially for very small kids.

So this is the major problem.  The kids are in a new cyber age, and when they are in the cyber environment, it's different from our experience when we were kids.

But this is the new world.  It's a new reality.  So it's positive and negative at the same time.  I recognize that the main threat is manipulation: manipulating the kids with information, with online contacts, and with some modern toys.  Physical toys are connected to the Internet, and we are from cybersecurity, so we know that some of these physical toys are vulnerable.  It's possible to hack them and manipulate kids online, not from a mobile phone, not from a computer, but from physical toys which are smart toys.

So, of course, the cyber age is different; it's positive and negative at the same time.  We need to be very careful and understand this: don't stop our kids from using the positive side, from training the positive skills, but at the same time we need to control the content they consume, the people they contact, and the safety of the toys they play with.  Thank you.

>> MODERATOR: Thank you, and for you, Your Excellency Sofiene Hemissi.

>> SOFIENE HEMISSI: Good morning.  I would like to thank you for this kind invitation.  As a matter of fact, this is a very sensitive topic.  Today the digital landscape presents unprecedented opportunities and risks.  Right now the virtual world constitutes a key part of children's learning and education and part of their lifestyle.

We have a lot of challenges and a lot of opportunities.  Today the digital world is an opportunity for children to express themselves and to prove themselves.  And this is a huge world.

It's an opportunity to overcome the divide, but at the same time it is a risk to their emotional, psychological and mental health.  Among the key challenges are manipulation, abuse, bullying and sexual abuse, and luring children into malpractice.  This is very disturbing for us.

We have as well some risks related to the protection of children's privacy.  Today, games and electronic platforms collect a lot of information, and we don't know the real purpose of such information or where it ends up.

Another key risk for children is the digital divide between different segments of society.

We have some children who are well connected to e-learning platforms, while the majority of children are underprivileged, and it is a huge challenge to ensure equitable opportunities for children across the country.  In addition to such risks and threats, the erosion of privacy is very worrying: the erosion of identity, of cultural identity, and of people's privacy, especially in Islamic and Arab societies, which are built on specific pillars.  Such erosion can put the cultural identity of our people at risk, especially in cyberspace and especially for children.  As for the gaming platforms, I think they were not designed to take cultural identity into consideration.  I think we should work together across countries; no one country can work alone.  This requires cross-border coordination and concerted efforts between civil society, the Private Sector, Governments, families and the educational sector to set fitting frameworks and put the rights of children at the top of the priorities.  Thank you.

>> MODERATOR: Thank you, and there we see some similarities, acknowledging that there is much good for children but, of course, major concerns as well.  Let's move to His Excellency Andrey Zarenin.  In your opinion, what are the most urgent threats that children face online?

>> ANDREY ZARENIN: I will be speaking in the Russian language.  The interpretation was great yesterday, thanks to our interpreters.  This forum is very important for us; we were preparing to participate in the next IGF forum as well, but the decision was taken not in our favor.  As we said, the digital world exerts more and more influence on our children and opens more horizons for education, creativity and communication, but simultaneously, in spite of so many opportunities, it poses lots of risks.

Today we can see deep changes in the digital landscape that influence the psychological and social behavior of children and teenagers.  One of the most important challenges is so-called virtual masking.  Here we see lots of fraudsters using deepfakes, creating a false sense of trust and manipulating children's behavior, and that not only poses risks but also causes psychological trauma.  These virtual manipulations exploit kids' trust and create false identities in the digital world.  Apart from that, teenagers come across other risks: cyberbullying, harmful content, fraud, and being drawn into terrorist communities.

We are also coming across the problems of trafficking and sexual abuse.  All of these threats can be devastating for the emotional and physical health of our children.  For example, according to the data of the All-Russian Cyber Rights campaign, lots of children go through this, and lots of teenagers cannot detect fraud in online gaming.

It's very important to form the right digital literacy skills so that children can detect fraud online.  One of the problems in the cyber sphere is the gap in skills: sometimes parents do not have the skills to discuss the threats the digital world poses, and our research shows that many parents are not confident in their skills to monitor the online activity of their children.

That gap is growing these days, and it erodes children's trust and makes them more vulnerable.  Considering all of that, it is very important to create interstate cooperation in which not only governmental bodies and schools participate; families also have to support their children in mastering the digital world.

And I would say the problem becomes more and more critical when we are dealing with artificial intelligence.  We discussed it yesterday: as social media is used more and more widely and other tools develop, these new technologies pose a threat to children's psychology.  I believe we need to create a safe environment that helps children adapt to the digital world and minimizes those risks.

And this is why collaboration among all stakeholders, governmental bodies, families, educational bodies and the Private Sector, is very important to develop a strategy to protect our children in the cyber world.  Thank you for your attention.

>> MODERATOR: Thank you very much.  A critical point that you brought up there about parents not being fully able to teach their children the protection mechanisms that are necessary.  Let's hear your thoughts on this, Mr. Khasru.

>> SYED MUNIR KHASRU: I would like to thank the UN authority for the kind invitation.  How many of you, in the first ten minutes after you wake up, don't go near your mobile phone?  Can you please raise your hand?  One, two, hardly three, four.  Zero.

Second question, for those of you who have children: how many of you feel very anxious that they are in some world where they are exposed to a lot of risks and you are not able to keep a proper watch?  How many?  Can you raise your hand?  Okay.

The reason I was pushing the boundaries is that basic human instinct is something we have to manage.  We cannot fight against it.  So I would tend to push this hypothesis: we have to train, educate and inspire the children, and in my opinion they are very smart.  Even this year, 35 to 40 million people aged 12 and below will be using the Internet, which is 12 million more than people in the age group of 12 to 17.

So what does it tell us?  It tells us that the primal human instinct a child has is very strong, even for parents to manage.  So I would like to propose that we first understand what we are dealing with here.  I have been one of those who have said we need to manage this, we need to find better ways to educate them, because it is also true that children exposed to cyberbullying are twice as likely to commit suicide, and we have seen cases in America a few years back where a teenager committed suicide over online games, even while he had the game headset on.

So it's a tricky situation, and 60% of children interact with people online without knowing their identity.  Are there risks there?  Yes.  How do we handle it?  That's what this session is about.

We also have to look at the benefits it brings, because for a lot of marginalized communities, children on the fringes, children with disabilities, for many the Internet is a lifeline.  Many people monetize with very little investment.

I will give you a simple personal example.  A few months back, the cook in our house was supposed to prepare a dish, and she said, I have never done it.  In the evening, when I saw the dish on the table, I asked her, how did you do it?  You said you had never done it.  Her 6-year-old daughter had picked up a phone, gone to YouTube, and taught her mother how to do it.

So this is the age we live in.  I'm sure there are many, many positive stories going around, and as His Excellency the Minister mentioned, and as will come up later, the key is multistakeholder engagement.

Our first thought is that children are just five-, six-, ten-year-olds.  No, they are the future netizens.  They have more power in their hands, and this simple device could change everything.

My opening point, and I will give you many, many examples: South Korea tried to control gaming time in 2011.  It failed.  The European Union tried to put parental guidance in place.  It failed.

That means kids are very smart.  They can outsmart their parents.  So we have to find a better way to inspire and encourage them to be better netizens, and digital literacy will be a key thing; in its absence it will be difficult for them to navigate these territories, maximize the benefits and minimize the risks.  Those would be my opening remarks.  I'm looking forward to more engagement with my fellow panelists.  Thank you.

>> MODERATOR: Yes, of course, and digital literacy sounds like a resounding theme when we speak of protecting children online.  We understand that children interact in the digital space in many ways, but one of the primary ways is through social media.  So, Deepali Liberhan, as the Global Director of Safety Policy at Meta, we have seen continuous development of safety mechanisms for children online.  What are the main challenges and risks in protecting children's rights online?

>> DEEPALI LIBERHAN: Thanks for the question, and it's a pleasure to be here.  At the outset, I want to say that we want young people to use our platforms to connect with their families and friends, to be able to explore their interests without having to worry about being unsafe or being subject to any kind of inappropriate behavior.

And that's why at Meta we have worked with experts, and consulted parents and teens, to make sure we are building safe and age-appropriate experiences on our platforms.  Some of the risks and challenges we face, which the other panelists have also talked about, are essentially threefold, and they are echoed by parents as well.

The first is content.  Parents are worried that teens online are exposed to inappropriate content.  The second is what you would call contact risk: parents are concerned that teens may be exposed to unwanted interactions that put them in harm's way.

The third, as my esteemed fellow panelists also mentioned, is that parents are really concerned about the amount of time their teens are spending online.  So based on these concerns and other risks that we see online, Meta has worked over the last couple of years to build and develop features and tools to address these risks.  It's about many things: we want robust policies that say what is and isn't okay to share online, and we proactively enforce those policies; for the majority of violating content, we are able to remove it even before somebody reports it to us.

We have built over 50 tools and features, including parental supervision, because we have heard loud and clear from parents that they want to be involved in their teens' lives, and we have heard that some don't feel they have the skills to have these conversations with young people; therefore, we have worked with safety partners to make resources available.  Bringing all of this together, we launched teen accounts in September this year, and we are rolling them out globally.

With teen accounts, essentially all teens on Instagram are placed into a built-in protected experience which addresses a lot of the concerns that I just talked about: what content you are seeing, who is able to contact you, and being able to spend meaningful time online.

I will quickly go through some of the default protections that are in place.  Teens are defaulted to a private account.  We also apply the strictest messaging settings, so they are not exposed to unwanted interactions from adults.  We apply the strictest content control settings, so there is limited exposure to sensitive content; this goes beyond what we already have in place with our Community Standards, which deal with violating content.  We also heard from parents that they were worried about the time teens were spending online, so on Instagram, teens will get a reminder if they have spent more than an hour online, asking them to leave the app.

We have also added an automatic sleep mode, so between 10:00 p.m. and 7:00 a.m. all notifications are muted.  These are some of the things we have done with the launch of teen accounts to address the risks we have talked about.

We have already rolled out teen accounts in the U.S., U.K., Canada and Australia, and we are now rolling them out in the rest of the world.  One of the most important things to note is that with the rollout of teen accounts, we have engaged with and heard from parents, we have heard from teens themselves, and we have worked with experts in this field to make sure we are building an appropriately safe experience for young people on our platforms.

>> MODERATOR: Professor Muhammad Khurram Khan, let's get your thoughts on this.

>> MUHAMMAD KHURRAM KHAN: The topic of child online protection is close to my heart because I have been working in this area for close to ten years.  Before we dive into the challenges and try to understand how they are evolving, we need to understand how technology is transforming for children and will keep transforming in the future, especially in how children interact and engage with it.

The first important thing is that children are using the Internet more than ever before, so new challenges are emerging online every day.  For example, our children are accessing websites and entering virtual environments, and all of these things are blurring the lines between the physical and the digital realm.

And the most important thing for me, in order to foresee the future of these challenges, is how technology will permeate into people.  Now we interact with technology; in the future, technology will permeate inside us through brain-computer interfaces.

We will have chips implanted in our brains, and I'm afraid of what the situation will look like in the future, because children already get a lot of mis- and disinformation from the platforms; when platforms can embed such information directly into their brains and really want to control them, that is a very challenging prospect, and we have to be vigilant about it.

So there are two kinds of risks I can categorize.  The first is active risk, in which a child is directly the victim of some kind of abuse.  The second is passive risk: children put their photographs online and do not know what is going to happen to them.  These photographs can, for example, be picked up from social media by data-harvesting groups, which can use them to create deepfakes of the children, and the children don't know what happened to them; nobody knows.

And this is becoming a very pervasive challenge because generative AI can do anything.  Anybody can create deepfake videos and images of any person.  So we have to look at how we can address both the active and passive aspects, and how we can build policies.  As our distinguished panelists said, we have to have a multilateral and multi-pronged approach to address the challenge, and we have to have governance of the platforms so that nobody goes beyond their limits.  That is very important.

>> MODERATOR: And Dr. Sarah Alfaisal.

>> SARAH ALFAISAL: Thank you.  I'm very pleased and honored to be with you today.  I think one of the main challenges in the cyber world is the violation of privacy, and also all of the attacks that happen to children, such as cyberbullying.  From a human rights perspective, we need to focus on the right to privacy, because the right to privacy is a major right; it is included in the Universal Declaration of Human Rights, and it is also included in the Convention on the Rights of the Child.

So when we lose this fundamental right to privacy, children may face all of these difficulties.  I think that if we focus on this right and try to promote solutions, you know, our duty is to protect and promote human rights.  So I think we need to focus, number one, on these rights.

>> MODERATOR: All right.  His Excellency Sofiene Hemissi, on the challenges and risks of protecting children online, tell us about your perspective.

>> SOFIENE HEMISSI: As my colleagues have mentioned, there are formidable challenges associated with child protection on the Internet, among which is the lack of legislation and policies at the level of every Government, every country, and at the international level.  It is true that many countries have enacted laws and policies regarding child protection in cyberspace; however, these legal frameworks and policies remain inconsistent, and there is a lack of a unified framework that we could all rely on.  Among the other challenges, as was mentioned, are abuse and exploitation on the Internet, as well as the issue of data privacy and the mechanisms for monitoring this data at the level of each country and internationally, specifically with the increase of AI-related solutions.  The analysis of this data represents an opportunity and room for more development, but it also comes with many risks for children.

There is also the content inappropriate for these lower age brackets and for teenagers, exposing many of them to a great deal of issues that can amount to suicide, as was mentioned earlier.  I would also like to address the issue of the digital divide between the various brackets of society: there are youth and children who have all of the opportunities and means, while on the other hand there are millions of children who cannot access the networks, especially in some African and Asian countries where the access percentage remains very low, and I think it is our shared responsibility to provide equal access.

I would like to emphasize the erosion of cultural and societal values that we are witnessing, reflected in the Internet content that has been provided over the past few years, as well as the phenomenon of social media addiction, which is impacting the safety, sanity, psychological health and physical health of the youth.  In order to address all of these formidable challenges, I think it is necessary to adopt a comprehensive approach that includes all parties: Governments, civil society and content producers.  I think this is the only way forward to come up with a unified framework to protect the rights of children, especially in an era where we witness challenges that we have never witnessed in the past.  We also need to protect their privacy and enhance their wellbeing in order to create an equitable digital future that includes everyone.  Thank you.

>> MODERATOR: Thank you.  Now, one of the things I think we have all clearly identified is that there is a level of literacy that has to be established.  There are many stakeholders in the protection of children.  We talk about Governments, regulatory bodies, parents and the educational sector.  Dr. Khan, from your perspective, what are some of the technological solutions that can help safeguard children online, and what role does education play in that?

>> MUHAMMAD KHURRAM KHAN: That's a great question.  There is a famous saying about technology: if you think technology will solve your problems, you do not understand technology and you do not understand your problems.  This is very true when it comes to safeguarding children online, because when we give a child access to the Internet, it means that, on the other hand, we are giving the online world access to the child.  So we need to have safeguards implemented in the platforms, and, for example, there should be certain protocols we follow.

So there is no single solution which can cover all of the problems we have for children.  As I said the first time you asked me, we have to have a multi-pronged and multilateral approach.  What is that about?  First, we have to have a framework or an approach that operates at different levels.

For example, the strategic level, the tactical level and the operational level.  When I talk about the strategic level, I'm talking about regulations, laws, governance and compliance; this is what we have to build.

And the second is tactical.  Tactical means we should have more standards and specifications for the tech platforms when they build their systems.  For example, when they build a system for children, there should be child-appropriate verification, age verification and age assurance protocols.  And there should be operational approaches, meaning the tools we use for the protection of children, for example parental control tools; and on the other hand, we should have awareness and education.

That should be inculcated into the children's curriculum, and not only for children: digital parenting for parents, and training for educators in the schools and in our environment.

All of these things are very important to building a technological solution.  It's not just one thing; we cannot rely on technology alone.  Technology sometimes fails, and technology becomes obsolete.  So it is important to have a multi-pronged and multilateral approach to build a system that can protect our children while they are online.

>> MODERATOR: Mr. Kaspersky, from a cybersecurity perspective, you have seen the multitude of threats that people are exposed to daily, and perhaps in that experience it's challenging to imagine children being able to decipher what puts them at risk and what doesn't.  When we talk about technological solutions, on which end are they more important: the user's or the implementer's?

>> EUGENE KASPERSKY: I think it's from both.  Technically speaking, it's possible to build these parental control systems you just mentioned, but unfortunately, well, fortunately, it's good news, the kids are growing up and getting smarter.  So they are able to disable these tools.

Unfortunately, mobile phone operating systems, and I'm sorry for getting into the technical details, don't guarantee that all applications run forever.  So it's possible to find tricky ways to disable some applications.

And when the kids are 13 or 14 years old, they are smart enough to find these tricks and disable any kind of protection.  So when we speak about mobile phones, for 10- and 11-year-olds it still works.  When they are older, when they are 16, parental control works in a different way: the kids control the parents.

So speaking about the technical measures, they work up to a certain age, but maybe the good news is that this makes kids find ways to bypass the protection, training their minds in how to behave when facing challenges.

The other side of the story is that you have to protect them from the content side, from the Internet.  You have to find the wrong information, the abusive information, the forbidden information on the Internet; that's technology.

And here we also need the help of the regulators, to require Internet services to control the traffic they provide, and of law enforcement, to find the bad guys who generate this content, this abusive content that tries to manipulate kids.  In this case we need the strongest international cooperation.

Unfortunately, given the geopolitical situation, some countries don't talk to each other, and this just gives more opportunities to the very bad guys, who exploit the fact that the Internet doesn't have borders.  It is easier for them to commit any kind of crime, including child abuse, and stay unrecognized, absolutely unrecognized.

So my final point is that we need stronger international cooperation to find all kinds of cybercriminals, and especially child abuse criminals.  Thank you.

>> MODERATOR: Dr. Sarah Alfaisal, one of the things I wanted to say in setting the backdrop: we are talking about children's rights.  The establishment of the United Nations Convention on the Rights of the Child took many things into consideration, ultimately the protection of children and ensuring they reach their full potential.

This was created before there was a digital realm, and obviously things get more complicated now.  But the structure of how different entities come together to protect the most vulnerable citizens in our communities remains the same.  So when we transfer this idea to protection in the digital realm, how do you see that collaboration playing out?

>> SARAH ALFAISAL: If you allow me to switch to Arabic.  In answer to your question, the collaboration between Governments and civil society organisations is an important and essential element in providing protection to children.  There are many modalities, among which are the exchange of expertise and raising the level of awareness, in order to achieve important outcomes.  In that regard, we all saw the Riyadh announcement yesterday, which included many important elements: the enhancement of digital inclusion, the role of AI in sustainable development, making the most of AI in innovation, and enhancing the governance of AI, which is an essential point related to child protection in the cyber world, and the management of the potential risks of AI through the creation of safe, sustainable systems.

This cannot happen without the collaboration of the three partners: Governments, the Private Sector and civil society.  They need to collaborate in order to reach comprehensive, sustainable solutions and achieve the safety we aspire to for children in the cyber world.

>> MODERATOR: Dr. Syed Munir Khasru.

>> SYED MUNIR KHASRU: I think the point that most speakers, in one way or another, have expressed is multistakeholder engagement.  In my opinion, it starts with the tech companies, because today Google, Meta and Microsoft have a cumulative worth of almost $12 trillion, which is more than four times the combined GDP of the African continent.  So I believe they have a more active role to play.  With all due respect to my colleague from Meta, you have been doing content moderation, but in my opinion it falls far short of what is required, and from what I have observed it is more reactive than proactive.

We can have different opinions, but while these transitions happen, a lot of children are getting affected.  One of the things to keep in mind is that this has to be a bottom-up approach, not top-down.  It is difficult for anybody to go to that nano level.  It starts with the family, then society, then community, then nation, then region.

Before this meeting I was speaking to His Excellency about what can be done regionally.  And it's my impression that technology always leapfrogs policy.  In any part of the world, whether it's an authoritarian regime or a democracy, it takes six months to two years to pass a law.  By the time you pass the law, the technology has moved to the next level.  So you are always catching up.

So I think we are heading towards a situation where the UN probably will have to step in, and all of the Member States, willingly or reluctantly, will have to give the UN a mandate as some central force which adopts these laws that other countries then follow.  The risks we are facing now, when AI kicks in within the next decade, will become a very, very different situation.  We now talk hypothetically, but practically, starting with grading students' work, what is original and what is a copy?  It will be a murky world, so for 198 countries to cluster around, it will be a big mess unless there is some central force.

So the UN probably will have to scale up its operation.  That will be my answer: from the bottom up, how to scale it up.  And I think tech companies have a very important role to play.  To be honest with you, they are more powerful than Governments.  They are spread around the world.  They have access to our personal data: what we eat, where we live, what our favorite color is.  That is more information than the FBI has, more information than agencies have.  So the onus is on them.  They are doing things, but I think they need to do more.  Thank you.

>> MODERATOR: His Excellency Andrey Zarenin.  Let's talk about the collaboration.

>> ANDREY ZARENIN: Many thanks.  We heard a lot about the cooperation between Government and civil society.  We have all studied Fathers and Children, which is about the gap in the perception of the world between the generations.  I'm sure that when the author was writing the novel, he couldn't comprehend how much this gap would change in the age of technology and in today's digital world.  Technologies play an essential role and, of course, a great role in providing digital protection for children.  In Russia, we work with Internet companies and Internet providers, and they should adapt to current threats.  It was mentioned that children of many ages can navigate around the barriers, so in Russia, on the portal of Government services, we have special departments for child protection in cyberspace which tackle these problems.

We have recommendations for parents there, and together with our partners in the alliance working on child protection on the Web, we are planning to do a mailing from the Government services portal.  No current web filter can replace human communication.  We have also developed a charter for the protection of children, with principles protecting the rights and identity of the child; we highlight the protection of identity and the privacy of data, which is very important in the world of technologies.

I would also like to draw attention to our united responsibility to protect children.  As adults, it is our task to show children how wonderful and diverse the world is, and using gadgets and the Web is one of the tools for learning about this world.

And, of course, in our ministry we have a programme of cyber hygiene, and the goal of this programme is to explain to children, in simple language, the rules of behavior on the Web.  Over ten million children and adults, including four million children, have participated in different projects related to cybersecurity.  We work with nonprofit organisations and have run many events related to child protection on the Web: the Lesson on the Digital World, the Digital Dictation.  We support such initiatives, we are always open to productive interaction, and we always work to protect children in cyberspace.

>> MODERATOR: Thank you for the update on what is taking place in Russia.  Now, Ms. Deepali Liberhan, let's talk about the Meta point of view.  How do you see the collaboration playing out between civil society and Government?

>> DEEPALI LIBERHAN: At Meta we have a multilayered, multipronged approach to safety, where we think about having the right policies, features and tools, and working with parents, experts and young people to build safe, age-appropriate experiences.  Over the last few years we have gone beyond that, because we know a lot of bad actors don't restrain themselves to one platform; they move across platforms.

I think this is why it's really important to underscore collaboration between industry, civil society and regulators to be able to address some of these harms at scale.  I want to give two examples of what we are doing now.  One is Project Lantern.  It is run by the Tech Coalition, and all participating companies have the opportunity to share signals about accounts and the behavior of those who are violating child safety policies.  Now, this goes beyond just Meta.  It's available to all participating companies, and it helps all of these platforms do their own investigations, so we are able to address these harms in a scalable way, because we know predators will move from platform to platform.

The second thing that I want to mention is StopNCII.org, as well as Take It Down, which is run by the hotline; these are another way to address these harms at scale.  All participating companies are part of this, and anyone who is afraid of their intimate image being shared can share a hash with StopNCII.org or Take It Down.

And companies like us will receive those hashes, so when that content is actually uploaded, we are able to remove it.  The reason I'm giving these examples is that we often hear that technology companies are not doing enough, but these are some of the ways we are coming together to address these harms in a very scalable way, not just at a Meta level, but at a cross-platform level.  It is also useful to have these discussions, because education about these features, tools and services is really important as well.
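[Editor's note: the hash-sharing flow described above can be sketched in a few lines of Python.  This is an editorial illustration only, not Meta's, StopNCII.org's, or Take It Down's actual implementation; the `SharedHashList` class and its method names are invented for the sketch, and real services use perceptual hashes rather than SHA-256.]

```python
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest used as the image's fingerprint.

    Production systems use perceptual hashes (e.g. PDQ or PhotoDNA)
    that survive resizing and re-encoding; SHA-256 only matches
    byte-identical files, but it keeps this sketch simple.
    """
    return hashlib.sha256(image_bytes).hexdigest()


class SharedHashList:
    """A hypothetical cross-platform list of hashes of images to block."""

    def __init__(self) -> None:
        self._blocked: set[str] = set()

    def submit(self, image_bytes: bytes) -> None:
        # The person submits only the hash; the image itself
        # never leaves their device.
        self._blocked.add(fingerprint(image_bytes))

    def should_remove(self, uploaded_bytes: bytes) -> bool:
        # Each participating platform checks new uploads
        # against the shared list before the content spreads.
        return fingerprint(uploaded_bytes) in self._blocked
```

The key privacy property of this design is that only fingerprints, never the images themselves, cross platform boundaries.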

So what is the role of educators?  What is the role of parents?  What is the role of regulators?  And how can we as a community come together to address these harms and identify what the gaps are and work together to address those?

>> MODERATOR: Picking up on points that have been discussed so far, one of the things that still sticks out to me is that when we talk about educating our children on how to stay safe online, we are placing the expectations, quite frankly, on a generation that is not up to date with the threats that exist.

In my own work, when we talk about media and information literacy, we are constantly teaching adults how to detect disinformation and misinformation and how to understand deep fakes.

And so when we talk about these persons then having to teach children about those risks, and now there are other manipulation tools that exist specifically for children, how do we pass on the responsibility when they are starting from a place of knowing less than the person they are educating?  Dr. Khan, do you want to start us off?

>> MUHAMMAD KHURRAM KHAN: Absolutely.  As I mentioned, technology is not a panacea or a solution for everything.  So regulations and governance are very, very important, especially when we talk about AI, which, by the way, is just the narrow AI we have now.  We are not yet talking about artificial general intelligence and artificial super intelligence.

When these technologies and new computing interfaces arrive in the future, the challenges will be at an unprecedented level.  So what I would say is that child online protection is a global problem, and it needs a global solution.

And what is that solution?  The western world, the developed countries, already have the skills, resources and technologies, but what about the rest of the world, the least developed or developing countries?

There should be some kind of inclusiveness in the safety and protection of children, not only in the developed world but in the rest of the world as well.  So what we need to do is have cooperation under the United Nations, the ITU, the G20, the G7 and many other multilateral organisations to build the curriculum, to build the tools and technologies, and to build the policies, governance and foresight of the technologies, because, as was mentioned, we are reactive to technology.

That is not a solution.  We have to be predictive about technologies, as I'm mentioning.  We have, for example, quantum computing, which is changing everything.  What will be the impact of these kinds of technologies on children?  It has to be studied, and it has to be communicated to all of the stakeholders who are building the technology and solutions.

>> MODERATOR: I think Mr. Kaspersky said earlier, let's remember there is good that comes with the bad.  I know for parents who are probably watching, as they hear about the potential threats they may be unaware of, the inclination is let me protect my child by limiting any access whatsoever.  What's the balancing act here?

>> EUGENE KASPERSKY: That's a good question.  You have to find a good balance.  What I do with my kids is that the limit is flexible, and it depends on their school results.

So if they perform better at school, they have more time on the Internet.  If they fall below the acceptable limit, I exchange their smartphone for an old-style Nokia.  That's it.  And speaking about education on cyber threats and wrong information, well, I'm very sorry, I have a completely different opinion.  I think that education for adults is more important than education for kids, because the danger from non-educated adults is greater; they can make much more dangerous mistakes than the youths.

We as a private company do a lot of cybersecurity education, for any kind of audience: we do it for professionals, for students, for kids, even for very small kids.  We have even published a picture book about cybersecurity for very small kids.  We do a lot of this, but my experience is that kids learn much faster than adults.

>> MODERATOR: They outsmart them very often.  So this is an open question, if anybody else wants to share.  Thoughts?

>> SYED MUNIR KHASRU: I will add to what he said: one of the things we are overlooking is the role of parents.  One of the things I fear in this generation is the proportionate loss of appreciation and understanding of the human component.  No matter what AI or advanced technology we embrace in the coming decades, human compassion, empathy, the human touch will be the number one factor.  In medicine you may have the best AI-driven medical facility; at the end of the day the patient wants to hold the hand of a nurse or doctor.

I can see it right and left.  I will give a simple, basic example: when kids accompany their parents, even to social events, I see kids on mobiles and parents busy, and I have seen very few parents even tell them that you need to interact and engage with the people who have come to your home, very basic things.

So what is happening, I fear, is that we are unleashing a generation that is extremely talented and very well informed.  But imagine our forefathers: whatever cerebral capacity they had, in that same garage you are now parking four cars, because we are overflooding people with information.  There is information overload, but God has not changed the composition of your brain.  In the same garage where you used to park one car, you are trying to park four.  So people are very stressed.

One of the reasons, you may have noticed, that many of the social media-led movements have not always been successful, why?  Because they lack a proper human face.  If you look at the last 10, 15 years, even the political landscape, why do so many young people come to the street, put their lives in the line of fire, and then fail?  That's the question we have to answer.  Where is the missing link?

I think parents have an important role to play there.  We have talked a lot about children, absolutely, but we must not overlook the important, overarching role parents have to play.  Buying a kid a smartphone and giving access is not enough; the value of bedtime stories is still as important as it was 20, 30 years back.  My concern is that in the drive and natural inclination toward technology, we are sidestepping basic human factors which continue to shape how we make the future citizens.  Thank you.

>> MODERATOR: Deepali Liberhan, before we move into our closing statements, I wanted to broaden the conversation back to technology.  Give us some insight, because earlier it was said that technology companies seem to be reactive versus proactive, but it seems that that's the situation we find ourselves in: the technology comes out, there is rapid uptake, and the manipulation comes along with it.

But is some of the emerging technology going to provide helpful tools to guide or ease the protection mechanisms for children, verification of age, for example, or using AI to detect some of the deep fakes?  Is this also an opportunity moment for us?

>> DEEPALI LIBERHAN: I think that is a good question, and it is an opportunity.  Let me give you an example.  I have been with Meta for over a decade.  When I joined Meta we had community standards which were clear about what is okay and not okay to post online.

But we were very dependent on user reporting to tell us that content is bad content and we need to take it down.  Over the years we have invested heavily in proactive technology, so that now we are able to remove the majority of such content before it's reported to us by anyone.  If you look at our latest transparency report, where we publish these numbers, we were able to remove, I think, more than seven million pieces of content for child sexual abuse material, more than seven million pieces for harassment, and about five million pieces for suicide and self-injury, and the majority of this content we were able to remove before anybody reported it to us.

That's the work of AI.  And we are continuing to work with proactive technology in the areas you have mentioned, whether it's verifying age, you know, what signals we can look for to identify when somebody is under the age of 18, or continuing to see how we can use this technology to be faster and more effective.  I do think technology has a very important role to play in addressing some of these harms.

And we have been using it a long time, and we are going to continue to build on it to address these issues that have been raised today.

>> MODERATOR: Well, there is a lot more that we can discuss on this particular topic, but I do want to give our panelists the opportunity to give their closing statements.

>> Thank you, it was a very enriching discussion, and I know that in the digital world we have one truth, which is that the protection of human rights, particularly children's rights, is not an option but a shared responsibility.  The child has the right to a safe digital space which ensures growth and provides opportunities for skill development, a space to express themselves and their talents.  To achieve this end, it is not the responsibility of Government or civil society or private companies alone, but a shared and joint responsibility without any exception.  The Government is responsible for framing policies, because we cannot prevent these harms by conventional methods alone.  Civil society has a greater role to play in awareness and education, and private companies play a greater role in developing and creating content appropriate to children and to their identity and character, as well as comprehensive awareness appropriate, as I said, to the complexity.  Cooperation between international organisations and states is also very important: no single country can achieve this end without international cooperation at different levels.

And I hope that we create a very safe digital space and landscape for all.  Thank you.

>> As we conclude our session, I think we have to remember the key points we have discussed and the actions we need to take to move forward.  I think that together we have the power to make a great impact.  It's all about commitment.  Thank you.

>> EUGENE KASPERSKY: Abusive content on the Internet, that's bad.  Manipulating kids, that's really bad; we need to fight that.  Kids spend too much time with mobile phones, but the previous generation was spending too much time playing computer games.  The generation before that was watching too much TV, and before that, reading too many books.  So I am optimistic.  We will fix the problems.  We will save our kids, and they will be a happy generation.  And they will have kids too.  And they will face the same problems.

(Applause).

>> SYED MUNIR KHASRU: I would like to conclude with what IGF stands for.  I is for inspiring the children to learn, to navigate the Web in a manner that adds value to their lives.  Then comes the G, the governance part, where we have to do a better job of making it as safe and secure as possible.  Then comes the F, to free the human spirit.

If you go by the UN Convention on the Rights of the Child, IGF is all about child rights.

And another thing, since we are having a very free and frank exchange of ideas, I would like to draw attention to something, because otherwise this would be incomplete.  Everybody in this room must recognize that any child anywhere in the world, whether their life is lost, endangered or sick, should have equal value of life.  That means a Rohingya child born in a refugee camp in Bangladesh or a Palestinian child struggling for life, all have equal value.  And I must say, in my opinion, many of the social media companies are not being just, fair and equal.

And I do not believe they should be dragged into bigger geopolitics.  They should be like the United Nations, where every country, every citizen has a voice, because I have seen many people deserting social media in recent times, which should be a wake-up call.

So people are smart, children are smart, youth are smart, teens are smart, and we must understand every life, every child matters.  Thank you.

(Applause).

>> DEEPALI LIBERHAN: I want to close by saying we are grateful for the opportunity to be here, because these discussions help inform the work we do in the trust and safety space.  I want to thank the panelists for their very clear and helpful comments.

The other thing I would close by saying is that at Meta we have adopted a best interests of the child framework, guided by the UN Convention on the Rights of the Child, when we are thinking about building products and features for young people.

I won't go into all of the considerations, but two important ones are: first, engagement with the families and teens who use our apps, which is why we are going to continue to work with parents, teens and experts as we think about building for them; and second, building safe and age-appropriate experiences.  Everything we have done, including teen accounts and working with other organisations, whether it's Project Lantern, StopNCII.org or Take It Down, is in furtherance of making sure that we do all we can to keep young people safe on our platforms.  But it is a collaborative approach.

And there is a role for educators.  There is a role for civil society.  There is a role for regulators as well.

And I think it's really important to have more opportunities to come together and develop those roles.  I also think it's important to have legislative solutions.  We have proposed a legislative solution where we think that age assurance at an app or OS level, with parental consent at that level, really will make it easier for parents to navigate their children's online experience, minimize data collection, and address some of the age assurance concerns that our stakeholders have raised.

And I think it really is a multistakeholder community approach and we need to come together to address a lot of these issues as we have been doing.

>> MUHAMMAD KHURRAM KHAN: Saudi Arabia has two global initiatives: one on child online protection and one on empowering women in cybersecurity.  In my opinion, child online protection should be woven into the fabric of our society, so it should be important, and we should have the specifications, protocols, formulations and standards, which we are currently lacking, to guide the social media platforms and online platforms.  Big organisations like Meta have the resources, right?  But what about small startups and small companies?  They do not know about these things.  If there are specifications, standards and protocols, they would build those things into their systems from the start.  That will ensure that children remain safe and sound and their wellbeing is protected while they are online.  Thank you very much.

>> ANDREY ZARENIN: I have a very responsible mission in my closing word because I am the final speaker, so I will try to be positive.  I would like to stress that a secure legislative environment for children is also a social challenge.  As we have already discussed, without the cooperation of families, business and society, we cannot build an effective environment where our children will feel safe and protected.

From my point of view, the Internet, as we have discussed, is just one of the windows to this world, and the key point is human communication.  This is trust between children and parents, and no filters or technologies can replace this actual dialogue.

And from my point of view, we need to focus on comprehensive solutions that unite education, technical protection and, in some situations, psychological support.  We in Russia are ready for international cooperation.  We are ready to share our tools and our educational programmes, and to cooperate to protect our children.  So let's create a safe and inspiring environment for our children, and let me conclude with this phrase: our children will be smarter than us.

(Applause).

>> MODERATOR: In some cases they already are.  So with that, I want to say thank you to all of you for joining us this morning, and to those of you who took the time to join us as we had this conversation.  The resounding theme here is the recognition that there has to be a multistakeholder approach to ensuring the protection of children in the digital realm.

Just up here we have wide representation: the Private Sector, education, Government, civil society organisations, and the technology organisations as well.  So this is perhaps one of many steps to be taken to formalize what the plan of action would be, but I do think the point that you made, Mr. Kaspersky, is probably most important.

We will survive this too.  But all of our children are important, and we just want the best for them as they interact in the digital world.  Thank you all for joining us.  And enjoy the rest of your day.

(Applause).