IGF 2025 - Day 00 - Press Conference - Dynamic Coalition Collaborative Session (Raw)

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> JUTTA CROLL:  Hello, everybody, welcome to the Dynamic Coalitions' first cluster session.  My name is Jutta Croll.

Dynamic Coalitions were set up as a format at the IGF 20 years ago for groups and individuals.

Nowadays we have 32 Dynamic Coalitions.  If you are interested in their work, just come to the booth in the IGF Village, which is positioned directly opposite the Plenary Hall entrance.  Today we have six different Dynamic Coalitions here, and we want to showcase some of the work they are doing with regard to emerging technologies, so we are talking about the future of the Internet.  Having said that, I want to introduce you to the first panelist.  We will introduce them one after another as they are about to speak.  To my left is Dino Dell'Accio, co-lead of the Dynamic Coalition on Emerging Technologies, so he is in the perfect role to speak about today's issue.  I will give you the floor, Dino.

>> DINO DELL'ACCIO:  Thank you very much.  I am happy to be here to share the experience of the Dynamic Coalition on Emerging Technologies, as well as the contribution that, as the CIO of the United Nations pension fund, we have been making to the work of the Dynamic Coalition and in turn to the mandate and mission of the IGF.  Very briefly, to introduce the work of the Dynamic Coalition on Emerging Technologies and its meaning: we recognize that when we are dealing with emerging technologies, we are also dealing with emerging standards.  As we align ourselves with the principles, values, and mission of the United Nations overall, as well as with initiatives such as the Internet Governance Forum, the Global Digital Compact, and the UN Secretary-General's Roadmap for Digital Cooperation, we want to make sure that emerging, innovative, transformative technologies are used in a responsible and ethical manner.

And here is the mandate and mission of the Dynamic Coalition on Emerging Technologies: to support, coordinate, and document tangible and concrete examples of where these emerging technologies are used, and how we can determine whether or not that has been done in a responsible and ethical manner.  Thank you.

>> MODERATOR: Thank you, Dino, you gave us the perfect start for our session.  I'm now turning to Surabhi, representing the Dynamic Coalition on journalism.  I'm pretty sure you are not only writing about these issues as a journalist, but also have the background to explain what the DC is doing.

>> SURABHI SRIVASTAVA:  Thank you, Jutta.  It's nice to be here with fellow panelists.  I am Surabhi Srivastava.  I work at RNW Media, a media development organisation based in the Netherlands, and I am representing the Dynamic Coalition on the Sustainability of Journalism and News Media.  In the DC, civil society organisations are concerned about the impact of technology on the news media and journalism landscape globally, nationally, and regionally.  We are also very much concerned about the impact that technology and Artificial Intelligence has had and is having on freedom of expression, on censorship, on the spread of mis- and disinformation online, and about the intersection of digital policies and regulatory frameworks with the state of media globally.  In terms of our work, we produce reports, we gather evidence and look at data shared by members of the Dynamic Coalition, and we analyze how regulations are impacting both journalists and media organisations at large.

The new report that we have just launched, and I have a copy of it here with me, looks at the impact of Artificial Intelligence on news media organisations, especially public interest and independent media.  It tries to look at different facets, which I will get into later in the discussion, but particularly at how Artificial Intelligence is shaping the sustainability trajectory, especially for independent journalists; at the impact of AI policies and infrastructures on the sustainability of news media; and, more broadly, at how we can build more inclusive digital infrastructures and policies that ensure equitable participation from wide-ranging stakeholders, including journalists, media, civil society organisations, tech organisations, digital rights activists, but also Governments.

And I think we are really emphasizing the multistakeholder approach, which is also the spirit in which the Internet Governance Forum came about, and asking how we can ensure that the inclusion of multiple stakeholders is reflected not just in the process of how regulations are formed, but also in their outcomes.

And I will dive into that a bit later as we continue the discussion.  Thank you.

>> MODERATOR: Thank you, Surabhi.  I am going directly to Dr. Gupta on my right side, who has long-standing experience in the Dynamic Coalition on Digital Economy, but also experience beyond the Dynamic Coalition, having already run the global think tank Health Parliament.  You can tell us about that, and I hope you can link it to what Surabhi has said regarding the ethical use of emerging technologies.

>> RAJENDRA GUPTA: I thank the Secretariat for hosting this, and a round of applause for those behind it, Roman, Markus, Celine.  I will not forget to name them for making it happen.  The IGF is on the way to becoming an adult.  We are 20 years old, so we believe we have to be more responsible to drive the change.  As someone who drives the Dynamic Coalition on Digital Economy, we have always looked at addressing the basics.  While the narrative has shifted to Artificial Intelligence, the fact is that 2.6 billion people do not have access to the Internet.  That means we are keeping them out of the economy at a time when we live in a digital age.  That's the first thing we should address, because if we keep them out, not only are they losing out, we are losing out.

I believe that if we bring in those 2.6 billion more people, which is one third of humanity, we would grow phenomenally as an economy, with new ideas and innovations.  For me, that matters most.  The second thing, as I said in Berlin a few years ago at the IGF, is that it is a small number of large companies that drives the Internet.  If you really want to democratize it, it has to be a large number of small companies.  That brings up the question of the governance of the Internet, and the wonderful work that an organisation like the IGF and we as Dynamic Coalitions have to do.  We came up with a report this year, the Internet as a great equalizer; in a few days you will be able to download it.

And on the point of ethics: there is a lot of talk about how much Artificial Intelligence will add to individual efficiency, but every level of deployment will also lead to displacement, so we have to be mindful of that as well.  The digital divide still exists, but it should not lead to an AI divide, which would be far more dangerous.

>> MODERATOR: Thank you, Rajendra.  Picking up on the title of the document you referred to, the Internet as a great equalizer, I would like to turn to my colleague, Torsten, who will talk about equalizing the opportunities for children.

>> TORSTEN KRAUSE:  Thanks, Jutta.  To prevent a metaverse divide and ensure that all children can participate equally, I would plead for implementing child rights impact assessments.  But I would like to take a step back first, and come back to that later, and remind you that children are early adopters.  The Internet wasn't developed with children in mind, but children will use all of these opportunities, the applications, the services, the games offered online, and they will use them with fun and with joy, without that kind of precaution in mind, for socializing, for education, for entertainment.  But they face risks, and often they also face harms online.  To get a feeling for the scale, you have to know that one out of three users of the Internet is a minor.

So when we talk about children, we talk about minors, in general persons up to the age of 18, and the solution cannot be that we keep children out.  Children have the right to access media and to take part in the digital environment too.  That's why in 2021 the United Nations Committee on the Rights of the Child released General Comment No. 25 on children's rights in relation to the digital environment, which declared, described, and also prescribed how to realize the protection, provision, and participation of children on the Internet.  And if we say that the Internet was shaped without children in mind, then we are now in a position to do it better as we go further into the metaverse and emerging technologies, and I think we must do it that way.

And to realize equal participation of all children, without discrimination of any kind, we have to keep in mind the wellbeing of the child; the best interests of the child should be taken into account.  To realize that, and to work out how to realize it, child rights impact assessments could be a really good tool to bring this to the forefront.

And, yes, maybe I will leave it at that.  Thanks.

>> MODERATOR: Thank you, Torsten.  I think you provided the perfect segue to Janice Richardson, a colleague of mine for nearly 20 years, I would say, who was also a member of the Dynamic Coalition on Child Online Safety, as it was called before.  But now you are speaking on behalf of the Dynamic Coalition on Internet Standards, Security and Safety.  Please, Janice, you have the floor.

>> JANICE RICHARDSON: Thank you, good afternoon, everyone.  What is the IS3C?  Why do we join safety, standards, and security?  Well, first of all, because for me, for our group, standards are at the heart of safety and security.  The IS3C is a little different from most of the Dynamic Coalitions in that we have a number of Working Groups, usually about eight working at the same time, so that we can take a deep dive into various aspects: for example, quantum computing, the Internet of Things, procurement.

Together, these Working Groups can bring in knowledge, exchange ideas, and have a discussion, because when we look at the Internet, one of the big problems is that we are all working in our silos.  The cybersecurity people aren't speaking to parents; Government is not really listening to the voice of the user.  My own Working Group is on education and skills.  We have conducted a study, and we found an enormous gap between what industry expects and what university students are learning.  There are many gaps, and I think the question is: why don't we start working much more closely together in a hub, so that we can share information and learn from each other?  It's not happening, and it's really great to be here today because it looks like it is happening for the first time.

That's something we will go into further in this session, and something we would love to talk about.  Thank you.

>> MODERATOR: Thank you, Janice.  Your input prompts me to initiate a dialogue between Dr. Rajendra and Amrith, because you told me beforehand that you would like to talk about the fear of missing out on the opportunities of emerging technologies.  I'm sure you can explain to us why you want to talk about that, and then we can turn to Amrith to tell us the teen perspective on this kind of fear of missing out, or FOMO, for those who are not familiar with the abbreviation.

>> RAJENDRA GUPTA:  Thank you.  I think one of the things that happens with the Internet is not just the democratization of what goes around, but that it becomes viral.  Everyone gets to see things and watch them and gets so excited that there is a fear of missing out.  But I think there is one more fear that has started: the fear of starting late.  That's why we see every generation of people wanting to start now with a technology, even when some of those technologies have not yet reached a level of maturity.

We will not be doing things the same way as we are doing them now.  Two years back I hosted a meeting where we had a robot as a panelist; to some that may appear unusual, but the robot answered questions like a human being.  I think we are in for something good, but rather than the fear of missing out or the fear of starting late, I would encourage everyone, whether you are a teen or an older adult, to start playing with technology before you start using it, because at the end of the day we are living in a digital era.  And at the Dynamic Coalition on Digital Economy, we believe in Internet for all and livelihoods for all.  Thank you.

>> MODERATOR: Thank you.  Amrith, would you like to come in on that as well?

>> AMRITH KUMAR:  To respond to that, we have to understand that teens are playing an important role in regard to all of these technologies.  The main topic is emerging tech, so we are talking about innovation, AI, and other technologies, but before we can talk about the future of these technologies, we have to understand who is shaping them and, as we are raising here, who is systemically excluded from doing so, despite being at the front line of these digital landscapes.

Teens are playing a large role in how these technologies are being shaped, and, as my fellow panelist Torsten mentioned, teens are accessing these technologies and these online spaces, shaping science and international discourse, and taking part as creators and innovators.  But the age barriers in place make it nearly impossible for us to be included in governance spaces, as our audits have shown.  What this shows is that without proper inclusion, teens cannot partake in governance, and ultimately it is our right to do so.  Those are some of the challenges in place, and it is therefore important that this changes.

>> MODERATOR: Thank you, Amrith.  I would like to turn to Dino again, because Amrith mentioned the systemic exclusion that could also lead to the emergence of a metaverse divide.  Do you think we are at risk of such a second digital divide with regard to emerging technologies?

>> DINO DELL'ACCIO:  Yes, definitely.  Again, let me use two concrete examples.  The Dynamic Coalition on Emerging Technologies has started some significant work in identifying those instances and trying to document and acknowledge the solutions that have been found.  First and foremost, again as a representative of the United Nations pension fund within the Dynamic Coalition on Emerging Technologies, we had a concrete example: the UN pension fund in 2021 implemented a digital identity solution for proof of life.

It is basically an application using emerging technologies such as biometrics, facial recognition, Blockchain, and Artificial Intelligence to enable more than 72,000 individuals of the UN in 182 countries to confirm every year that they are still alive and, therefore, to confirm their eligibility to receive payments.

When we went live, we used an application that can be downloaded on a smartphone.  Very soon we realized that, in certain parts of the world, there are users who do not have a digital device, or who, even if they have one, cannot use it in an adequate manner.  Practically speaking, they use the device for voice calls; they cannot afford to turn on a data plan.  That was a concrete example of a digital divide that we had originally not identified.  We immediately had to think out of the box and become creative.  That's when we came up with a solution, in collaboration with other UN entities, the United Nations International Computing Centre and some of our colleagues at the UNDP, the United Nations Development Programme: to install a kiosk mode of the application.  This means equipping the field offices of the United Nations in those areas of the world where we had identified those limitations and constraints, so that the retirees living there who do not have the possibility of using a smartphone can visit the office and perform the proof of existence.  The second example is that the Dynamic Coalition on Emerging Technologies is hosted by Innovation Network Global, a Canadian organisation based in Vancouver that started to experiment with solutions for the use of emerging technologies by the indigenous population of Canada.

They also encountered a situation where members of the indigenous population did not have access to public health services.  With them, they started experimenting with a solution using remote sensing, to enable them to obtain some sort of diagnostic that could be sent to a specific healthcare provider, who could then provide the services they need.  So indeed, to answer your question, there are cases where unfortunately we still experience the digital divide.  At the same time, there are ways to resolve it.

>> MODERATOR: Thank you.  The solutions you have described show creativity, but will they still fit the standards that the Dynamic Coalition on safety, security, and standards is asking for?  What do you say, Janice?

>> JANICE RICHARDSON: Yes.  First of all, we need to understand what standards are and why they are important, because they should be guiding the way emerging technology is developed.  They need to be very flexible to encourage innovation, but they also need to be well known and taken up, not just by the people developing the technology, but by all of us.  I go into schools and ask: can you cite five human rights?  I have very rarely found someone who can actually cite them.

I think it's the same thing here: standards are important, rights are important, but what use are they if people don't understand them?

We did a study three years ago, the study I just mentioned, and it actually came up with something very interesting.  In the cybersecurity and technical industries, the very big downfall of young graduates is that they don't understand the architecture of the Internet or the architecture of the cloud.

We have all become superficial users, but we don't really understand what's underneath.  And this, I think, is a different sort of divide.  Most of us here really understand what the Internet is built on and what can happen if we do this or that; but the people who don't understand that, who don't understand the impact that the Internet and digital technology are having on their lives, they are the ones being left behind in the digital divide.  So that's my point about standards: they are crucial, but they have to be well known, otherwise they are useless.

>> MODERATOR: I think we will come back to human rights and standards when I turn to Torsten, but first, Rajendra, there is also an economic aspect to a further metaverse divide or digital divide.  Please.

>> RAJENDRA GUPTA: The challenge I see today is that technology is caught between two P's, people and profits.  The third P, purpose, is missing.  What we need to look at is that some things have to be classified as digital public goods and digital public infrastructure.  If we do that, as Dino has done at the UN in using Blockchain as a digital public infrastructure, then profits will still be there, but purpose will be above profits, with people at the centre.  I think that's the biggest thing; otherwise the digital divide will worsen, and there will be a disproportionate aggregation of money in the hands of a few.  Thank you.

>> MODERATOR: Torsten, my question would be: how do we prepare children, teens, and youth for the impacts of the algorithms of the future, which may also be shaped by a divide in which not all have access to the same opportunities?

>> TORSTEN KRAUSE:  Firstly, if it's possible, I would like to refer to what you said, Janice.  Yes, I agree that children, and adults too, often do not know what rights they hold and which rights exist.  But to put it differently: children may not know that something is a right, yet if they express their feelings and interests, you can find that what they express corresponds to a right, even if they don't know it.  I hope I have expressed that in a way you can follow.

But how do we prepare children so that they don't miss out?  I would say that to raise a child in today's society, in a digitized world, means to raise them with media literacy too.

As I said, children are early adopters, so there is a duty to prepare them to use and understand these tools and opportunities, and this duty is a shared one.

It is, of course, a duty of the parents of the children, but not theirs alone.  It is also a duty of the adults surrounding the children, of educational staff, of other responsible persons, but also of the services, of the organisations and firms that offer these applications, these games, these opportunities, and so on.

So I think there is a need for a mix of measures to prevent this gap, and it will be an ongoing process of teaching media literacy.  From my perspective it should be part of kindergarten and preschool and continue through to high school, because it is an ever-evolving system and environment; it is not something we learn once and can live on for our whole lives.  It is an ongoing process, and it should come together with accompanied media usage, especially when we and children go into an application, a service, or a game together, to experience it together, to see what is happening there, how you can get the joy and the fun out of it, but perhaps also the risks and harms.  So we could put it simply: strong children are raised by strong relationships.

So we have to listen to each other, to follow what we are doing, and to be open to each other's experiences, keeping a kind of trusted relationship.

But we are also facing services that are not created in this way, so it is necessary to place responsibility and accountability on these service providers, and to hold them accountable for safety by design and for precautionary measures.  As said before, we could try to work this out with child rights impact assessments, but we can also put other measures in place; age assurance measures, highly discussed around the globe, are just one example.

>> MODERATOR: Thank you.  I think we can come back to that later in the session.  Please be prepared if you have any questions for the panelists or for other representatives of Dynamic Coalitions who are here in the room; I will open up the floor after Surabhi has spoken.  Torsten has been speaking about children's rights to freedom of expression and access to information, as laid down in the UN Convention on the Rights of the Child, Articles 13 and 15, but these are also general human rights on which the Dynamic Coalition on journalism has put a special focus.  Do you think these rights to freedom of expression and access to information are under pressure from new emerging technologies?

>> SURABHI:  It was useful to listen to the panelists and tie in the points that I wanted to bring here on behalf of the Dynamic Coalition on journalism.  Yes, for us freedom of expression is certainly a very important journalistic standard that we hold dear, for media and for anyone working in the media space.  And access to information has suffered a huge blow, not just in terms of the digital divide but in terms of access to factual information, as we deal with more and more misinformation and propaganda, but also AI-generated content.  When it's hard to sift through what is really true and what is fake, there is, of course, a huge infringement of the right to freedom of expression, but also of access to information that can save lives and is useful to us in a meaningful way.

And we are now suffering from information overload in an increasingly polluted information ecosystem.  Hence it is important to understand this intersection of access to information and freedom of expression with emerging technologies like Artificial Intelligence.

I also wanted to refer to something one of the panelists tied in: a paper that appeared in our latest report, by Jenna Fung, in which she talked about Gen Z's access to information and news on social media.  While they are experiencing FOMO, it's important to recognize that they do not necessarily have access to the best news or information out there.  There is a lack of nuance, and often a lack of perspectives and plurality, in the kind of information they are receiving, and, of course, we are all immersed in our own echo chambers when it comes to accessing and using information.

So it really needs to be said that young people also need to be part of the processes in which we decide how our information systems are governed, how these ecosystems are regulated, and how AI-generated content can be addressed or regulated in Internet spaces.

Another point to mention here concerns the ethical use of AI.  Going back to FOMO, there is also a lot of pressure in newsrooms and among news media outlets to start using AI tools, and while there is a fear of missing out, it's important to take a step back and think about the ethical, mindful, and responsible implications of using these AI tools.

We often do not have the space, resources, or time, given the pressure under which we work, to really think about how these AI tools are being developed.  Where is the funding coming from?  Who are the people behind them?  Which companies?  These questions may be controversial and provocative, but they are important when we try to understand how AI tools are being deployed and how we use them, as consumers and, in our case, as media makers globally.

So it's important to understand these aspects, and I think this comes back to the platform accountability questions, but it does have long-term implications, and we are already seeing them for freedom of expression and access to information.

>> MODERATOR: Thank you for these very relevant thoughts.  I do think that journalism is, on the one hand, under pressure, as you have been describing, but journalists are also, in a way, the gatekeepers giving people access to information.

So you are in a double role, I would say.  But turning to Amrith: do you think that you, as the teen coalition, or teenagers at large, are in a perfect position to exercise your rights to freedom of expression and access to information?  Or where do you feel there are certain barriers, or even a divide?

>> AMRITH KUMAR:  I think we can take a look at this from a rights‑based approach.  So as mentioned before, we have frameworks like the UN Convention on the Rights of the Child, and we have General Comment Number 25.  So these ultimately affirm that teens do have the right to access, create, and participate in digital life.  So now it comes to ensuring that these frameworks and these digital governance models have the pathways that allow teens to partake in these discussions.

One challenge we are experiencing in this regard is that the Internet Governance Forum and other multistakeholder institutions systemically exclude teens, because the way youth is defined is not very inclusive.  At the UN and in nearly every global structure, youth begins at 18 and stretches to 35.

This essentially conflates youth with young adulthood and even middle age.  And as you can see, it completely erases teens.

And the issue is that this is not merely symbolic; it is structural.  As we have seen in our national and regional audits, the IGF 2025 youth mentorship programme only accepts participants aged 18 plus.

This is just one example of how every teen innovator under 18 is excluded, regardless of their impact, experience, or readiness to contribute, the aspect of readiness that Dr. Rajendra and Janice mentioned before.  It is part of a pattern we have been noticing: various youth programmes and initiatives show the same systemic issue, where the age simply to participate is set at 18 plus, or the age category is left undefined altogether.  The issue is that this creates quiet but, as you can tell, very powerful barriers that keep teens from participating in digital governance models.

And I think that's an important issue that needs to be addressed, to ensure that teens are inclusively represented in these spaces.

>> MODERATOR: Just to make this clear: Dynamic Coalitions are open to everybody.  There is no age limit to join a Dynamic Coalition, as is also demonstrated by the existence of a Dynamic Teen Coalition.

We also have a Dynamic Youth Coalition on Internet Governance.  So I do think that we try to be as inclusive as possible, but maybe we have voices from the floor, not only on this aspect but also on all of the other issues that have been brought up by the Dynamic Coalitions here on the panel.

We have two microphones here, and one in the back as well; the microphone can be brought to you.  Just raise your hand and you will be given the floor.

Maybe we have someone in the room, and we will also have a look at whether we have questions from online participants, so as not to exclude them.  You want to have the floor?  Oh, I thought you had raised your hand.

Let's go first with the panelists, with regard to the inclusiveness that we are providing through the Dynamic Coalitions' work.  Would you like to go first?

>> DINO DELL'ACCIO:  The concept of inclusiveness is foundational, and it was part of the design.  One of the common terms referred to when talking about digital applications is privacy by design.  In our case we added to that inclusiveness by design, especially when dealing with potentially disadvantaged groups, as in the case of the UN pension fund with certain demographics.  That level of inclusiveness was, if you will, on a different level: it was not just about age, it was not just about geography, but also about the potential level of difficulty or ease of use.

So, adopting a human-centric approach, we definitely focused on making sure during our testing that we could capture whether the user interface and the client's experience with the application were such that anybody could understand what was expected of them when using it.  But going beyond that, we also realized that it's not only about using the application, but also about anticipating certain fundamental questions.  For example, in an application that uses biometrics, an instinctive question would be: where is my biometric profile saved?  Is it protected?  Is it transmitted during the use of the application, and so forth?

So the concept of inclusiveness went beyond just making sure everybody could use the application; it meant that everybody could use it in a meaningful way, able also to understand what they are doing with the application and what its pros and cons are.

>> MODERATOR: I do think, Torsten, that you have a certain position with regard to the biometric data of children as a vulnerable group.  Would you like to get into that?  And then I will turn to you, Rajendra, with regard to the economic effects of what Dino just said.

>> TORSTEN KRAUSE: You are right, the data of children are protected by the General Comment and by the Child Rights Convention.  And it is not just the data of children that are protected; of course, we have several legal systems, laws and constructs in place to protect the data of all people.  But when it comes to the data of children, I think we have to raise the bar.  We have a higher level of protection, and when it comes to the biometric data of children, we have to keep this in mind.

And so I think we have to develop and establish systems that are not, in general, based on this data, so that children are able to decide if they want to provide this data and are not excluded from the application, from the service, from the offerings, if they decide not to do so.

And that's why I think it's necessary that we put this also as a kind of standard.  Maybe it is also a standard that it's not obligatory to provide your biometrics.

>> MODERATOR: Yes, thank you.  So I gather from your answer that we all sometimes come into the situation where we have to decide whether we give away our biometric data, or more data than we wanted to give away, on the one hand, or otherwise we are excluded from using a certain service.  This is also an economic question, isn't it?

>> RAJENDRA GUPTA: I don't look at a few decades, I look at entire millennia.  This is interesting for all of us to understand.  If we look at the last two thousand years, in the first few centuries any economy that was strong in agriculture was a strong economy.  India had 27% of the world's GDP.  Then whoever had a strong military or firepower was the strong economy; no wonder the U.K. and U.S. started controlling the world.  In the times we are living in now, it's about trade, so we see China as a world power.  But the future we are transitioning to is technology, so I think anyone who is strong in technology will control the world.  For all of us, one facet we have to understand: this is not a technology change, this is a societal and civilizational change.  Unless all of us move to this and start using technology, it will be a big disservice to society and the economies.  It's an economic imperative, but it's also a societal and civilizational change.  Everyone has to be on the Internet and using technology.

When it comes to the digital economy, still one third of people have no access to the Internet.  We are missing a huge economic opportunity.  Thank you.

>> MODERATOR: Surabhi, you have the floor.  I know you have to leave early, so use it also for your final statement.

>> SURABHI: I wanted to make a point on the financial aspect of the impact of AI and emerging technologies on media and journalists.  In our paper we interviewed media outlets, primarily smaller media outlets that are doing more public interest work in global majority and Global South countries, and in the paper, which is included in the report, we find that the cost of accessing these AI tools is quite prohibitive for a small media organisation.  And, again, because of FOMO, there is this whole pressure of using certain AI tools and buying subscription plans.  Or, if you are doing local investigative journalism, there is an increasing reliance on data analytics, and for that you need certain kinds of tools to get the latest insights.

And all of this has an added economic cost for newsrooms, and especially if you are a newsroom that is struggling with financing and funds, this adds an additional cost burden, and you have to make tough decisions about whether you are going to use a certain AI tool or not, or what impact it might have on the news story you are trying to produce.  So I think it's really important to also look at the financial angle of what these emerging technologies are doing in shaping the financial viability and sustainability of news firms in the long term.

Since I unfortunately have to leave for another session, I do want to mention maybe one last point on platform accountability.  I think we need to start moving beyond just looking at access, or at gaps in content moderation or online censorship, and really advocate against the concentration of power and the monopolization of these technologies.  If you have read the book Empire of AI by journalist Karen Hao, which was released last month, or if you follow the journalist who is writing a Substack on how to fight the Broligarchy, you understand who the companies shaping these tools, the "empire of AI" as it is being called, are.  And it demands that we as media makers and journalists, but also wide-ranging stakeholders, stand up and ask the question: why do they get to hold so much power over a tool that will change not just the shape of our lives but the trajectory of human civilization?

So I think that's something we as journalists and media makers need to start questioning, along with everyone who is interested in how AI governance needs to shape the next few years and decades of the work they are doing on Internet Governance.  Thanks.

>> MODERATOR: Thank you for your input and also for being with us on the panel.  I do think that you will also be able to be at the Dynamic Coalition booth for those who want to pose a question to you.

Do we have a question from online participants?  No question.

>> RAJENDRA GUPTA: If there is anyone in the room, please.

>> AUDIENCE: For Timothy Holborn, maybe it can be put online so he can ask the question.

>> MODERATOR: Can we get the person speaking?

>> AUDIENCE: His name is Timothy Holborn.

>> MODERATOR: Can you open the microphone to the Zoom participant?  The technicians are working on it.  They can type it, but perhaps someone from the room will come in.

>> AUDIENCE: Hello, I'm Timothy Holborn, I'm Australian.  I have been working on W3C standards and royalty-free technology to support the means for people to own, not merely be a licensee of, the thoughtware, and to be able to have electronic evidence to support human rights.  Some of the work I have been involved with includes verifiable credentials, which you might know as digital identity.  I think the point about children was at the heart of our work, and I'm very sad to say that after over a decade, 15 or so years, those objectives still seem too far away.

There is a fundamental piece about technology, about Artificial Intelligence, about how we are collecting an increasingly high-resolution picture of who we are as people, of our consciousness, of our mindware, of our relationships, of our footprint.  But the infrastructure to make sure that it's actually owned by us as natural people somehow seems to have a lot of friction.  Part of that might be intellectual property and trade, where it's kind of like people falling over something that is a physical thing, rather than the fact that we are sharing this conversation and so have implied and express moral rights around that.  Perhaps those sorts of things need a different series of considerations.  But in my technology work, I wonder which is the best jurisdiction to do this sort of work, because at the moment, if I did more, I'm concerned it would just end up in a bunker or perhaps on Facebook.

So that impacts adults, as well as our ability to be parents and to protect our children.  I'm not sure if that question is clear: what jurisdiction in the world is leading the way in how the natural rights of people, the human rights of people, the fundamental needs of agency and personhood in the digital realm, those sorts of things, are able to be preserved through this wild ride of digital transformation?  Thank you.

>> MODERATOR: Thank you for your statement and for the question you have put into that statement.  I do think it delivers a perfect segue to talk about the multistakeholder model of Internet Governance, because I don't think we would find one jurisdiction that is in the perfect position to regulate this, but as a multistakeholder community we can address it.  What do you think about that?

>> RAJENDRA GUPTA: I think it's an important point.  We have moved from youth to children to teens.  I think we are in Gen Beta; anyone born on 1st January 2025 is Gen Beta.  So the bigger issue is ethics, not regulation.  I think we overly focus on regulation and standards and give less importance to ethics.  What we need is a course on ethics for people who work in technology, so that we first understand ethics, then get to the regulations and frameworks.  Without ethics, no regulation is going to work.

>> MODERATOR: Okay.  Dino, please.

>> DINO DELL'ACCIO: Thank you.  Very, very good observation, very good question.  I can answer from the point of view of conforming to a standard rather than complying with a specific regulation or jurisdiction.  And here again, a concrete example: the UN pension fund, being an international organisation, is not subject to any specific national jurisdiction, but we are fundamentally required to demonstrate that we practice what we preach vis-a-vis the principles, values and resolutions of the United Nations.

My colleague Janice earlier emphasized the need to have standards, but even more the need to have standards that are understood.  In our case we took this to heart, and we did not only adopt and implement emerging technology, but we also went further in making sure that the use of those emerging technologies was being done responsibly, in accordance with the standard.  So I cannot answer at the level of jurisdiction, but I can answer at the level of technical standards.

We certified the application in accordance with international ISO standards, and specifically, most recently, the ISO standard on the responsible use of Artificial Intelligence, ISO 42001.  I will spare you the reading of the tens and hundreds of pages of the standard, but focus on two things that are expected of organisations that want to demonstrate the use of AI in a responsible manner.  There are two: one, to explain, and two, to be transparent.  In so doing, the best practice in order to address what Janice was saying before, how to make that conformance to a standard understandable, is to adopt a standard that is technology agnostic, that is not based on a specific LLM or a specific technology, but which is process based and principle based.  This is where, for example, we had to explain first and foremost how a complex technology works and, two, to do it in a transparent manner by demonstrating how the use of that technology was done in an ethical manner vis-a-vis the principles and values of the United Nations.

>> MODERATOR: Thank you.  I'm turning to Janice.

>> JANICE RICHARDSON: We are talking about multistakeholder, but there is one thing because my background is in education.  There is one thing that absolutely intrigues me.  For the tech industry, what is education?  A consumer.  We know with the data sets that they develop, with the products that they create, that they see education as nothing more than a consumer.

And I'm waiting for the day when they become real partners.  Who in this room works in education?  I think you see my point.  There are, what, four of us, five of us.  But the day that education and the tech industry become real partners, not just multistakeholders which is nothing more than a term, then we are going to see a big difference.

My next point, we did talk about societal change, and there is one huge change under way, but we are not, we don't seem to take it into consideration.  We are throwing everything nowadays at education.  We are all expected to be able to look after our own cybersecurity online.  But who is actually doing the teaching?  And do these people really know what they are talking about?

Are they really up to the latest in what is technology?  On multistakeholderism, my third point would be: just look at AI.  This is a fabulous case in point.  UNESCO has developed, I can't remember if it's a charter or whatever; the European Commission has developed the AI Act; the Council of Europe is developing some other sort of Act or Convention.  But why don't we have this joined up, with the institutions actually working together and producing a single set of standards in this very important area, standards that are simple enough for all of us to understand and, therefore, for all of us to file complaints and make sure our rights are respected, but flexible enough that we can keep creating and keep this technology emerging until we get to a point where everyone can have access because it's free?

>> MODERATOR: Thank you, Janice.  Torsten?

>> TORSTEN KRAUSE: So many thoughts are going through my mind; I'll try to sort them out a little bit.  But first, maybe let me draw your attention to the Convention on the Rights of the Child.  Children have the right to participate, so the voices of children must be heard, and not just heard, but seriously taken into consideration.  And when we come again to the digital environment, the United Nations declares that all children should be involved, that we should listen to their needs and give weight to their views when developing legislation, as you talked about, policies, programmes and services.

And that happens on several levels.  If we are talking about legislation, then I would like to see, and the UN would also like to see, that children are involved, that their perspective is brought in and taken into consideration.  And also when new services and applications are developed, why not talk to children?  What's the idea?  What's maybe the way to create this new application?  Bring the perspectives of children into the development process, instead of reacting afterward when it's rolled out and we see that there are hindrances, there are hurdles, maybe it's creating harms, and then redeveloping it afterward.

So why not do it beforehand.  And it also touches our multistakeholder model.

I think, and don't get me wrong, I really appreciate the work of Amrith and the Dynamic Teen Coalition, and the youth coalition, but from a child rights perspective you could also see it as a kind of exclusion of youth and minors within the Internet Governance ecosystem.  Jutta said all Dynamic Coalitions are open and we want to be inclusive, and I'm not aware of the situation of every Dynamic Coalition, but are minors and children present in every Dynamic Coalition?  Can they bring in their perspective?

One side of the coin is that we implement a system that is inclusive and open, but the other side is to create a situation where children are really invited, really find their place at the table, and their voices will be heard.

And I think in this regard, we can increase participation in this model in the coming years.

>> MODERATOR: Thank you.  I will give the floor to you in a minute, Amrith, to react to that suggestion, and then I will turn to all of the panelists to ask what your key takeaway from this session is.  But first, let me take off my moderator hat and answer the question that came from the online audience, in light of what Janice said before in regard to education.  When we refer to the AI Act of the European Union, which might also become a model framework for regulations across the world and in other jurisdictions, what we don't have in the AI Act is any obligation for Governments to provide education in regard to AI tools for citizens at large.

There is only an obligation for organisations and companies that are deploying AI tools to educate their staff, but no obligation for the general public to be educated, so no obligation for Governments.

And that's kind of a missing link.  Also, children are not mentioned at all in the AI Act, nor are children's rights.  So we have there some regulation at the European level that does not fit all expectations, I would say, and, therefore, it's necessary to go further and to try to find, also in the multistakeholder environment of Internet Governance, ways forward in this regard.

And now I'm turning to you, Amrith for your final remarks and please refer to key takeaways.

>> AMRITH KUMAR: Thank you.  So as we close the session, I think one common takeaway is that governance only succeeds when it's truly representative.  We have a multistakeholder ecosystem, and it should include digital participants of all ages, sectors and experiences, and bring their perspectives forward to co-create meaningful solutions.

And that is the essence of inclusive governance.  It's representative of the dialogue we are having here today and that is the promise of the IGF.

Just to address one point, as Torsten mentioned: while the DCs may be inclusive to an extent of teens as well as teen representation, that may not always be the case across the IGF, as we have found in our audits.  There are still age restrictions, 18-plus regulations, that as a result exclude teens, and that's what we want to address.

So over the past three years our DC Co-Chair developed a model that reflects how people engage across the life span.  Within this model there are five subcategories.  0 to 12 is childhood, which is focused on safety, care, and support of expression.  13 to 19 is teen, an entry point for digital rights, civic agency, and active participation, connecting to the UN Convention on the Rights of the Child and General Comment No. 25.

Then from 18 plus to early career, we have a bridge to leadership, mentorship and opportunity for those within these categories.  Then we have mid-career, which expands the impact of collaboration and decision making, and finally senior career, where we have legacy building, structural guidelines and intergenerational mentorship.

The goal of this model is to create an inclusive, lifelong multistakeholder framework with not just teens included, but a rights-based pathway for all stages of life.  This is a common message across all of the panelists we have heard today, that we must create an inclusive environment, and we believe this is the way forward across the general IGF ecosystem.  Thank you.

>> MODERATOR: Thank you.

>> RAJENDRA GUPTA: These are not emerging, they are promising technologies.  So the first thing is we have to get our heads clear that AI is a promising technology, along with others that we don't talk about much.  The second, on the governance point, I think we need an ethical framework.

I also see a FOMO among regulators in terms of regulating AI.  Everyone is jumping to regulate AI, and overregulation will kill innovation.  I think what we need is a multistakeholder ethical framework for regulating AI, and I think the IGF is in the best position to handle that objective in a fair manner.  Thank you.

>> DINO DELL'ACCIO: My concluding remarks would be acknowledging that when dealing with emerging technologies we fundamentally have very simple questions, but at the same time very demanding questions.  Simple, such as those alluded to before: the ISO standard on the responsible use of AI requires that an organisation is able to demonstrate that it can explain the use of the technology and be transparent about it.  But, of course, explaining how an algorithm works in an AI system and application is very challenging, very demanding.  I think that with the collaboration and the principles that have been shared and expressed by the distinguished panelists, we can achieve that.  Thank you.

>> JANICE RICHARDSON: We are here to talk about governance, and effective governance doesn't rely on regimentation, in my mind.  What it does rely on is the accountability of every single user in the world.  It depends on fostering trust amongst all of those users.  As someone rightly said, this is a lifelong path; a child needs to learn about human dignity from the cradle if it is going to be implemented in human rights later on.

So I go back to it.  We need to inform, but really inform.  We need to educate, but we need to build the agency of every single person so they understand what accountability means, and then I think we will have a very different Internet.

>> MODERATOR: Thank you, Janice.

>> TORSTEN KRAUSE: Thank you very much.  There is a question, so maybe it's not the last word.  I think we are facing new challenges with the upcoming metaverse, with emerging technologies, with Artificial Intelligence, with Web 4.0, and these challenges are huge.  So we have to find solutions.  And on the other side, we have our fundamental rights, the fundamental human rights, and I believe that we will find the answers to these challenges in our fundamental human rights.

So technology should serve humans.  It should serve us, and it should respect our rights.  In that way we can level up our societies and gain from the opportunity.  I think that could be a way.

>> MODERATOR: Thank you.  So we have final words from the floor.

>> AUDIENCE: Yes.  Thank you, Jutta.  There is one comment that we have online from Stacy, who is saying that teens should also become part of the Youth DIG, and perhaps even at the IGF, so that they are more involved in the youth IGF than they are now.  And taking off my hat as remote moderator, listening to you all, I think there is a strong message coming out of the different DCs which, at a higher level, is one message: that we need to become more integrated, more aware of what we are doing, and perhaps be able to submit a strong message as Dynamic Coalitions at the end of this IGF on what emerging or promising technologies are and how we should deal with them, because I think we have heard the message already.  Thank you for that.

>> MODERATOR: Thank you, Wout, for that final statement.  Thank you to Stacy from the online participants, and thanks to all of you who have been taking part.  If you want more information on the work of the Dynamic Coalitions, just come to our booth, where you will also get information about three further workshop sessions run by Dynamic Coalitions.  You are also invited to the Dynamic Coalitions main session, which takes place on Wednesday morning at 9:00 in the Conference Hall.  Thanks to all of my panelists and to all of those people who have been very, very helpful in the preparation of this session.  Thank you.  See you later.

(Applause).