IGF 2022 Day 0 High-Level Leaders Session II

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.



>> SHIVANEE THAPA BASNYAT: A warm welcome.  We hope lunch was good and that it has set you right in spirits and ready for another engaging session here at the auditorium.

I'm Shivanee Thapa Basnyat, Senior News Editor at Nepal Television, and I'm so, so privileged to have the pleasure and honor of steering you through one of the high‑level track sessions here at the IGF 2022.

Now, before we get the session started, I have a couple of house announcements to make.

Number one, interpretation: as stated earlier in the morning, we have interpretation in six different languages.  For English, please tune in to channel 1; for French, channel 2; for Arabic, channel 3; for Spanish, channel 4; for Russian, channel 5; and for Chinese interpretation, please tune in to channel 6.

Second, the time constraint.  We'll be running deliberations here at this very forum for about 120 minutes or less.  I beg the pardon of all our distinguished speakers, even before we begin the deliberations, and ask them to kindly pay heed to the time limits that have been suggested to us.  That is what I will be doing, following the flow chart that I have been given.  I seek your cooperation throughout the deliberations, as much as we are looking forward to great insights from this very reputed panel.

Ladies and gentlemen, it is a privilege, in fact an honor, for me to take this floor as the session moderator.  Deliberations are so, so central to governance, as you all know, and I'm sure you already know we're talking digital rights.  Digital rights, as you all know, have increasingly become one of the most critical aspects of Human Rights in recent years, and they will continue to play a larger role in helping the human race flourish amid everything happening around us in the increasingly digitalized world that we are getting to live in.

You know, to normalize and to defend various digital rights is one of the greatest challenges facing the entire globe, with equal magnitude and momentum everywhere.

The risk that faces us, should we ignore this dimension, is the severe erosion of our collective Human Rights and security.  How important a comprehensive, common apparatus to address these issues is, and how challenging the road ahead is should we choose to tread this particular path, is the focus of the deliberation that we will be presenting in front of you.

I am very much looking forward to this, because this is a panel of distinguished and eminent personalities representing multiple stakeholders, and I'm sure the deliberations here will represent the multiple realities that we are living and having to deal with in our day‑to‑day lives, as we bring in our individual identities at this stage of governance in the sector we are discussing here.

As multiple efforts are ongoing at various levels to address this critical aspect of Human Rights and global wellbeing in the technology and digital world, we will try here to bring forth the nitty gritty and dissect the entire topic in front of you.  What we will be focusing on is what different stakeholders can contribute to the standardization and the defense of digital rights.  This is the issue I will seek to discuss with our distinguished panelists right here with me, and I cannot wait, ladies and gentlemen, to introduce our distinguished panelists who are here with us, along with a couple of other panelists who will be joining us online.

My pleasure of inviting and welcoming Her Excellency, Ms. Karoline Edstadler, Minister for the EU and the Constitution of Austria.

Her Excellency, Monica Mutsvangwa, Minister of Information, Publicity and Broadcasting Services, Zimbabwe.

Mr. Vint Cerf, Chief Internet Evangelist, Google, and Chair of the IGF Leadership Panel.  It is a great pleasure to have the father of the Internet here with us today.

Thank you for honoring us, sir.

>> VINT CERF:  It is a pleasure to join you.

>> SHIVANEE THAPA BASNYAT:  Amandeep Singh Gill, UN Secretary‑General's Envoy on Technology, our pleasure again.

Equally pleased we are to be welcoming Lucio Adriano Ruiz, Secretary, Dicastery for Communication, Holy See.

Thomas Schneider, Ambassador and Director of International Affairs, Federal Office of Communication, Switzerland.  We're privileged to welcome you here, sir.

Benjamin Brake, Director General, Data and Digital Policy Department, German Federal Ministry of Digital Affairs and Transport.  Pleased to have you here.

Mr. Viktors Makarovs, Special Envoy on Digital Affairs, Latvia.  A pleasure to have you here, sir.

Equally pleased we are to welcome Dr. Alison Gillwald, Executive Director, Research ICT Africa.

Likewise, our pleasure and privilege again to be welcoming Associate Professor Iginio Gagliardone, University of the Witwatersrand.

Also pleased to welcome here among us the technical expert Tewodros Besrat, digital payments, AfricaNenda.

Likewise, we're pleased to have with us  Irene Khan, UN Special Rapporteur for Freedom of Opinion and Expression.

Equally pleased we are to be welcoming Ms. Mazuba Haanyama, Head of Human Rights Policy for Africa, the Middle East and Turkiye at Meta.

Ms. Thoko Miya, Founder, Startup Thoko.

Last but not least, Mr. Volker Türk, UN High Commissioner for Human Rights.

It is a pleasure to have such distinguished personalities here with us.  The list is long and the time is pretty, pretty constraining.  I think that's the only constraint that we have.

My job here is quite simple.  I will be throwing out a few questions related to the topic that we are here to deliberate on.  I request all our panelists to use their time limit to give their most important statements or remarks on the questions asked, as I call your name.

Please pardon me if I have to interrupt you; if you exceed the time limit, I will have to thank you and then move on to the next speaker.

I hope you would understand the constraint.

To assist you, to check the time, we have our friend seated right here who will show you this placard just to keep you on track.  Thank you, Brian.  Thank you so much for being so helpful.

Without further ado, we head straight ahead towards the questions.  Beginning with data, which again has become so, so central to our day‑to‑day living and I'm sure it will take the world by storm in the days to come.

Now, moving to the panelists, beginning with Mr. Vint Cerf, Chief Internet Evangelist at Google and Chair of the IGF Leadership Panel himself.  The first question, sir: in this era where data is an essential tool used in virtually all sectors to make progress, is it realistic to advocate for us, as individuals, having true agency over the ubiquitous use of our data?  What are some of the emerging data justice challenges in your region?

>> VINT CERF: The first answer is yes.

Would you like me to elaborate?

First of all, data is needed in order to transact in the digital and online world.

Second, transparency is both needed and deserved so that people understand what data is needed, how it is used, how it is shared, and whether and when it is discarded, along with the ability to correct data which is incorrect.  We must hold those who hold the data accountable and responsible for its protection.

Some data deserves higher levels of access control.

I'm listening to myself talk two seconds later.  It is quite confusing.

Some data deserves higher levels of access control: medical data, passwords, other authentication credentials.  So we know there are varying levels of protection that are needed, and there are challenges.

Some data is threatened by malware, phishing attacks or ransomware.  Data is at risk and we must protect it.  We must improve its resilience.

Finally, if we're going to demand accountability, we need international cooperation to achieve that objective.

I'll stop there, Madam Chairman.

>> SHIVANEE THAPA BASNYAT: Thank you, sir.

May I move on to Dr. Alison Gillwald with the same question.

>> ALISON GILLWALD: (Poor audio quality).

With regard to the question of data justice, I think we have seen in the regulatory and policy point of view.

Data related harms have largely focused on ‑‑

>> SHIVANEE THAPA BASNYAT: Could you kindly move the mic closer.

>> ALISON GILLWALD: Can you hear me now?  Sorry about that.

From a policy and regulatory perspective, we're not actually seeing just data outcomes.  I think we see in policy and regulation that there is a safeguarding of more individualized notions of the personal, with an individual focus on privacy and on Freedom of Expression.

I think that COVID has highlighted the need for us to look beyond these individualized notions of privacy to more common notions of public good and public interest.  I think, you know, this needs to be done by extending our regulation to more expansive approaches that are needed to look at more at the governance in terms of collective interests or the common good.

Of course, while also preserving individual privacy.  This on its own will not produce just outcomes, however.

What we ‑‑ (no audio). ‑‑ it looks mainly at breaches and the positive forms of discrimination, in favor of those who are currently marginalized and would enable them ‑‑ and we'll come back to that.

>> SHIVANEE THAPA BASNYAT: Thank you.  I'm sure you have great perspectives to add to this.

>> IGINIO GAGLIARDONE: I will go straight into it.  One of the challenges that I see in the data justice debate is that a lot is skewed in the direction of data sovereignty, which is legitimate, because it takes into account the enormous imbalances of power between actors in the Global South and in the Global North.  It becomes a problem when it puts national interests at its core and sort of overlooks how this may drown individual sovereignty, individual agency over data.

Maybe the extreme of this polarization is Internet shutdowns, where national interests get in the way by controlling the flows of information, preventing individuals from having access to the whole Internet or part of it.

What I will say is that it doesn't have to be this way.  If we look at some of the claims for more user agency, there are claims about value: users cannot tap into the value that they produce, while companies can.

And it is not that different from the claim that governments are making about their inability to tax companies that refuse to register as legal entities in the country.

The time is not coming up. 

One point I would like to make is that many of these claims connect citizens and states, and one of the reasons why it is difficult to pursue them is also that there tends to be mistrust between the two actors.

Here ‑‑ if I have ten seconds, I know the time is up ‑‑ I want to stress the importance of regional institutions.  The European Union has shown us how important it can be to have such institutions, both as leverage with big actors and in checking on states and protecting citizens.


>> SHIVANEE THAPA BASNYAT: To Mr. Tewodros Besrat.

>> TEWODROS BESRAT: Thank you very much.

So, quickly, allow me to focus on the data challenges that we face in the region.

First, I would underscore the fact that this is bundled with inequality, which technology reinforces and solidifies along gender and socioeconomic lines.  That said, three points:

Number one, we face risks around online content and taxation in a number of countries.

Number two, excessive shutdowns, as mentioned earlier.

Number three, excessive surveillance and issues around data privacy, where, you know, countries are requiring access to personal data for people to access, you know, the digital world.

Thank you, back to you.

>> SHIVANEE THAPA BASNYAT: Coming over to Mr. Thomas Schneider.  Could you add your insights to this?

>> THOMAS SCHNEIDER: Yes, thank you.

Hello to everybody.

Of course we all know that with data you can do a lot of good things, improve lives of people and citizens economically and socially.

Of course, you can also do bad things with data.  For us, the key question is: who controls the data?  Who controls my data?  Who controls a company's data?  In our view, it should be the people themselves, or a company itself, that to the extent possible control their own data.  They should be able to decide with whom they want to share it and for what purposes it should be used, and they should have a fair share of the wealth that is created with the data.

It should not be the government, nor big companies, that control individual people's or small businesses' data.

In Switzerland, we looked into the factors that create trust in data sharing, and one element, a key element, is trustworthy data spaces.  We're now developing a voluntary code of conduct for everybody that is running or participating in data spaces, so that together these actors decide about the governance of the data space.

We also think we should have an international place, an international forum, to discuss this, so it is good that we have this forum here at the IGF.  Also, in the long run, we need institutions at the global level where we can maybe agree on guidance on data governance in general, because this is getting more and more important.

Thank you.


>> SHIVANEE THAPA BASNYAT: Thank you so much.

I'm sure Mr. Amandeep Singh Gill will have a lot to add to this.


>> AMANDEEP SINGH GILL: A couple of points on data justice.

I think we need to move from the data protection paradigm, which is essential, which is a foundation, to a data empowerment paradigm where citizens have choices over how their data is used and where they can also share in the benefits of the innovation built on those datasets.  That's why the Secretary‑General has called for giving citizens and people choices over their data in the context of the Global Digital Compact.

The second important issue for data justice is addressing data poverty.  There are sections of the world where local communities and citizens are not building their own datasets.  Those datasets are essential for solving their local problems in context, and therefore we need a special effort for everyone to be able to participate in the data economy.

Finally, something that Vint mentioned: we need international collaboration, because data governance approaches are diverging, for example on the notion that some kinds of personal data, health data, financial data, are more important than others.  This is a notion that is well understood in the U.S., for instance, given its common law framework.  It is not a notion that is well understood in Europe, where, you know, regardless of the kind of data, personal data has to be protected at a certain level.  These are subtle differences, but sometimes in operational terms they become important, and we need an international platform where some alignment can be done.

Thank you.

>> SHIVANEE THAPA BASNYAT: Thank you, sir.

Ms. Thoko Miya, Founder, Startup Thoko, can we have your insights, please?

>> THOKO MIYA: Good afternoon.

It's Thoko.

Thank you so much.


So, in terms of digital data governance and protection in Africa, and in South Africa particularly, where I'm from and where I will localize my answer to, I think it is extremely important to note that there are already conventions in place and attempts at policy for governance and at mitigating legislation.  However, appropriate action has not been taken towards implementing any of these.  If we look at the African Union Convention on Cyber Security and Personal Data Protection, or at other instruments such as the South African Cybersecurity Bill, which was later converted to the South African Security Bill, there is still a lack of acknowledgment, a lack of acceptance.

And if we're looking at personal data protection in the context of contemporary politics and economics and what is actually happening in our societies, it is extremely important that we not only have these on paper or in theory, but have them in practice and recognized by the various Member States.

>> SHIVANEE THAPA BASNYAT: Thank you so much to all of the members of the panel that just took their time to answer this question.

It is obvious that we are unanimous that collaboration, and especially these deliberations, should start, and that this will largely help ensure that the malpractices and problems we are facing can be dealt with.

This calls for a comprehensive resolution of the problems.  But given the difficulty in reaching multilateral agreements, which is what the present scenario on the global front is showing, is there a common ground to introduce an international framework or global standard on digital rights?  What would this framework look like, if your answer is affirmative?

Can I head on straight to Mr. Vint Cerf again?

>> VINT CERF: This time I'll take my headset off so I won't be so confused.

I think there is an interesting beginning: why don't we start with the Universal Declaration of Human Rights?  Cast that into the digital universe.  Then, let's try to establish norms for transparency regarding what is collected and how data is used and kept.  Then we can develop technical means so that users can monitor the access and use of their data, we can establish norms for the protection of data, the use of cryptography for example and other access controls, and we can share best practices.

That's a beginning.

>> SHIVANEE THAPA BASNYAT: I'll move on to Her Excellency, Karoline Edstadler.

>> KAROLINE EDSTADLER: Thank you so much.  Very warm welcome.  Good afternoon to everybody from my side!

Well, as I'm a lawyer, and I was working as a criminal judge, I will answer your question with a phrase that is very important for lawyers: it depends on what we're talking about.

I couldn't agree more with Vint Cerf when he says let's start with the Universal Declaration of Human Rights, or the European Convention on Human Rights.  I had the honor to work at the Court for nearly two years, and I would say that there are very established Human Rights; there are good catalogs and institutions that secure them.

I think at least on this panel we have consensus that Human Rights have to be implemented not only offline, in physical life, but also online, in digital life.  That's what we have to do.  There is a big need to come together and to discuss how we can implement these Human Rights online.

I think especially after the pandemic, when we saw that there was a big boost for the digitalization, we saw also the downside of digitalization and it is now up to us to find solutions.

I would like to give you one example: let's talk about the freedom of expression and the freedom of opinion, but on the other hand, also hatred on the Internet.  We have to make sure that we can keep, yeah, the sensitive balance between the different Human Rights, and it is especially sensitive when it comes to the online sphere.

I think we have found a good way in Austria.  We implemented the Communications Platform Act and I will come back to that later because I see time is up.

>> SHIVANEE THAPA BASNYAT: Thank you.  Thank you, Your Excellency.

I can't wait to connect to this conversation the UN Special Rapporteur for Freedom of Opinion and Expression, Irene Khan, who is joining us online.

>> IRENE KHAN: Thank you very much.

Well, of course, all I can do is reiterate what has been said: it is well recognized now that the rights that apply offline apply online.  As such, the international Human Rights framework, in my view, is the common ground on which to build the foundation for digital rights.  Of course, all Member States have signed up to Human Rights principles and instruments, but so have companies, through the voluntary UN Guiding Principles on Business and Human Rights.

The challenge is not about gaps; the challenge is actually about implementation, and that is why it is very important for states to look at smart regulation.  What I mean by that is regulation that promotes innovation and connectivity while at the same time respecting the rights to expression, information, privacy, safety online and, of course, non‑discrimination.

It is a big task, but within the Human Rights framework there are principles of necessity, of proportionality and of legitimate objective, providing the grounds for making restrictions while also upholding Human Rights.  I think that the Human Rights framework is an appropriate framework to take us forward.

>> SHIVANEE THAPA BASNYAT: Would you like to add on to this?

>> VIKTORS MAKAROVS: If I had only the opportunity to repeat what has been said about the indivisibility of Human Rights online and offline, that would be the best option.

The second is to make a few quick points.

First of all, there is an obvious divergence of views on how Human Rights should be implemented in the digital arena.  To answer the question directly, we're probably not heading toward a binding framework, but we should focus on underlying principles.

Second point: we have tools to do that.  First of all, the framework for this aspiration is the Global Digital Compact, and fortunately we now have the UN Tech Envoy, whom we will support in implementing the digital agenda, including rights.

Third point: the E.U. has set a high bar for itself with the E.U. Declaration on Digital Rights and Principles; the citizens of the European Union demand that.  The E.U. will also, I hope, focus externally on outreach, cooperation and engagement, not only to build capacity through the Global Gateway strategy, but also to help shape ethical, safe and inclusive international technology standards for all.

My last, fourth point is that the multistakeholder approach is critical here.  We cannot discuss rights without Civil Society as the main advocate.  We cannot discuss digital rights without the tech companies, who will bear an important share of responsibility in this area.

Thank you.

>> SHIVANEE THAPA BASNYAT: We'll add another perspective on this. 

I want to turn to Her Excellency Monica Mutsvangwa.

>> MONICA MUTSVANGWA: Thank you very much, facilitator.  I want to salute my fellow panelists, and I have been listening carefully.

First of all, let me thank the organizers for inviting us; I am the Minister of Information from Zimbabwe.  Clearly, what has been said here is very important.  As countries in step with the fourth industrial revolution, we have embraced digitalization, and clearly digitalization is the way to go.

The question I was asked was about a framework of global standards for digital rights.  Yes, I agree, there are numerous policy conversations around digital rights through various forums: the ITU, the Internet Governance Forum like here, the African Union and others.  This is because digital rights carry a weight of importance, that of Human Rights in the digital age, and it is important that all governments embrace this.  My government is actually creating that environment to make sure that we are inclusive and bring everybody on board.

The predominant conversation across these and other multilateral platforms is about cybersecurity, artificial intelligence, digital infrastructure, data protection and electronic commerce, just to mention a few.  The digital rights discourse involves a lot of players, as was said: government, infrastructure providers, data subjects and Civil Society, among others.  However, there is a need for broad‑based consensus among those multilateral players to find common points of convergence and to treat the issue of digital rights as a Human Rights issue.


>> SHIVANEE THAPA BASNYAT: I turn to Mr. Benjamin Brake for his response.

>> BENJAMIN BRAKE: Yes.  Thank you very much, Facilitator.

Let me thank the IGF and Ethiopia for hosting this event right here in Ethiopia.  I think it is so important that we have this discussion that we're having here on the panel also including the Global South.  The Global South is becoming more and more important when it comes to questions around data privacy, data access and also to data rights.

So, perhaps to touch on what you just said, the European Union just issued a Declaration on European Digital Rights and Principles.  The focus is on six aspects that may give you a flavor and an idea of what we could talk about on a global scale.  First, they have put people and their rights at the centre of the digital transformation.

Then, they support solidarity and inclusion, ensure freedom of choice online, foster participation in the digital public space, increase safety, security and the empowerment of individuals, and promote the sustainability of the digital future.

While this may be very concrete at the E.U. level, we may look at a bit more of a general approach on the global scale.  For me, I would like to see perhaps three aspects taken into account when we talk about digital rights on a global scale.

First of all, it should be the right to access the Internet at least in certain public institutions such as schools, for example.

Second, the right to a certain transparency when it comes to data usage, and the right to be forgotten and to the deletion of data, very important also from a European perspective.

The last thing, the right to transparency of algorithms.

Thank you.

>> SHIVANEE THAPA BASNYAT: Thank you so, so much.

Now to get the United Nations perspective, I turn to you.

>> AMANDEEP SINGH GILL:  There is no need to reinvent the wheel; we have the Human Rights standards, the core conventions, even in specialized areas such as Human Rights defenders.  So the challenge is to actually apply them.

We should make sure that we are not confused by new terms like digital rights or digital Human Rights; at the end of the day, as the Honorable Minister said, it is Human Rights in the digital age.

Now, when it comes to application, the problem is that traditionally Human Rights obligations have been addressed to states, and through the states to non‑state actors.  Today the private sector plays an important role, so we need to find ways to sharpen the accountability of the private sector for some of the Human Rights violations.  We also need to focus on specific areas.  For instance, there is a vital role that Internet access and digital tools and platforms play in the civic space.  That civic space can be restricted either by government action or by private sector actions.  We need to defend it.

Also, some emerging technologies, like the metaverse, raise new challenges for the application, promotion and protection of Human Rights.  We need to rise to that challenge.  As was said, we need to do this in a multistakeholder manner.


>> SHIVANEE THAPA BASNYAT: Thank you, panel, for these very beautiful statements.

Certainly, every little change that we are living through these days is evolving around the spaces that we work in.  I so much agree that it is the space that is shifting, and if we could redefine or rearrange, you know, the working modalities and the issues of rights that we are talking about vis‑a‑vis our shifting spaces, that should add impetus to this entire argument.

Now let's touch on the other important element of this entire debate we're having: algorithms, which are certainly making us rethink information, data and whatnot.

There have been initiatives promoting the transparency of algorithms, such as trustworthy AI, yet they remain something that the public is largely unaware of.  What can the international community do to promote those initiatives and the recognition of digital rights in the AI domain?

I turn first to Dr. Alison Gillwald.

>> ALISON GILLWALD: I wanted to come back to the digital rights discussion although I know ‑‑

>> SHIVANEE THAPA BASNYAT: Could you kindly hold the mic closer, please.  A little closer.

>> ALISON GILLWALD: I just wanted to come back to the digital rights discussion, specifically the declaration on digital rights that was mentioned.

Just to say, you know, while I think this is an important document, particularly with its focus on managing harms, these are not the only aspects of rights.  Although it is framed as human‑centred and Human Rights focused, in fact the content of the document looks very much at innovation and business, so I think we really need to think about this in terms of returning to that notion of data justice.  I think what we see, when we look at the focus on algorithmic governance, accountability and transparency in terms of ethical design and Human Rights frameworks, is that it doesn't allow us to look at the continued governance of the algorithms, and this relates exactly to the point that was made before about offline rights not actually existing at all for many people.  We expect the online rights when they don't even exist offline, and that includes the rule of law, so the application is very difficult.

What we actually see is the perpetuation of harms: even if you have a legal framework, a Human Rights framework, they are not actually addressing the marginalization, exclusion and underrepresentation of people through these algorithms, which are now increasingly being used for automated decision making and are potentially discriminating against people, marginalized people in particular.

>> SHIVANEE THAPA BASNYAT: Thank you so much.

I am eager to hear Your Excellency Karoline Edstadler's take on this.

>> KAROLINE EDSTADLER: Thank you so much.

Well, I think it is beyond question that AI is playing a very important role in our society, and we still have to see how we can make AI really trustworthy and also compliant with Human Rights.  I think the two keywords in that respect are trust and transparency, as already mentioned.

For the acceptance, and also for the sustainability.  At the same time, we have to be clear: AI can help us to achieve the Agenda 2030, the SDGs, maybe in the field of medicine, maybe in the field of agriculture.  So what do we have to do?  First of all, we have to keep human oversight; that's one of the most important things.  Algorithms should not run away from us or out of our power.

It should be explainable and, again, transparent.  What can we do, that was the question?  I think the first thing we can do is raise awareness, in the form of having discussions like this, and on the other hand set measures in the form of standards, maybe certificates for AI products and for the fields where they are used.  So the challenge is really to implement principles; there are already recommendations, and it is up to us now to implement them, to make AI really trustworthy.

>> SHIVANEE THAPA BASNYAT: Great.  Thank you.  Thank you, Your Excellency.

I wanted to confirm if Mazuba Haanyama is with us.  If she's there, I would like to connect her to this discussion.

>> MAZUBA HAANYAMA: Yes.  I'm here.  Apologies for delayed entry, time zone got the better of me.

My name is Mazuba Haanyama, I lead Human Rights work within Meta in the public policy team.  Great discussion so far.  Much of what I will raise here, it has been touched on by some of the speakers.

Let me just give a little bit of an overview around this question.  At Meta, we really believe that the people who use our products should have transparency and control over how data about them is collected and used.  With this being a core idea for us, we're thinking through a number of things: firstly, being more transparent about when and how our AI systems are making decisions that impact people who use our products; making those decisions more explainable and interpretable, as was mentioned just now; and then informing people of the control they have over how those decisions are made.

Here are some of the tools and resources we have introduced to increase transparency in AI: we have AI system and model documentation, and we're exploring scaling our model and system level documentation through model cards and AI system cards.  We're also looking at how to expand model interpretability, and I can speak more to this as we continue, because I know that our time is limited here.


>> SHIVANEE THAPA BASNYAT: We'll now turn to Mr. Thomas Schneider for your response, please.

>> THOMAS SCHNEIDER: Yes.  Thank you.

When we talk about transparency and explainability of algorithms, we should not forget that some experts who actually program algorithms tell us that they are not able to explain or understand what those algorithms do.

We have a different situation here.  That does not mean that there is no way to create trust, even in something that none of us really understands; it is basically like banking, by the way.

So we need to apply a mix of measures.  One element is technical standards; then we have self‑regulatory guidelines from the industry itself; but we also need regulatory frameworks from regulators and, of course, legal frameworks.

For instance, the E.U. is about to develop a regulatory framework to regulate the market of services; that is one element that may help.  The Council of Europe on the other hand, which is not the same as the E.U. for those who may not know it, is an international organization of 46 Member States and has decided to negotiate a binding Convention on AI, based on the principle of making the link between human rights declarations and conventions and their application.  This process is open not just to European states but to interested countries from all over the world.  I am currently the Chair of this Committee, and we are trying to create something that has a global reach, so that on a global level as many countries as possible may agree on the same principles.  We are trying to establish things like: there should be a human in the loop, at least if you ask for it; and there should be a remedy, so that if a decision is taken by AI there is a way to have recourse.  Even if nobody understands the system, if you think a decision is wrong, you should be able to contest it.  Things like this are part of the process.

By the way, we have a session on this tonight at 5:20 or something like that, an open forum of the Council of Europe.

Thank you.

>> SHIVANEE THAPA BASNYAT: I'm sure it will be great hearing you further later.

Thank you.  Thank you for this great sharing; the emphasis is more on will and determination, of course.

Now we may turn to Mr. Vint Cerf for your remarks on this matter?

>> VINT CERF: Here we go again.

Thank you very much again for the opportunity to address this question.

I'm going to start out by pointing out that I'm not necessarily a machine learning expert, but it never stopped me from having an opinion.

The first thing I want to point out though, is that all software potentially has biases in it, whether it is machine learning, artificial intelligence or more conventional programming.  We should be aware of that.

Second, for machine learning, where the learning is achieved by repetition and exposure to training materials, the source material is critical to the presence or absence of bias.  In order to detect bias, you have to distinguish between unbiased results and results that exhibit bias.  That presupposes that we know what an unbiased model looks like, which may not be so simple to figure out.

Next, we need cases that will test for bias: ways of checking whether a model is manifesting bias.  We need test cases.

Transparency, that is to say explainability: it is really hard if the algorithm is a constellation of variable values in a gigantic multilayer neural network.  When someone asks why it came up with this answer, and the answer is "look at all of the numbers in the neural network, that's why," that is not a very satisfying explanation.

Finally, we wonder whether the bias of an algorithm can be detected purely on the basis of analysis.  It would be desirable, I see my time is up, if we could detect bias by analysis.  But it may turn out that biases are so infrequent that we can't train an algorithm to detect them using conventional methods.  Sorry for going on.

>> SHIVANEE THAPA BASNYAT: Thank you so much.

With this, I turn to Lucio Adriano Ruiz, sir.

>> LUCIO ADRIANO RUIZ: Reviewing the problem from the cultural standpoint, I believe that the question goes much deeper than artificial intelligence.

When we understand everything that comes to us from digital reality as just a technical, instrumental matter, it is impossible to understand everything it means, what the digital culture implies.  This is not merely a technological era; it is a digital era.  If we keep only the instrumental aspects and do not care about, do not understand, what lies behind and beneath it, we miss this.  From my standpoint, a fundamental aspect would be education, so as to understand digital culture.

A user who is knowledgeable understands, demands and produces in a different manner.  If we had the capacity to teach our users what living in the digital culture means, we could demand transparency more naturally, and those who produce could organize AI services in a different manner.

>> SHIVANEE THAPA BASNYAT: We'll go to the next question.

Sir, what are some approaches that empower marginalized groups, in terms of gender, sexuality, race, ethnicity, nationality, et cetera, in defending their digital rights, and where are we right now in this aspiration?

>> AMANDEEP SINGH GILL: We need a new approach that combines the risk-based approach taken in the E.U., for example with regard to AI systems, with a Human Rights based approach.  That way we can better defend the marginalized.  And we need a correct understanding of these technologies: the algorithms, the training data, the decisions they propose to humans, and their impact on the contextual situation.

In the old days, software meant that data combined with code gives you a certain output.  AI is about combining the data with the output, which gives you the code, which is your algorithm, your model.  So we can't just think of the model; we need a life cycle approach to the governance of these technologies.  These technologies also cannot be governed through one single instrument; we have to align international guidance with national and regional regulatory frameworks and with industry practices.  And that has to be done in an agile way.

Finally, we need to be mindful that these systems are applied in specific domains, so the health sector, for instance, has its own very traditional, very classical, very important governance issues, principles about patient privacy, no harm to patients, et cetera.  We need to come up with a nuanced guidance for all of these domains to better protect those who are vulnerable.


>> SHIVANEE THAPA BASNYAT: May I turn to Iginio Gagliardone.


>> IGINIO GAGLIARDONE: Well, my generation, I'm not a digital native, came to know, or was told, that the Internet was the space where we were all going to connect.  Now we know that a lot of the story didn't go in that direction, so much so that we have to ask ourselves how we protect the marginalized.

I see a risk here also.  If we go the normative way, we risk multiplying the claims for representation of smaller and smaller units of diversity.  Here is my suggestion as an academic, something that can enrich this debate: a conversation academics in the Global South have had for decades, the conversation about communality.  Communality rejects the normative; it is interested in the contemporary but also in long-term trajectories, in how a majority becomes a minority and a majority again.  How can we propose new images and think about the space in different ways?  Think about how the colonial powers dealt with communities, how they were separated into minorities in different spaces.  A colleague of mine, for example, has written beautiful words about how borders, not that they didn't exist, had much less importance in precolonial Africa; it was a space of flows.  If we think about it, these ideas are very akin to the original idea of the Internet in the 1990s.  My time is up.  I know academics think their role is not in policy, and they are probably right, and policymakers are too busy to read books, but I think there is a lot of potential there.  It is a mine that needs to be uncovered.

>> SHIVANEE THAPA BASNYAT: Thank you.  Great.

Now, I would want to know Mr. Tewodros Besrat's thoughts on this.

>> TEWODROS BESRAT: Thank you.

Let me pinpoint three or four points.

Number one, to promote digital rights, we have to look at the causes of violations.

Number two, it is for stakeholders to actually respect digital rights.

Three, the private sector should make their policies known to the public and implement them.

Four, Internet freedom advocates face the challenge of laws and practices that interfere with digital rights and digital access.

Thank you.  Back to you.


>> SHIVANEE THAPA BASNYAT: Now I may turn to Her Excellency Karoline Edstadler to gain her thoughts on this.

>> KAROLINE EDSTADLER: Thank you so much.

Well, I think that regarding marginalized groups, the same is true as for the whole Internet: there is a lot of potential in it for them, and there are also a lot of risks.  The Internet can empower marginalized groups, think of communication regardless of ethnic, social or religious background, but there is also the risk that existing social and economic inequalities could be reinforced.  This is what we have to look at when we talk about these groups.

We have to keep a close eye on phenomena like conspiracy theories and fake news.  I was just coming back from Ukraine; I was there to see the situation on the ground.  We have a new dimension in this war, and that is the information war, which is fought on the Internet.  I think we have to be very clear that there are a lot of risks here, and women in particular are facing a lot of violence on the Internet.  Coming back to my first answer: in Austria we implemented the communication platforms act.  It means that social media platforms have to delete hate speech, including anti-Semitic comments or threats, within 24 hours; if it is complicated to assess whether something really violates someone's rights, within up to seven days; but there has to be a deletion.  We also have to have global solutions.  That's why I think we have to come together, communicate with each other and exchange views on that.


>> SHIVANEE THAPA BASNYAT: Mr. Viktors Makarovs, your take on this.

>> VIKTORS MAKAROVS: Just to complement what's been said, I would like to stress again that digital rights are the same universal Human Rights, applied in the digital environment.  In this particular case, the rule of law and freedom of expression have to exist.  What does that mean?  Marginalized groups have to be able to speak up, and there is a positive obligation on the part of states to protect these rights.  It doesn't really help if you, as a marginalized group, are allowed to speak up but then face threats online and offline; these threats migrate from online to offline, so the state has to protect people from marginalized communities.  We need to make their case to society.

Second, you need effective frameworks for combating hate speech, incitement to violence and hateful disinformation online.  I would like to refer to what the E.U. is doing with the Digital Services Act; we're going to speak about this during a panel on Wednesday on tackling disinformation without resorting to censorship.

With these frameworks, there are a couple of things.  First of all, they have to be rights compliant.  That is the absolute precondition for this to work.

Second, you need to educate, not just society but also the law enforcement agencies to apply these frameworks effectively.

Lastly, this also needs resources.  Any agency you create to implement the framework has to have a lot of qualified, well-paid people with the skills to make it work.

Thank you.


>> SHIVANEE THAPA BASNYAT: Lucio Adriano Ruiz.

>> LUCIO ADRIANO RUIZ: Within the digital sphere, I would like to point out four things:

First, respect for human beings, because if human beings as such are not respected, then you cannot have a society.  That is an absolute right that has to be carried over into the digital world.

Number two, respect for ideas, because human beings come first and there has to be mutual respect: we have to respect others and others have to respect us.

Number three, promoting study and education, so that by understanding our differences we can study things systematically and organically, reach consensus and find ways to work out our differences.

Four, education in the family and in society, because we are not born with an innate capacity to understand others and find ways to work things out.  This is why it is necessary to promote education and find solutions in the digital world as well.

>> SHIVANEE THAPA BASNYAT: Irene Khan, your take on this.

>> IRENE KHAN: Thank you.

Let me start by saying that I would like to focus on the word empower.  The question was how we empower marginalized groups to stand up for their rights.  We empower them in the same way that we empower them to claim any of their rights.  It has to begin with education, and by that I mean, in this case, digital literacy; then their voices will be heard.

The second point is representation: promoting civic space, making sure that they are there, for example, in Internet Governance Forum discussions too.  It is not enough just to have women; we need women from particular backgrounds, and LGBTQI people, not just gender equality in the sense of women's equality.

The third point I would make is investment: investment by companies and states to support the needs that have been articulated by these groups.  Not coming in and telling them what they need, but listening to them and addressing their needs.  In many cases those needs are rooted in systemic problems, in structural problems of discrimination against marginalized groups.  I think we have to take a very holistic approach and not see digital rights as something that can be imposed on these people; rather, they are empowered to tell us what the issues are, and we work in a holistic way: governments, the private sector and Civil Society together.


>> SHIVANEE THAPA BASNYAT: Our other panelist is joining us online, Ms. Mazuba Haanyama.

Can we have your thoughts on this?

>> MAZUBA HAANYAMA: Much has been said that is very important; I will pull out a few points that resonated.

I think there is a question, even before this, of a common understanding of why the barriers exist.  That is important.  One of the panelists spoke about coloniality, and understanding its impacts would be very helpful for us in addressing those barriers and thinking about approaches.  Representation remains hugely important: if particular voices aren't in the room when we think about how we develop certain tools, we have a problem.

The issue of inclusivity remains critical, and it is something we're thinking a lot about.  A lot of things have been mentioned that are really important for us at Meta: investment, digital literacy, education, and how we think about encompassing these in ways that make them accessible to a wider range of people.

We also understand that our differences only enrich our ability to do our work.  How do we take these matters seriously in ways that put the many folks who are often at the margins in the centre?

>> SHIVANEE THAPA BASNYAT: Thank you so much for your inputs.

Now let's turn to the next question; people in the ecosystem and those watching the tech ecosystem closely will know this well.

Have we been seeing the emergence of digital colonialism, and is it posing a threat to the Global South?  How can this be prevented, if it is the case?

To answer this, we turn to Mr. Iginio Gagliardone.

>> IGINIO GAGLIARDONE: Digital colonialism is an important term because it accounts for the great imbalances between the Global South and the Global North.  It also challenges the idea that when the digital expands, there is always an opportunity, an expansion.  We saw a recent example: the South African drivers who were given a promise by Uber of moving into a different class and found themselves worse off.  That has been documented.

At the same time, digital colonialism differs from colonialism, which came with violence and guns; the digital comes with opportunity and inclusion, bringing people in.  I don't want to sound like an old Marxist, but the idea of imperialism is a useful complement: it is defined as a process by which a centre of power seeks to expand itself and disregard the periphery.  This gives us an analytical tool to understand certain phenomena better than we have so far.  For example, we heard before the case of the undersea cables, the largest undersea cables, and their cost; this is presented as a great thing.  But if we look at how companies present these developments, we have to remember it is not a gift to Africa.  Africa is in a position of enormous strength these days, where it can compete and shape these developments to expand its own ideas.

This change of narrative is important.

>> SHIVANEE THAPA BASNYAT: Mr. Viktors Makarovs, what are your observations of this?

>> VIKTORS MAKAROVS: Thank you. 

Honestly, I don't necessarily think this is the best framework for addressing the issue.  To me, it looks like this: the big tech companies will exploit everyone if they are allowed to.

In the north, they would do the same.  If we didn't have the GDPR in Europe, they would be using and abusing our data in the same way they do in other places.  You need to regulate.

In the same way, they would completely ignore our pleas to do something about disinformation, which is still rampant; but at least in some countries in the E.U. there is a framework that puts an obligation on them.

The way forward is actually to create regulatory frameworks and to learn from each other; we are all learning.  These issues do not exist only in Europe.  We have to learn from each other, support each other, and see where there are synergies, to actually make our voices heard.

I understand how difficult it can be to do something about this if you come from a relatively small country.  So come together, create a framework that is understandable, and learn from each other.

Thank you.

>> SHIVANEE THAPA BASNYAT: Thank you, sir.

Your Excellency Monica Mutsvangwa, please.

>> MONICA MUTSVANGWA: Thank you very much again.

I want to say that we all agree, and we have listened to the experts here, that digital power sustains the lifeblood of basic human needs: education, health, finance, management systems.  We are therefore here to learn from each other.  This Governance Forum really should help us establish digital solidarity, especially with the inclusion of the Global South.

Coming to what you said, the Global South has been playing catch-up when it comes to standards and principles relating to digital rights, digital platforms and digital technologies.  The Global South has been a standards taker; however, that culture of being standards takers rather than standards makers should motivate the Global South to seek alternative strategies in the area of Digital Transformation.  This is noted in the way countries like China, India and others have taken a leading role in this respect.

I can go on.  China, an emerging Global South champion, has been a leading manufacturer of competitive ICT products, and we have seen that as a country, in Zimbabwe; China is becoming a dependable partner for our nation on the ICT development front.

Then there is also India, which has demonstrated to the rest of the world that it is possible for a Global South nation to run an efficient eBanking system and to manage the migration from analogue to digital, which we are also doing, by the way.  In digitalization, Iran has done the same.

So we are here at this annual Internet Governance Forum, and we will share that information.  There is certainly room for South to South cooperation in the area of digital cooperation.  These innovations are relevant to digital transformation, and we will continue working with everybody.

>> SHIVANEE THAPA BASNYAT: Thank you, Your Excellency.  Thank you.

With this, I turn to Thoko Miya for a quick observation on this.

>> THOKO MIYA: Thank you for that.

In terms of digital colonialism, it is important to note that ‑‑

>> SHIVANEE THAPA BASNYAT: Could you hold the mic closer.

>> THOKO MIYA: The Global South faces different political and economic conditions and constraints than the Global North.  It is extremely important to note that in the Global South, the type of development, the type of innovation that is required is often not what is supplied.  This is actually leading to a vortex, a black hole of information.  What is happening with the crisis mentioned earlier is that Uber comes along; and it is not just Uber, it is Netflix and every other tech unicorn leveraging the digital growth happening right now and introducing systems that are not appropriate to the specific location, to the context of that country, whether national or regional.

So it really is important to note that digital colonialism is happening at both ends.  It is happening at the level of technological skills development and infrastructure, of managing what technology is and this rise in innovation, this rise in the digital; as well as on the ground, in terms of how we now foster and grow a sustainable society based on the data that we are getting.  How do we keep ourselves safe?  How do we engage in this process appropriately?

>> SHIVANEE THAPA BASNYAT: Thank you so much.

>> ALISON GILLWALD: Thank you very much.

It is true that the increasing generation and analysis of data in society is resulting in an uneven distribution of opportunities and harms, following historical patterns of social and geographical inequality both within and between countries, as described by data colonialism.

Such harms include the appropriation of valuable data from communities, as well as the marginalization, misrepresentation and even erasure of communities through data-driven systems.

They also deny economic opportunities, reflecting the structural inequality that is the legacy of colonialism and continues despite the legal end of colonialism.

So the implications for policy and regulation are concerning.  The concept of data colonialism, while it accurately describes this destructive form of governance, does not address the underlying systemic issues referred to in some of the literature as neo-colonialism or decoloniality.  The counterpoint is this: historical colonialism came to an end through the legal changes referred to here, but this left intact the coloniality that underpins it and leads to these inequalities.  Addressing colonialism can take a number of forms, and many of the applications to governance described in the data colonialism literature do not deal with decoloniality or with addressing these fundamental structural inequalities.  That is what is important: reflecting beyond first generation rights to the second and third generation rights that would ensure more equitable access to data, not just the legal application of first generation rights.


>> SHIVANEE THAPA BASNYAT: With this, I turn to Ms. Mazuba Haanyama.

>> MAZUBA HAANYAMA: Thank you.

I think digital colonialism is certainly something for us to contend with and think about, particularly in the digital and tech space, as well as the many intersecting layers of power it involves.  We know that communities in the Global South are vulnerable under some of these conditions, and how we think about these issues is important.

In terms of prevention, it is probably important first to understand how the emergence of digital colonialism affects the communities we serve: in what ways are communities experiencing it, how is it showing up, how are they impacted?  Answering these questions can help us work out how to address it.  They are all really important questions to ask.

With a better understanding of what this looks like, I think we stand a better chance of developing strategies to combat what we may be seeing, or the ways in which people's speech may be limited or impacted.

Thank you.


>> SHIVANEE THAPA BASNYAT: Mr. Benjamin Brake, what's your take on this?

>> BENJAMIN BRAKE: Thank you very much.


I think digitalization in general actually has the potential to bridge colonialism, or to bridge the digital divide.  Yet when we look at the south, far fewer people actually have access to the Internet and the opportunities attached to it.  However, in the north we also have problems with the Internet, because we are dealing with legacy structures and policies.

I had the pleasure of visiting India a couple of weeks ago, for example, and I am happy that you mentioned India.  They have embraced digitalization as a concept; they are not looking so much at the threats.  They have the market and they are going to scale, especially with financial solutions.  On the other hand, since we just talked about digital rights, the challenge in India, when you talk about data privacy for example, is that they know there may still be a way to go.  What may be the solution to bridge the digital divide?  That can only come through dialogue.

And that is the reason the German government is currently implementing digital dialogues: we would like to think about these issues together.  Regulation on the one hand, where I think the European Union is pretty good; and innovation on the other, where the Global South is probably a bit better than the European Union.


>> SHIVANEE THAPA BASNYAT: I would like to come to you again with my next question.

Does a lack of government capacity contribute to Internet shutdowns during conflict, and how can capacity gaps be bridged?  My question begins with Mr. Benjamin Brake again.


>> BENJAMIN BRAKE: Internet shutdowns are severe.  Dealing with them should not be left up to governments alone.  We are a multistakeholder forum, and it should be up to multiple stakeholders how to run the Internet and how to ensure redundancy, so that shutdowns do not happen, and so that if they do, they are dealt with by the community.  Especially against the background of a very gloomy international situation, I do not think governments should be solely in charge of running the Internet, of deciding whether there should be a shutdown, or of deciding where resources should be allocated.


>> SHIVANEE THAPA BASNYAT: I now turn to Her Excellency Monica Mutsvangwa with this question.

>> MONICA MUTSVANGWA: I thank you very much. 

I totally agree with my fellow panelists.  There is a need to maintain connectivity and free access at all times, balancing the right of the population to digital access with the right of the population to peace.  Shutdowns during conflicts can be an attempt by governments across the world to prevent the use of digital platforms and social media to spread propaganda and fake news, which may result in more bloodshed and loss of life, even genocide.  Of course, most governments in the South need both institutional and human capacity to deal with digital platforms in times of conflict.  And even where policies, laws and regulations are in place, shutdowns could still happen.

We have also seen that digital rights must not culminate in the violation of other people's rights.  Like any other rights, digital rights must not be abused to the point of limiting other citizens' enjoyment of protection from the state; the enjoyment of a particular right should not be a strain on the rights of other individuals.  In summary, the function of any state is to protect its citizens from unjustified acts of violence.  The state has the prerogative to guarantee maximum security to people and private property, in whatever forms threats may take.

Thank you.


>> SHIVANEE THAPA BASNYAT: Thoko Miya, eager to hear your say on this.

>> THOKO MIYA: Just on the topic of Internet shutdowns: as a global trend they carry not only legal implications, but real sacrifices on the ground for those affected.

I totally agree that it is up to us; it is our responsibility as policymakers, advocates, activists, and the plethora of other represented bodies to take a stand and to speak up and speak out against Internet shutdowns, to ensure greater knowledge, access and connectivity itself.  Connectivity should be a human right, one that goes with responsibilities.

It is critical that we are able to engage on Internet shutdowns at this point, particularly within the African context, where shutdowns continue to take place consistently in conflict situations.  It is all the more pertinent as we have this conversation now, with Ethiopia having signed a peace treaty three weeks ago in South Africa, that we look forward to connectivity for all.


>> We need to understand the reasons behind a shutdown.

When we talk about a shutdown, is it a technical problem, a resource problem, or a government trying to stop something really bad, such as incitement to violence, with no other option in that moment than to shut down the Internet?  Or is the shutdown or slowdown happening because the government wants to silence people criticizing it?

If it is the latter, then this is basically unacceptable. 

If it is one of the first two reasons, then I think there are better ways than shutting down the Internet to find solutions.

With regard to combating illegal content like incitement to violence, we need a regulatory system that is able to draw the line between freedom of expression and illegal content.  As many countries have had for traditional media for decades, we have to extend this to social media and other platforms in a way that makes clear what their responsibility is, and what the responsibility is of the government, the legal system, the citizens and the people themselves.

We would want neither the government nor a big tech company to tell us what is true and untrue.  This is something a society needs to peacefully discuss, agreeing on its vision of reality; not a government or a private company.

Thank you.

>> SHIVANEE THAPA BASNYAT: A very good perspective.  I turn to you for your observations.

>> Thank you very much once again.  Following what was said earlier, I think it is important to understand the reasons for shutting down services.  If you look at the research, these have included mass demonstrations, military coups and operations, other regional issues, and so on.

For me, the question is more about the motive for shutting down services than about capacity, though I am cognizant of the fact that capacity may be an issue somewhere.

Thank you.  Back to you.

>> SHIVANEE THAPA BASNYAT: Irene Khan, what would you add to this?

>> IRENE KHAN: Thank you very much.

Yes, I also don't see this as an issue of lack of technical capacity.  More than half of the LDCs have actually shut down the Internet or slowed it down at one point or another; clearly they have the capacity to do that.  The question really is: do they have the legal capacity, the legal authority, to do it?

Shutting down the Internet at any time has dramatic consequences for people's lives; during conflicts it can be absolutely disastrous, because that is when people need access to verifiable information.  Governments usually claim a shutdown is for reasons of national security or to control hate speech, false information, disinformation or misinformation.  But in reality, and there is plenty of research on this, disinformation and misinformation cannot be tackled by shutting down the Internet.  On the contrary, they seem to spike when the Internet is shut down, and so do Human Rights violations, which increase when there are no journalists to report; journalists very often need the Internet to report.  So the risks of shutting down the Internet are very high, and the benefits very limited.

That is why UN Human Rights bodies have been very consistent in saying that Internet bans are unlawful under international law and are a disproportionate response to most situations, as are even Internet slowdowns.  There are many other ways of dealing with this; governments should refrain from imposing shutdowns and should maximize access, because at the end of the day Internet shutdowns actually widen the digital divide that we are all committed to closing.

>> CHAIR: Thank you.

May I now turn to Lucio Adriano Ruiz for his observation on this.

>> LUCIO ADRIANO RUIZ: I would like to share my opinion from the cultural point of view, going back to this idea of using what we get from the digital world as a tool.  When we talk about digital culture, and here I am speaking to policymakers and leaders in all areas, if we don't think about it from a cultural point of view, we will misuse our crisis management tools, because when there is a crisis we use what we have among our social resources.  If we are not aware of the digital resources among the cultural tools we may use, we will not manage the crisis properly.  Training and education for managing society are what make it possible for those who have to manage a crisis to govern the process.

As I have a few more seconds, let me issue a call for action: from a multistakeholder point of view, all the members of society may build together and create a new reality around the use of digital media.

>> SHIVANEE THAPA BASNYAT: (Poor audio quality).

We will now go to Mr. Volker Turk, UN High Commissioner for Human Rights, for his video message.

>> VOLKER TURK: Human Rights allow us to express our humanity.  Information and communication technologies have radically expanded our capacity to connect with one another.  Yet we see cracks and fractures in this system.  At 22%, the level of Internet use in low‑income countries remains far below that of high‑income countries, which are approaching universal use at some 91%.

The digital divide across gender and geographic lines hinders the full potential of human connection.  2.9 billion people are offline, left behind, excluded from access to vital information, global discussions and economic opportunities, unseen and unheard.

Others are pulled deeper into darkness by disruptions and shutdowns of Internet and phone access and by the blocking of social media and messaging services.  Between 2016 and 2021, the #KeepItOn coalition reported 931 Internet shutdowns in 74 countries.  These measures are not only an affront to Human Rights; with their disruptions to health systems, commerce and education, they also stunt development, and they occur most often during protests or elections, when people's ability to connect and be heard is all the more crucial.

Good governance depends on inclusive, meaningful participation.  Yet the ability of people's voices to be heard is threatened on many fronts in the digital space: by regulation that criminalizes critical speech by labeling it as disinformation or harmful; by spyware turning a phone into a surveillance device, invading the privacy not only of the target but of family and friends; and by online campaigns that incite threats, violence and intimidation.

As states regulate the online space, embedding Human Rights guardrails will ensure that the digital space is open, free, safe and inclusive.

Social media companies currently overwhelmingly from the Global North need to invest sufficient resources to operate safely in every location in which they do business, expanding their language capabilities and stepping up their understanding and engagement with all the communities they serve.

We need to encourage and facilitate community‑driven approaches.  We need to empower people to design their own digital world, support local tech ecosystems and promote transparency of company and government practices.  This is a vital ingredient for a meaningful conversation about our global digital commons.  We need, indeed, a common global commitment to build an Internet where trust is not undermined by disinformation, where threats and harassment have no place, and where people can express themselves and build their communities free from fear and repression.

The development of a Global Digital Compact, proposed by the UN Secretary‑General and hopefully to be agreed in 2024, gives us all the opportunity to work together towards that goal.

In Human Rights we have a common unifying language, to have these important and challenging conversations and to ensure the protection of all of our fundamental rights in the digital space.

Distinguished participants, the amazing value of the Internet lies in its ability to deliver on the promise that was already envisaged by Article 19 of the Universal Declaration of Human Rights almost 75 years ago, that everyone has the right to seek, receive and to impart information and ideas through any media and regardless of frontiers.  We have an opportunity to get it right and to reconnect the human community.  Let's use it.

>> SHIVANEE THAPA BASNYAT: From the entire panel and the members of the audience here, we extend our gratitude to Mr. Volker Turk, the UN High Commissioner for Human Rights, for adding his thoughts on the issue that we have brought up today.

Ladies and gentlemen, the panel sought to reflect on how a basic framework could be established, given the diversity of governance structures and individual identities, to address this very pertinent issue of digital rights.

We are now heading towards the end of the conversation.  If you thought 90 seconds was rapid, I now have the privilege of granting each of our panelists 30 seconds to make their final statements.

>> TEWODROS BESRAT: I would like to use the time I saved on earlier questions to amplify some of the work we do at AfricaNenda on the digital rights agenda in Africa.  We are doing this in a number of ways: we advocate for universal, equal access to the Internet and digital rights, and we deploy and develop tools with privacy and data security in mind.  We have a report, the State of Inclusive Payment Systems in Africa, a collaborative work with the UN and the World Bank, which is very insightful and rich.  Visit us on our website, AfricaNenda.org.


>> IGINIO GAGLIARONE: The Global South is in a position of strength and confidence, even if it is a complex and vague term; we saw it at COP27, not just in the digital space.  I see obstacles along the way; there are claims of an increasing digital Cold War, and we have to be aware of it, but we also have to push back against it.  We have to realize that innovation can be a combination of ideas and materialities from the South, the East and the West, rather than accept a simplification that will get in the way of a more inclusive digital space.


>> THOKO MIYA: The openness of the Internet Governance Forum is so important for all participants of the Internet and those who will engage with it.  It is so important, firstly, that this forum remains multistakeholder, open and transparent; that is the only way to move towards greater digital cooperation, greater interoperability and an accessible, affordable Internet that connects all, and to protect the rights of users while they are on these platforms.  There are, of course, dangers that come up along the way, and it is important that we not only note them but also provide the skills and the resources so that all people can better use the Internet.


>> SHIVANEE THAPA BASNYAT: May I request your closing remarks; 30 seconds is your time limit.

>> MAZUBA HAANYAMA: Thank you so much.

Mine is really just to express huge appreciation for being here with the other panelists, and perhaps to reconfirm Meta's commitment to AI transparency, to working with marginalized groups, and to inclusion and diversity, as some have said.  I think this conversation is super important.  For as long as you'll have us, we'll always be here.

Thank you.


>> SHIVANEE THAPA BASNYAT: Ms. Irene Khan.

>> IRENE KHAN: Thank you.

I think there is a lot on which we agree here.  This is a fantastic multistakeholder forum to agree and move forward.  I think it is extremely important that we also measure concrete progress, and that this is not only a nice place to talk about what we agree on but a place where we solve the problems.  Digital technology will be central to our future.


>> BENJAMIN BRAKE: We'll keep it short. 

We can only do it together, as a community of stakeholders; the Global North and the Global South for once should join forces to protect the freedom of the Internet and digital rights, because it is not the North against the South; it is openness and democracy against authoritarianism and cruel regimes.  That should be the frontline.

Thank you.


>> SHIVANEE THAPA BASNYAT: I turn to you.

>> ALISON GILLWALD: The challenge is to move beyond first‑generation rights.  We have to move towards looking at the governance of these issues in terms of social justice and economic justice.  I think we absolutely have to do this at the level of global governance, and the way to do it is to treat these global digital public goods as such and to ensure their realization at the national level through global coordination and global frameworks, so that the currently extremely uneven distribution of both harms and opportunities is addressed in the delivery of services and of equality in the digital realm for all people.

Thank you.

>> SHIVANEE THAPA BASNYAT: Thank you.  Thank you.

Over to Mr. Viktors Makarovs.

>> VIKTORS MAKAROVS: Human Rights are universal across all regions of the world.  It will help our agency in implementing Human Rights in the digital realm if we work together as one global community, not as North and South, and if we do it in a truly multistakeholder way.

Thank you very much, Ethiopia, for hosting the IGF.

>> SHIVANEE THAPA BASNYAT: Thank you for the great words.

May I now turn to Your Excellency.

>> MONICA MUTSVANGWA: Thank you.  Again, I would like to thank the organizers and thank you, Ethiopia, for organizing this panel.

Government, together with multistakeholders, should continue to create an enabling environment for digital rights to thrive.  It is important to put in place a legislative framework for these digital rights.  Access to the Internet and its use by all is very important, robust ICT policies are very important, and so is relevant cybercrime prevention legislation to make sure we protect all.

That's what I think I got from this.


>> SHIVANEE THAPA BASNYAT: Vint Cerf now.

>> VINT CERF: Two points.  First of all, please keep in mind that we are now operating at a scale never before accomplished: five billion people using the net and another three billion yet to come online.  Operating at that scale is a huge challenge to achieving the objectives we have been talking about.

Second, let's keep in mind that personal, institutional, national responsibility and accountability are vital to preserving Human Rights online.

>> SHIVANEE THAPA BASNYAT: Thank you for the great thought. 

Moving to Karoline Edstadler.

>> KAROLINE EDSTADLER: Four points from my side, first of all, whenever we're talking and finding solutions regarding challenges ahead of us, Human Rights have to be the guiding star.

Secondly, it is crucial to quickly come to a common understanding of how we can translate international law to new technologies.  Thirdly, I think there is a big chance for change in the Internet Governance Forum and in the high‑level panel, which I have the honour to be part of under your chairmanship, Vint, and we should do it now; the time is now.  And lastly, fourth, this is about nothing less than what kind of world we want to live in.  Let's join our forces and use our international power to do it.

>> SHIVANEE THAPA BASNYAT: That's a great call, Your Excellency.

Turning to Mr. Amandeep Singh Gill.

>> AMANDEEP SINGH GILL: The Internet is the closest thing we have today to a global consciousness.  Despite all of the wars and the problems, I think we have to continue to believe in that and uphold it.

Whenever things get murky or fuzzy, I think the Human Rights standards are a great start to guide our way forward, just to echo what Thomas said a little while back.  We have to see all of the problems in context and work with mutual respect and mutual accommodation to take digital cooperation forward.

>> SHIVANEE THAPA BASNYAT: Couldn't have said it better, sir. 

I will turn to Lucio Adriano Ruiz.

>> LUCIO ADRIANO RUIZ: From the point of view of the digital divide, we should think of respecting the integrity of each human being; in a digital era, everybody is involved.  It is important to take into account that digital rights should be inclusive; they should be Human Rights in a digital age.

>> Our thoughts are similar.  Whether in an analogue or a digital world, in the end, if we want to live together peacefully, with solidarity and with fair rather than unfair competition, we have to sit down, all of us, to discuss and agree on shared values and on the rules for how we treat each other in order to implement those values.  I am very happy that the IGF is one of the most important fora for the digital world, where voices can be heard that are normally not heard elsewhere, and we are happy that this IGF is able to feed into the Global Digital Compact, an important UN document that will hopefully reflect what has been said here.

Thank you.

>> SHIVANEE THAPA BASNYAT: We couldn't have ended on a better note than this.

Indeed, thank you to the distinguished panel. 

The issue that we are thinking about is more than our disparity, more than difference; it is about our responsibilities, our accountability and respecting the integrity of the human race, about us, about each other.

As the panel has agreed, Human Rights should be leveraged for the present purpose, and this is the right time for interventions.  We have come a long way and have learned many lessons from natural and other disasters along the way.  We can foresee so much of the future before us.  It is time to come together and make the right interventions in access.  Our future will be based on the values we contribute and on how we weave them into the foundations.  It is for the present generation to decide; let us all make a significant and positive contribution to this impact.

With this call, which I believe is unanimous for the panel, I thank the entire IGF 2022 forum, the Government of Ethiopia, the UN in Geneva and all the partners who put up this very important forum.  Let me also take a moment to thank each and every panelist here who has taken the time to share their valued insights and experiences with us, and especially for doing it so well.  Trust me, 90 seconds is not a long time!  Thank you for your great contribution.

And to the members of the audience, thank you for your valued time.

With this, I rest my microphone and close this session here.