The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> EDEN TADESSE: Good afternoon to you all. Thank you to those of you joining us in person and online. Welcome, and thank you for coming to our session, "Jointly Tackling Disinformation and Promoting Human Rights," sponsored by the AU‑EU D4D Hub. It's an honor and a privilege to be here with you all today. This topic is near and dear to my heart, and I'm looking forward to the discussion. I would like to look at multistakeholder engagement, bringing together civil society, the private sector, the public sector, governments, and academia to contribute meaningfully to and influence digital policies.
Also very fitting with the spirit of IGF, I would say.
While disinformation is very much a global issue, this discussion will focus on Africa‑Europe perspectives and cooperation, which is really at the heart of the scope of the AU‑EU D4D Hub's work. Now, I would briefly like to turn your attention to Slido, a live quiz app that we will be using to interact with you sporadically throughout the session.
You can join Slido by scanning the QR code that is plastered around the room. And for the online audience, you can join by clicking the link. Please do take part, whether you are joining us in person or virtually, and our team is also here to assist you if you have any questions.
So the first question I would like to turn your attention to on Slido is: Which country are you from? And I assume being at the IGF that we will have a nice variety of responses.
As we navigate into the Fourth Industrial Revolution, we must keep our eyes and minds open to the risks of digitalization, in particular disinformation, which needs to be addressed at the local, national, and transcontinental levels. But first, what is disinformation? Simply put, disinformation is information intended to mislead people.
With the prevalence of the Internet across the world, and subsequently the rise of social media platforms, disinformation has become increasingly weaponized against populations by causing division, chaos, hatred, and uncertainty.
And it's also causing a lot of damage politically, economically and societally, but we'll get into that a bit later.
All right, yes, as suspected, we see a nice diversity of countries. This is a live version, and we are getting more and more countries in the word cloud. That's great.
I would like to turn your attention back to our favorite tool, Slido. For those of you who are just joining us in the room, we are interacting using a live quiz app called Slido and you are able to participate by scanning the QR code that's plastered around the room.
And so we will be asking a few questions sporadically throughout the session. So please do take part in that.
So again on Slido, I would like to introduce you to the second question, which is, which sector do you represent? So please input your responses on the app. You can choose between public sector, private sector, CSO, academia, or other. All right. That's great. We are getting some responses. Private sector and international cooperation. That's great.
Also seeing people from civil society. Private sector, academia. So very well represented from many groups.
So in this discussion, we hope to gather a set of practical recommendations on how to best tackle disinformation, while safeguarding human rights and this will be gathered from inputs by you, the audience and also our speakers. Now, we turn to our esteemed speakers, Simone Toussi, Charlotte Carnehl, and Odanga Madung. I would like you to keep your responses to 3 minutes.
So Simone, starting with you, you are a project officer at the Collaboration On International ICT Policy for East and Southern Africa, or in short, CIPESA. You work and focus on research, community engagement, and advocacy on African digital policies, and you look at democratic processes such as privacy laws, surveillance regulations, and government mechanisms to tackle disinformation in Africa. So Simone, why have you focused on multistakeholder collaboration in your research, and what do stakeholders need to collaborate better?
>> SIMONE TOUSSI: Okay. Thank you. Thank you, Eden. Thank you, everyone.
So why have we focused on multistakeholder collaboration in our research? I will start with that question. Well, the focus on multistakeholder collaboration was mainly because we found that disinformation is a multifaceted phenomenon that implicates all the stakeholders and affects democracy and human rights, so there are binding stakes for all these stakeholders. And when we say multistakeholder, since we have not defined it yet, we mean governments and intergovernmental organizations. We mean civil society. We mean the private sector, media, journalists, the population, and academia as well. Through the report, we found that disinformation manifests in many ways. It can be deepfakes. It can be identity theft. It can take many other forms. And it's perpetrated by a diversity of actors. We have governments, as I cited; sometimes it is just political actors carrying out disinformation campaigns to either gain power or keep power. And we may have influencers, to cite some examples, who carry out disinformation campaigns, maybe to help political actors. Sometimes they are paid to do that, and sometimes it's for their private interests, just to widen their audience.
When people receive false news and share it without checking whether it's false, we talk about misinformation. They display it on social networks or on messaging applications. We also found that disinformation spreads through social media, mainstream media, and these messaging applications that I was talking about. The most important part is that the measures stakeholders use to tackle disinformation are not always effective; they are either inaccurate or ineffective, because when the government, for example, tries to tackle this as an individual entity, it will miss some aspects of the manifestations, aspects that could be covered by a civil society organization or a journalist.
That means the measure put in place will not be effective. The same applies when civil society takes a measure on its own, or when private sector companies take measures: it doesn't always work well, because each of them is missing some aspects.
In order to make it work and ensure it works well, we found as a solution that those actors should come together, collaborate, and put together their competencies and their expertise to jointly act against disinformation.
>> EDEN TADESSE: Thank you so much. I think you touched well on the disinformation techniques and also on the actors responsible for perpetrating disinformation, particularly online.
So now we turn to our second speaker, Charlotte. Charlotte, you work as operations director at Lie Detectors, an EU‑wide journalist‑led organization. Your main objective is to empower school kids and their teachers in Europe to act as powerful lie detectors and critical thinkers, which is much needed in a world facing increasing propaganda and distorted facts online. You do this by providing schools with fact‑checked online content, helping students and teachers to understand news media and make informed choices.
At the same time, Charlotte, you advocate for media and information literacy with European policymakers. Charlotte, why did Lie Detectors decide to focus on the media literacy of school kids, and what have you learned from your experience in the classrooms, but also from your advocacy work?
>> CHARLOTTE CARNEHL: Thank you so much for the question. I'm thankful to be part of this discussion and to at least virtually be with all of you in Addis.
And before I address your question, I want to give a brief shout‑out to Africa Check, because when Lie Detectors was founded almost six years ago, they were one of the first organizations that we spoke to. Already at that time, Africa Check was not only doing fact checking but also sending journalists into schools, which is a large part of the work that we do at Lie Detectors.
So Lie Detectors is an independent media literacy organization and we work to counter the corrosive effects of disinformation and polarization on democracy.
We are active in five European countries. We currently work with more than 250 journalists, and by the end of this year, we expect to have visited more than 1,300 classrooms. We fulfill our mission, as you said, Eden, by empowering school kids and their teachers to tell fact from fake online, to understand how professional journalism works, and to apply basic journalistic skills. And we do that very practically, by enabling journalists to go into schools and create moments of honest and authentic exchange.
Why do we focus on school kids? Because we are convinced that fact checking alone won't do the job. We think that everybody needs the skills to assess and think critically about information. And young kids, kids in school, are actually a high‑risk group for disinformation and misinformation, because they are targeted on channels that can't be monitored, and they are largely navigating them by themselves.
So without their teachers or even their parents present. Of course, in terms of logistical possibilities, schools are a place where we can reach these kids. It's not only the kids who benefit from it, but also the journalists, because they learn a lot about how this generation uses and consumes the news. And then in addition to that, we also more and more train teachers directly, because we know that they really are the key allies to ensure that media literacy education is embedded in day‑to‑day teaching.
On your second question about what we learned, I think I have, you know, a message of empowerment, because we have really learned that if we approach digital literacy in a way that's age appropriate, keeps the teachers safe, and meets the children where they are, we can get a lot done. Teachers are even more motivated since COVID, but they often tell us they don't have time to teach media literacy, that there's too little recognition for their efforts, or that they themselves don't really possess the skills to teach the subject.
I have to say that I can sympathize; it can be a daunting task for teachers to discuss media literacy, because the kids inhabit a different world than their teachers do.
In terms of advocacy, we continue to evaluate the feedback and the findings from our work to feed these results to policymakers almost in real time. And we are seeing some changes, as the OECD and the European Commission are increasingly backing the idea that journalists have very important skills to pass on.
And also there's a growing desire to see a greater number of financially independent organizations acting in this space so that the reliance on media platforms to fund them can be reduced.
>> EDEN TADESSE: Thank you so much, Charlotte. You touched upon media literacy and on focusing on children, who are a high‑risk group in encountering disinformation. Now I would like to turn your attention to Slido again.
So we have a third question that we would like to bring to your attention, and the question is: in one word, how does disinformation concern you?
So again, for those of you who are just joining us, either online or in person, we have a live quiz app that we are using to participate with you, the audience. So please take a few seconds to add your responses there.
So Odanga, you are a cofounder of Odipa Dev. You have investigated closely the role of social media companies in spreading disinformation in Africa, especially in the case of the Kenyan elections that occurred this summer.
So what is the role of private sector actors in tackling disinformation, and what are some of the good practices of collaboration between social media companies and other stakeholders?
>> ODANGA MADUNG: Thank you very much. My name is Odanga and I'm currently a fellow at the Mozilla Foundation.
I think the role of private sector players, whenever they are not profiting from the disinformation that is spread on their platforms, is essentially to try to tackle it as much as possible and, you know, keep it from spreading as far as they currently let it. They have had a few successes in terms of, you know, establishing the right kind of partnerships, such as partnering with civil society and with fact checkers, and instilling policies that essentially allow them to detect disinformation within their platforms. But at the same time, I would say we should be pleased with some of their efforts but not necessarily satisfied with them. And this is mostly because, like I mentioned, what has been developed by social media companies and social media platforms is not only an engine of radicalization, as we have seen with the great replacement theory. What we have is a perverted business model that relies on surveillance advertising and does not respect the right to privacy for a lot of people.
And therefore, it is in essence metastasizing the disinformation that's being pushed by malicious actors onto these platforms, and it is not necessarily protecting the users of these platforms from the harmful effects that disinformation can cause. So there is a very, very clear amount of platform accountability work that needs to be done to call these private sector players out, and to ensure that we are actually able to achieve some form of information justice.
Because I think, just as the previous speaker mentioned, fact checking alone is not enough. And even in some cases, media literacy alone is not enough. We need to hold these platforms to account. A lot of the problem with misinformation and disinformation is not that it's published, right? A big problem, especially in this Internet age, is that it is algorithmically amplified. And because of the algorithmic amplification by these companies within their platforms, they also end up profiting from misinformation and disinformation.
This is a very serious problem! People die from these things. So what do we do about that? How do we get some compensation from the companies in the cases where there's harm? These are trillion‑dollar companies that are failing to keep their users safe. This is an extractive model that takes data from marginalized populations across the world, and in return they make a ton of money from it. So, you know, my main focus, in a lot of the investigations that I do, is on trying to understand what exactly these failures are, and what we can go ahead and do about them.
>> EDEN TADESSE: Thank you so much, Odanga. I think mentioning the role of social media in perpetuating or worsening disinformation is a critical element of our discussion. And also you touched upon accountability which is really important for the social media giants in protecting all of their users from disinformation. Great.
Now I'm looking at the responses from the audience on our previous question, and I think it visibly reflects the diversity of our audience, both online and in person, and that's really essential to our conversation. So thank you for taking part in that.
Simone, coming back to you: in the recent report, "Disinformation Pathways and Effects on Democracy and Human Rights in Africa," you compared the effects of disinformation in five different African countries: Ethiopia, Cameroon, Kenya, Nigeria, and Uganda. What are the most effective measures to tackle disinformation that don't affect human rights, and what can we learn from these experiences?
>> SIMONE TOUSSI: Okay. I think it is important to underline that these countries were chosen as a sample for regional representativeness. So we tried to have one country per region in Africa, even if North Africa for now is not represented.
The other criterion was that I was looking for countries where we have had elections or high political protests in the last five years, because that was the focus of the report: disinformation during elections or periods of high political protest. So we looked at those countries and at the disinformation carried out by the actors there.
It's very, very difficult to speak about a measure to tackle disinformation that does not affect human rights. For now, I think we don't have one. But we have some measures that could, for now, go without affecting human rights, even though they are still not effective, due to some aspects that I'm going to underline. One of those measures is fact checking, which was mentioned earlier. Fact checking is a potentially effective measure in the sense that you can easily and clearly show how news or information is false, through analysis of sources and many other things. But fact checking faces other challenges, such as lack of access to information. Most of the governments of the countries that we studied are very reluctant to publish public information. So access to information is not a common good in these countries, and in Cameroon, for example, we don't have a law on access to information.
We don't have a well‑established mechanism to ensure that the public can access information, even information that is public, you know, government information. Odanga spoke about the accountability of the platforms, but we can also talk about governments and governmental agencies. We don't have that information; we don't have a platform where we can access that information in these countries. Nigeria is working on that, and in Cameroon you have local initiatives by CSOs, civil society organizations, to ensure that we have open municipalities. But still, we don't have that established mechanism to allow access to information. So we have fact checking, and there is also the measure that governments generally use to address disinformation, which is raising awareness. And here, again, we are talking about multistakeholder collaboration.
The government tried to gather multiple actors, like private companies such as Facebook, civil society, and journalists, when the presidential election was approaching in 2018. And I'm talking about Cameroon here, yes?
And they tried to figure out how to tackle disinformation, how to fight against disinformation during the election period. The main activity that they carried out was raising awareness. When they raise awareness, they send messages to people informing them of the risks they face from disinformation, of what happens if they share false news, and of the attitude they can adopt if they receive an abusive message.
But when you go through this awareness raising, you see that the law is not very clear about what disinformation is. And when you see the sanctions, which are sometimes very, very high, it can lead people to self‑censorship. I mean, maybe it can tackle disinformation, but it will suppress free expression.
So when you look at these two measures, fact checking and awareness raising, they are ways that we can tackle disinformation without affecting human rights, but they still have many challenges. And the main way to use these means without affecting human rights, and to face these challenges, is to come together. I will come back again and again and again to coming together as multiple actors, as multistakeholders, to do it better. To take one example this time: if you are involving civil society or human rights lawyers in the regulatory processes, it will be easier to have provisions against disinformation, or even content regulations, that respect human rights and draw from international principles of human rights. So if you bring on board civil society organizations, human rights defenders, lawyers, and intergovernmental organizations, you may have the expertise needed as a government to carry out a fair regulatory process to tackle disinformation. Yeah.
>> EDEN TADESSE: Great. Thank you so much, Simone for that insightful response. So my next question is for Charlotte. Imagine Lie Detectors would receive a mandate to advise European and African leaders on how to jointly tackle disinformation. What would be your main recommendations on that front? And what do you think should be prioritized?
>> CHARLOTTE CARNEHL: Thanks so much for this question. And Simone, it was really interesting to hear your thoughts about the challenges of fact checking. Building on that, and repeating what I said earlier: as important as fact checking is, we really think that fact checking alone won't solve the problem of disinformation. We advocate for media literacy to be an integral part of school and teacher training, and an important step in the right direction, which we strongly support, is the OECD's move to include critical digital literacy. We think that media literacy should be taught across all subject areas and not be confined to one subject only, and we really need investment in teacher training to achieve this.
It's necessary, building on what I said earlier, to aim for impact among young age groups. By the time these students go to university, it can already be too late. I think it's important that we don't tell kids what to think, but that we tell them that they should think about, assess, and question information independently.
It's really like training a muscle reflex, without taking sides. It's really important to address the topic in a way that's inclusive, to keep it light, fun, and nonpolitical. What we really advocate for is that students feel empowered and not frightened. So it's important to give them very practical tools that they can easily use themselves. For example, when our journalists go into a classroom, they show kids how to use a reverse image search, how they can look up images online.
We think that until that goal is reached, until media literacy is embedded in all curricula, our model, which facilitates the secondment of journalists into educational settings, works really well as long as training is in place, because trustworthy external actors can make a big difference in a classroom, as they leave a very long‑lasting impression on the students. One should not underestimate what it takes to make more than 1,000 classroom visits per year possible. It's a significant logistical undertaking, because there are so many different stakeholders involved and we are dealing with different countries and educational systems, which come with their unique opportunities and challenges.
Additionally, the topic itself is rapidly evolving, so it's necessary to have the right systems in place to continuously monitor and control quality, and to really be on top of what is happening in different contexts.
I think this ties in nicely with what we heard earlier: it's very important to safeguard the independence of media literacy education. Funding mechanisms should not hinder this independence, both of education and media literacy work, and of journalism.
Most of our work currently happens in Europe, but if anyone in the audience, in Addis or online, would like our help in getting this started in your context and region, please reach out. My colleague will put the contact information in the chat. I really look forward to your questions and thoughts during the discussion, and thank you very much for your attention.
>> EDEN TADESSE: Thank you so much, Charlotte. Odanga, coming back to the role of social media platforms: you have investigated different attempts at regulating social media platforms worldwide. What would be your dream scenario for the future?
>> ODANGA MADUNG: Thank you.
Quite frankly, I'm not sure. I think as a journalist, I like to burn things down. When people ask me what the solution is, I'm like, my job is to find problems. Anyway, I think my dream solution in this specific case would be, number one, nuanced regulation that seeks to understand the social media and Internet industry, not regulation focused on information control. Because I feel like a lot of the time, especially when African governments go into the business of trying to regulate what happens on the Internet, the attitude is: we don't like what people are saying on the Internet, so let's go and try to regulate the Internet. No. Where we need to start from is trying to understand what harms are being caused by the Internet, and how to protect the consumers of the Internet from those kinds of harms. That's a conversation that is nuanced, that is able to protect human rights, but that is also able to bring regulation to an institution that has been allowed to run amok.
Secondly, my second dream scenario would be basically breaking up the Big Tech companies. They have stifled competition and have held captive the ideas of an entire industry. We need to imagine a world where Facebook is not the Internet, a world where Twitter is not the Internet. There is a big movement to try to decentralize the Internet away from many of these big companies, so that there's a diversity of experience on the Internet, as there was before, you know, the advent of what we have now come to call Big Tech, right? And then we can have a diversity of ideas that is not necessarily dominated by a bunch of people in Silicon Valley.
And, you know, we will be able to have fresh thoughts and fresh attempts at designing what social media landscapes could look like, taking into consideration the context of people like us, black and brown people, who now very clearly inhabit these kinds of spaces.
Those are my two dream scenarios.
>> EDEN TADESSE: Great. Thank you so much for sharing that, Odanga. It was insightful. Just a quick reminder before we move into our debate: we invite you to give your recommendations on tackling disinformation while promoting human rights. You can include your input via Slido; you will be prompted on the app with a question. And for the online participants on Zoom, you can also do that through the chat box. That's also an option for you.
So as we do that, I would like to open the floor to the audience to share your questions, comments, and/or specific recommendations on the topic of this discussion. We want to hear from as many people as possible, so please do keep your responses to 30 seconds maximum.
May I also remind you to please state your name, where you come from, and if your question is targeted towards any speaker, please also mention that. Thank you.
>> AUDIENCE MEMBER: Yes. I am Dr. Mohamed Yasim. I came from a university in France, and I'm originally from Sudan. I'm joining from the union between the African Union and the European Union, representing 82 countries while I'm here.
I'm glad about these hubs, and I think they will have a huge impact as long as they include all the countries of Africa, and maybe create this dream future for our generation. This is crucial, and it will accelerate development, because right now the pace of development is very slow, and as long as we utilize these things, development will be accelerated. So from your experiences, what are the ideal accelerators that can replicate or increase the number of hubs to include all the African territories? And what is the best way forward to make this digital future as inclusive as possible? Thank you.
>> AUDIENCE MEMBER: My name is Makru. I come from a civil society organization in Ethiopia. My question builds on the presentation: media literacy for children is a great intervention against misinformation, but its impact is like planting a tree; it takes time and a process. Yet in our world, especially over the past few years, nationalist leaders are on the rise, in the West and elsewhere, and those nationalist leaders are weaponizing misinformation to create mistrust between different groups so that they can sustain their power and their interests. So, as an immediate solution for the coming few years, what would be the answer? Would any of the panelists like to reflect on this? Thank you very much for the opportunity.
>> AUDIENCE MEMBER: Hello, I'm Tim. I'm from Russia, and I would like to remark that we in Russia strongly support the idea of Big Tech regulation by the government or civil society. We see the same thing: that the platforms, the global IT corporations, need regulation, because they have to take responsibility for what they are doing and how they are making money.
Second, I wanted to deliver a remark about our own experience in fighting disinformation. I should say that we actually have a lot of experience, especially recently, and we have come to the answer that content restrictions and the removal of misinformation or fakes are simply not effective, because if you remove one, ten will arise.
So our best practice for fighting misinformation, disinformation, and fakes is to deliver verified information as widely as possible, maybe somewhere targeted, maybe somewhere very wide, and in a way that is easy for people to consume. That is our experience, and we are willing to share it with anybody who is interested in fighting disinformation.
Thank you so much.
>> AUDIENCE MEMBER: Thank you very much. I am from Ethiopia, from the judiciary, and thank you for your presentations and insights. On how we need to tackle misinformation, disinformation, and hate speech in line with the protection of digital rights, it's important to have a discussion on critical issues. There needs to be a national outlook by the media outlets on the very critical issues that divide societies, because the outlets are the ones who fabricate and prop up disinformation and hate speech. Therefore, there needs to be a forum in which they can discuss what the differences are.
Maybe when you go to the depth of the divisions, it's not a real division; it's for the sake of galvanizing societies, since the outlets have different negative interests. And I think having this national dialogue and having the discussions against these different elements is very important.
The second thing is that there should be a law which is in line with international human rights standards, like freedom of expression. If a law does not meet international standards, it affects freedom of expression. Here in Ethiopia, we have an issue with hate speech, and the law should be in line with the international standards. That's the first thing.
Next, the courts and the judiciary should apply human rights standards, and by doing so, the law enforcement organizations implementing that law will not violate human rights standards.
Finally, there should be responsible usage of the Internet by the general public. Therefore, there needs to be awareness raising and mechanisms for users ‑‑ for vulnerable groups, for those who don't get enough access ‑‑ on how to use the Internet.
If they are using the Internet and social media that propagate negative thoughts and disinformation, they succumb to those ideas and take them as though they were real. I think empowering people is very important. Thank you.
>> EDEN TADESSE: Thank you so much for your question. I think we received some interesting insights and remarks, but I want to give our speakers a chance to respond and then we can continue the dialogue. Okay. There's one more question from the audience.
>> AUDIENCE MEMBER: Thank you very much. So I'm a director, from Kazakhstan. We had, you know, a shutdown in January, and our Ministry of Information and Social Development blocked so many websites because of disinformation, and people couldn't communicate through social websites and messengers.
And I heard that in 2021 the Nigerian government blocked Twitter, and in January 2022 Twitter and the Nigerian government made an agreement: Twitter would open its own office, register a company, appoint a representative, pay taxes, and establish a channel in Nigeria between government officials and Twitter to manage prohibited content that violates community standards.
So I want to say that we tried to bring the companies to Kazakhstan, to establish communication between the government and the social websites. You did it, but we did not. And I would like to ask: can the Nigerian government use legal methods to control content on Twitter or other social websites?
>> EDEN TADESSE: Great. Thank you so much for your question.
Odanga, I would like to give the floor to you if you would like to respond to some of the questions that have been made.
>> ODANGA MADUNG: Yes, let me respond to some of the remarks and questions that have been made. From some of the comments I have seen, there are concerns around what happens when social media is your primary exposure to the Internet, and the kinds of effects that can have.
I think someone asked how to replicate this to make it more inclusive, and we also talked about media literacy and its limits. All of these tie back to one of the big problems we are currently seeing on the Internet: the Internet right now is incredibly centralized among a few very big companies that dominate what your experience of the Internet is likely to be. That has a lot of effects on how people use the Internet, right? And on how we even decide to tackle misinformation and disinformation. I think the saddest part about how we talk about this issue is that these topics are largely framed by the very companies that end up propagating this misinformation. There are very many different types of information disorder out there, but because of the sheer dominance of these companies, we end up with a problem.
We end up with some of the problems that all of you have highlighted. So thinking about what decentralization of the Internet looks like has to be a big part of the advocacy that any of you do. We want to think about what the Internet looks like away from some of these platforms ‑‑ not in a forceful way, but in a way that is able to provide a nutritious experience of using the Internet.
You know, information diets are just the same as nutritional diets. If you eat only one thing, you will end up malnourished in a certain way. The same is true of a lot of the platforms we are using, and that is why they have become incredibly problematic.
>> EDEN TADESSE: Thank you so much. I would like to give the chance for Charlotte, if you would like to address some of the remarks and questions that have been made.
>> CHARLOTTE CARNEHL: Thank you. I found it really fascinating to hear all of these various perspectives, because to me it really shows that the challenges are so context dependent, and it is really hard to find a one size fits all solution.
But I want to build on the comment and question that our colleague from the civil society organization in Ethiopia made about the limits of media literacy training. I really like this picture that you painted about planting a seed, and I very much agree with it. I think this panel really illustrates that we need several approaches to solving this problem, and they need to go hand in hand.
I do think that planting the seed can also have an immediate effect. For example, very practically, when our journalists speak to kids in the classroom, we realize from the feedback we get that the kids say: okay, going forward I want to be a little bit more cautious about what information I'm passing on.
I want to be a bit of a detective myself. I want to check information a little bit more. So while I agree that we are planting a seed, I think it needs time to grow. I mean, I think we agree that disinformation is not a new phenomenon. It has existed for a very, very long time, but of course now the means by which it spreads are very different.
And of course, we talk about school kids, but they are not the only age group that needs media literacy education. So I would welcome many more initiatives taking this on, also for the age groups that we are not covering. But in the meantime, while the tree grows, there are things like supporting quality journalism that are really indispensable to help us bridge this gap until the tree is fully up and standing.
>> EDEN TADESSE: Thank you so much, Charlotte. And now I would also like to ask Simone to respond to some of the questions and remarks.
>> SIMONE TOUSSI: Okay. Thank you, Eden.
I will go to the first question, which Odanga has already said a lot about. I don't think I have much to add, but I wanted to bring up access to media infrastructure at all levels as one of the ways to make this as inclusive as possible.
Most of the time, when we talk about measures to tackle disinformation, the populations of rural areas ‑‑ I will talk about that ‑‑ are almost always left behind. We consider people who have access to mainstream media and maybe social media, but we generally don't consider those who don't have access to those media and means of information, yet they are still exposed to disinformation. They are simply victims of disinformation.
So making it as inclusive as possible, considering the rural population, means taking into account the way they too take in information. When we are doing sensitization or raising awareness of disinformation, we can at least make sure that the information reaches them through community radios and other community means of access to information, even by word of mouth, as sometimes happens, because that is the main way they are exposed to information. People just hear about something and spread it by word of mouth, and it reaches a population that has no way to verify the information. Or the information may already have been flagged as fake, and they won't know that what they believe has been flagged as fake. So that is the part on inclusivity that I can bring in.
The other question was about weaponizing disinformation. I'm not sure I got it very well, but what I understand is: what do we do about governments weaponizing disinformation? What we found is that governments are able to do that because they have already set up, you know, the bullets. Disinformation is like the weapon, and the bullets are right behind it: the bullets are flawed regulations, the gaps that exist either in the regulations or ‑‑ as I was saying ‑‑ in the mechanisms of access to information, when the government has not set up those mechanisms. When a government has enacted a law that has many gaps in its definition of disinformation, it is very easy for that government to use the law against the population, against freedom of expression, against dissenting and criticizing voices.
I don't know if I have addressed your concern, but that's how we see it from what we have done so far.
>> EDEN TADESSE: Thank you so much, Simone.
I will just turn to our online audience now, but first I will share some recommendations that have been added so far for jointly tackling disinformation while promoting human rights. The first one is to address anonymity on social media, allowing for the identification of people and organizations that spread disinformation.
Others are: empowering people to check their sources; bringing multistakeholder actors from different industries together to address the issue through dialogue; stopping the business model that benefits from disinformation, which I think has been echoed by Odanga; and stronger laws that could find common ground at an international level. So now we'll take two questions from our online audience. The first one is: What could be done if global digital platforms refuse to cooperate with the law enforcement of other countries regarding illegal and harmful content ‑‑ like incitement to violence, organized disinformation campaigns, and hate crimes on their platforms ‑‑ or refuse to establish an official representation in those countries? And in this situation, what would be the responsibility of states regarding the bad behavior and noncooperation of their respective platforms operating under their jurisdiction, with respect to content that undermines the security, stability, and public order of these other countries, or content that harms the lives, health, and rights of users? So that's quite a packed question.
But if the speakers have anything to respond to that and also closing remarks would be great. Thank you.
>> ODANGA MADUNG: Okay. I think I can respond to that. Well, it's such a loaded question. Could you read maybe the first part again? I think it needs multiple answers.
>> EDEN TADESSE: I will quickly read the first one again: what could be done if global digital platforms refuse to cooperate with law enforcement on content like incitement to violence and organized disinformation campaigns, or refuse to establish an official representation in some countries?
>> ODANGA MADUNG: Okay. So those are two separate things. In the case of harmful content on their platforms, they actually do tend to comply. I have yet to see a case where a platform has completely refused to take down a piece of content that has been identified as problematic, harmful, and against the law. Kanye West was just suspended from Twitter again today ‑‑ I don't know if anyone else is following that news.
But, you know, that's just one example. In the cases where they are requested to take down something that is clearly in violation of certain rules and laws, and there is pressure to take it down, it does go down. Okay? Now, in the case of placing representatives within certain jurisdictions, we must be able to ask why they are being asked to do so.
Within the EU, I know that under the DSA ‑‑ maybe someone can fact-check me on that ‑‑ representation will be required for platforms that surpass a certain number of users. Right? So it depends on why they are being asked to do that. Some governments ask platforms to be present in their regions for authoritarian purposes. In Brazil, we have seen employees of social media companies arrested. The same thing in India.
And that also poses a kind of threat to the individuals who actually end up taking those jobs. So it's very important to ask why. Now, in the case where they do refuse, and they deem that the reason they have been asked to put representation in those countries is not valid enough, they exit. They say: you know, we are a business after all; we won't put our people in danger, or we are not willing to comply with this. It's a free market. They just pull out. Yeah?
And let anyone else take over the mantle. And it has happened before ‑‑ it happened in Australia, when they literally said, we are not going to comply with any of your laws, and we're out. So they had to come to the negotiating table. Those are some of the nuances that we have to consider.
>> EDEN TADESSE: Due to time constraints, we will have to wrap up the discussion. But I wouldn't do that without first thanking our audience, both online and offline, for taking part in this important conversation and for sharing your individual voices and unique perspectives; and secondly, of course, our remarkable speakers, Simone, Odanga, and Charlotte, for your input and for embodying the multistakeholder dialogue that is deeply rooted in the framework of this topic.
On behalf of the African Union ‑ European Union Digital for Development (AU‑EU D4D) Hub, I thank you all for your participation today. Thank you.