IGF 2022 WS #214 Blurred lines between fact & fiction: Disinformation online

Time
Tuesday, 29th November, 2022 (12:35 UTC) - Tuesday, 29th November, 2022 (14:05 UTC)
Room
CR4

Organizer 1: Evangelia Daskalaki, Greek Safer Internet Center (FORTH)/ Hellenic Mediterranean University
Organizer 2: Sabrina Vorbau, European Schoolnet/ Insafe
Organizer 3: Joachim Kind, Safer Internet Centre Germany / klicksafe
Organizer 4: Rodrigo Nejm, Safernet Brasil

Speaker 1: Maria Spyraki, Government, Western European and Others Group (WEOG)
Speaker 2: Ricardo Campos, Private Sector, Latin American and Caribbean Group (GRULAC)
Speaker 3: Marina Kopidaki, Intergovernmental Organization, Western European and Others Group (WEOG)
Speaker 4: Sérgio Gomes da Silva, Government, Western European and Others Group (WEOG)

Moderator

Sabrina Vorbau, Civil Society, Western European and Others Group (WEOG)

Online Moderator

Evangelia Daskalaki, Intergovernmental Organization, Western European and Others Group (WEOG)

Rapporteur

Rodrigo Nejm, Civil Society, Latin American and Caribbean Group (GRULAC)

Format

Break-out Group Discussions - Flexible Seating - 90 Min

Policy Question(s)

- Does online disinformation undermine our democratic values and principles, and how?
- How do policymakers and the industry tackle the problem?
- Is the media literacy education landscape equally well established in all countries? If not, what is the baseline of actions a country should take to create digitally literate societies?

Connection with previous Messages:
- Adequate enabling environments (e.g. policies, legislation, institutions) need to be put in place at the national, regional and global levels to foster just, safe, resilient and sustainable digital societies and economies.
- Artificial Intelligence (AI) needs to be developed and deployed in ways that allow it to be as inclusive as possible, non-discriminatory, auditable and rooted in democratic principles, the rule of law and human rights. This requires a combination of agile self-, soft and hard regulatory mechanisms, along with the tools to implement them.
- Policies implemented by Internet platforms to deal with harmful online content need to be transparent, acknowledge the limits of automated content moderation, and ensure a proper balance with the right to freedom of expression.
- Regarding online education and learning, many countries face a lack of devices, weak infrastructure and low levels of digital literacy and digital skills. Increased support and international collaboration and partnerships to tackle these issues are key. Individual actors at local and regional levels should also take responsibility for finding solutions together.
- Multiple different actions are needed to fight illiteracy, particularly in the Global South. There is insufficient common language between stakeholders, inadequate participation and a lack of critical assessment of whether engagement is meaningful. More coherent use of terminology would improve the effectiveness of Internet policy debates — for example, through better translation between languages as well as exchange within and between regions.

SDGs

3.d
4. Quality Education
4.7
13.3
16. Peace, Justice and Strong Institutions
16.10

Targets: Sustainable Development Goals 16.10 and 4.7 link directly to the proposal because democratic societies depend on willing, informed citizen participation and expression of political will. If the basis of citizens' decisions is corrupted by disinformation, that amounts to a hijacking of democratic society. Accordingly, inaccurate or misleading content has potentially damaging impacts on core human rights and the functioning of democracy: it can manipulate citizens, create distrust in democratic institutions, distort electoral processes, foster polarization online, or feed disbelief in key challenges such as pandemics and climate change. As far as 3.d and 13.3 are concerned, disinformation long predates COVID-19. Falsehoods designed to undermine the validity of science extend from the resurgence of the ‘flat earth’ movement to those that dispute the scientific consensus on climate change, usually for narrow political or economic gain. The fabrications that contaminate public health information today rely on the same dissemination tools traditionally used to distribute disinformation; what is novel are the themes and the very direct impacts. Last but not least, Sustainable Development Goal 4, which includes Media Literacy among its targets, applies directly to our proposal, as the deficit in Digital Media Literacy across the world has been identified as a critical factor explaining widespread belief in online false information. Media Literacy is indeed a key answer to disinformation and its roots.

Description:

Blurred lines between fact & fiction: Disinformation undermining democratic values and principles. The concept of disinformation refers to false, inaccurate, or misleading information designed, presented and promoted intentionally to cause public harm or make a profit. The internet is the source people turn to first when they need information on a specific topic, and it has provided unprecedented amounts of information to huge numbers of people worldwide. At the same time, however, false and decontextualized information has also been disseminated. The rise of digital platforms has given people more direct access to content, and has thus in a way replaced mediated professional journalism and editorial decisions with algorithms that prioritize clickbait content in order to maximize engagement. Anyone with a social media account can create and spread disinformation: governments, companies, other interest groups, or individuals. Research (Reference: https://www.science.org/doi/abs/10.1126/science.aap9559) has suggested that human users, not bots, are the main amplifiers of online propaganda. Consequently, online influence operations are extremely fuzzy, as they largely depend on the broadcast of data by many private actors to reach their target audience. On top of that, in accelerating manipulation, deep-fake technology can be extremely harmful because people cannot tell whether content is genuine or false. An example is a manipulated video of Ukrainian President Volodymyr Zelenskyy that circulated in March 2022: in it, a digitally generated Zelenskyy told the Ukrainian national army to surrender. The video spread online but was quickly debunked as a deep fake. Democratic societies depend on willing, informed citizen participation and expression of political will.
If the basis of citizens' decisions is corrupted by disinformation, that amounts to a hijacking of democratic society. Accordingly, inaccurate or misleading content has potentially damaging impacts on core human rights and the functioning of democracy: it can manipulate citizens, create distrust in democratic institutions, distort electoral processes, foster polarization online, or feed disbelief in key challenges such as pandemics and climate change. Effective responses to disinformation are needed at multiple levels, including formal laws and regulations, corporate measures and civil society action. One of these responses is empowering people through educational programs on Media Literacy. Specifically, the deficit in Digital Media Literacy across the world has been identified as a critical factor explaining widespread belief in online false information. According to the report “Disinformation and propaganda: impact on the functioning of the rule of law and democratic processes” (Reference: https://www.europarl.europa.eu/RegData/etudes/STUD/2021/653633/EXPO_STU…), Media Literacy should be made part of the school curriculum at every level of education, and should include knowledge of the responsibility involved in sharing content and of paid trolling. Last but not least, more research should be supported to clarify what type of strategic communication can impact attitude change and behavioral intent.

Expected Outcomes

- Raise awareness about disinformation and misinformation.
- Raise awareness about the impact of disinformation on societies and democracies.
- Discuss the baseline, yet necessary, actions taken at national, regional and global levels to tackle disinformation.
- Discuss the role of social media platforms and the steps they have taken so far to tackle disinformation.
- Discuss the key role of Media Literacy and what type of strategic communication can impact attitude change and behavioral intent in regard to disinformation.
- Discuss how policies implemented by Internet platforms to deal with disinformation online can be made transparent, acknowledge the limits of automated content moderation, and ensure a proper balance with the right to freedom of expression.

Hybrid Format: Break-out Group Discussions - Flexible Seating. The session will be organized as a facilitated dialogue. Led by the moderator, it will kick off with a 30-minute high-level introductory discussion in which each speaker gives a short statement outlining their role and responsibility in tackling disinformation. Following the introductory panel, break-out group discussions will take place to pro-actively involve all participants in the debate. For 30 minutes, three different discussions (break-out rooms online as well as onsite) will be led by representatives from civil society, academia and youth, in order to fully realize the multi-stakeholder approach while respecting gender and geographical balance. Discussions will revolve around the policy questions mentioned above, and the high-level speakers will join the table discussions as well. The session will conclude with closing remarks by the high-level panel and takeaways summarized by the onsite moderator, the online moderator and the rapporteur. Regarding the hybrid format of the workshop, the online moderator will ensure that online participants are able to express their opinions, communicate questions and be equally involved throughout the whole debate. Building on our experience from our successful workshop at the hybrid IGF 2021, the online moderator will be given the same rights as the onsite moderator to give the floor to online participants. Complementary to this, a social media campaign on Twitter will give further visibility to the session, and live tweeting during the session will open the discussion to a wider online audience.

Online Participation

 

Usage of IGF Official Tool.

 

Key Takeaways (* deadline 2 hours after session)

Disinformation circulating on the internet and in private groups can interfere with democratic processes. All panelists pointed out that investment should be made in the media literacy skills of the population as a way of empowering people through critical analysis of the information they receive — that is, giving people the instruments that help them distinguish disinformation and misinformation and make informed decisions.

Apart from impacting democratic processes, disinformation also affects the mental health of activists and young people through the emotional effects it provokes. It was also mentioned that legislation should be approached very carefully, because of the threat of suppressing free speech and pluralism in the conversation.

Call to Action (* deadline 2 hours after session)

One call to action is to support media literacy education and raise awareness about online safety. For youth in particular, we have to come up with different ways of raising awareness than the usual ones, because young people tend to ignore the side effects of disinformation on democratic societies and only touch upon the surface of the problem.

Support journalistic media as a way to avoid giving ground to disinformation — that is, support good practices that follow the journalistic deontological code.