IGF 2017 WS #68
Fake News, AI Trolls & Disinformation: How Can the Internet Community Deal with Poison in the System

Short Title
Poison in the System: Fighting Propaganda Bots & Hate Online
Proposer's Name: Ms. Sarah Moulton

Proposer's Organization: National Democratic Institute

Co-Proposer's Name: Mr. Chris Doten

Co-Proposer's Organization: National Democratic Institute

Co-Organizers:

Mr. Chris Doten, Civil Society, National Democratic Institute
Ms. Sarah Moulton, Civil Society, National Democratic Institute

Additional Speakers

Nighat Dad

Nighat Dad is the Executive Director of the Digital Rights Foundation, Pakistan, and an accomplished lawyer and human rights activist. She is a pioneering campaigner for open internet access in Pakistan and around the globe, and has been actively engaged at the policy level on issues of internet freedom, women and technology, digital security, and women’s empowerment. Ms. Dad was recently included in Time Magazine’s Next Generation Leaders list for her work helping women fight online harassment.


Agenda

Introduction to the topic by Moderator (5 mins)

Introduction of Topic Leaders (10 mins)

Open Discussion with Session Participants (40 mins)

The Birds of a Feather session will be driven primarily by the interests of in-person and online participants, but guiding questions would include:

  • What is the role of internet policy makers in responding to nation-state engagement in online conversations?

  • What are the economic incentives for creating disinformation flows? Should one remove troll networks?

  • Does fact checking work? Should communications platforms adopt it? Who checks the fact checkers?

  • How can we address the disproportionate impact of trolls on women’s online participation?

  • With accelerating AI behind sock-puppet accounts, what is the future of civic dialogue online?

Concluding remarks (5 mins)


Session Report

- Session Title: Fake News, AI Trolls & Disinformation: How Can the Internet Community Deal with Poison in the System

- Date: December 19, 2017                 

- Time: 15:00 - 16:00    

- Session Organizer: National Democratic Institute            

- Chair/Moderator:  Chris Doten, National Democratic Institute (onsite), Sarah Moulton, National Democratic Institute (online)            

- Rapporteur/Notetaker: Priyal Bhatt, National Democratic Institute    

- List of Speakers and their institutional affiliations:                

Matt Chessen, U.S. Department of State
Donatien Niyongendako, DefendDefenders
Alina Polyakova, The Brookings Institution
Samuel Woolley, Oxford Internet Institute, University of Oxford

- Key Issues raised (1 sentence per issue):                 

  • People are looking for infotainment and consume fake news because it is emotionally pleasing.
  • Governments need to balance the panic surrounding disinformation during elections against short-term responses that do not amount to censorship or the criminalization of online discourse.
  • Understanding the economic incentives that support dissemination of disinformation can illustrate the motivations and roles of various platforms and explain why concerted action against disinformation is difficult.
  • With respect to emerging technologies, it is important to consider not just disinformation in the present, but possible disinformation practices of the future. For example, it will soon be possible to alter audio and video just as one can fake text.
  • While it is necessary to consider separately the responsibility of digital platforms for their content and the roles of government and civil society, the approach to combating disinformation must be a multi-stakeholder, democratic process.

- If there were presentations during the session, please provide a 1-paragraph summary for each presentation: N/A

- Please describe the Discussions that took place during the workshop session (3 paragraphs):                

The discussions during this workshop session focused primarily on understanding the factors that support and perpetuate disinformation practices, as well as on sharing possible solutions to combat misinformation. From the outset of the conversation, both participants and panelists noted the often disregarded emotional aspect of disinformation: people consume disinformation, irrespective of whether they believe it, as infotainment and because it is emotionally pleasing.

The economic incentives that support disinformation were also considered. There is a market for disinformation: consumers want infotainment, and producers profit from disseminating it. A participant shared the example of Macedonia, where young people spread disinformation primarily for the opportunity to make money. Given these monetary benefits, the implications of taxing disinformation or changing the revenue structure for media organizations were debated, and broader questions of who controls and categorizes information were raised.

Both participants and panelists also shared ideas on short-term and long-term solutions that could make disinformation less prevalent. Long-term solutions focused primarily on educating consumers of information to differentiate between legitimate and illegitimate news, as well as on civic and ethics education for those who build the platforms. Questions were also raised about reconciling these long-term solutions with governments' short-term need to curb disinformation during particularly critical periods, such as elections. In East Africa, for example, impersonation of human rights defenders' accounts is common during election periods and demonstrations. In an effort to address fake news, states look for quick solutions that may lead to censorship and the criminalization of online discourse. In light of this, it is important for civil society to work with governments and the platforms themselves to ensure that any regulation does not drown out digital activism.

- Please describe any Participant suggestions regarding the way forward/ potential next steps /key takeaways (3 paragraphs):    

Both panelists and participants shared a variety of ideas and potential solutions regarding next steps for the IGF community. From civic education programs to community-driven verification of news, a number of practical ideas were shared within the group. An overarching theme of the conversation was the need to involve multiple actors. Panelists noted the need for collaboration among government, civil society, and the owners of digital platforms to set the narrative rather than simply react to past events. Disinformation poses a challenge to democracy, but it also requires a democratic solution.

Consideration was also given to emerging technologies, particularly artificial intelligence, and how stakeholders can better prepare for the disinformation of the future. Manipulation of audio and video is expected to grow in the coming years and must be addressed, perhaps through digital signatures. However, many of these technologies are dual-use, which makes effective regulation more challenging; it may be more worthwhile to focus on malicious actors rather than on the tools themselves.

A number of participants in the conversation formed new relationships that they reported would become useful partnerships in the future.

Gender Reporting

- Estimate the overall number of the participants present at the session: 125-150 participants

- Estimate the overall number of women present at the session: ~60 women

- To what extent did the session discuss gender equality and/or women’s empowerment? The workshop session did not discuss these issues.

- If the session addressed issues related to gender equality and/or women’s empowerment, please provide a brief summary of the discussion: N/A