
Addressing Terrorist and Violent Extremist Content Online

     

    Session description / objective

    The reach of content published online is amplified through social media platforms at a speed never seen before. This has promoted democratic values by empowering individuals and giving a growing voice to those who have not been heard before. These platforms are a great example of how Internet-powered innovation has enhanced the way people participate in society from an economic, social, political, and cultural perspective.

    Among these positives, however, there are increasing risks associated with the proliferation of hate across social media platforms, including the amplification of violent extremism, which has affected how safe and secure people feel both online and offline. In light of events like the March 2019 Christchurch mosque attacks, there is a growing expectation for responses that quickly identify threats and bring effective action to prevent and minimize the damage that the viral amplification of terrorist and violent extremist content online can cause. At the same time, such responses carry risks of their own, as different rights and freedoms come into play (such as freedom of expression and freedom of thought, conscience and religion).

    Social media platforms have embarked on processes to develop their own community standards, incorporating feedback from their community of users to deliver upgrades, new services and tools, as well as to decide what is acceptable content and behavior online and what is not. Industry forums have formed to coordinate efforts and share best practices. Beyond these self-regulatory efforts, other approaches to these challenges include co-regulation, working with regulators, and more. However, it is not entirely clear how these processes are strengthening the rule of law, following due process, and being sufficiently consultative, inclusive and open.

    This main session will focus on the different responsibilities, responses, rights and risks involved in policy approaches to dealing with terrorist and violent extremist content online. It will consider regulatory and non-regulatory approaches, including how social media platforms address violent extremist content uploaded to their services by end users.

    Policy Questions

    The session will focus on four main areas of discussion: responsibilities, responses, rights and risks.

    Responsibilities: A holistic approach to addressing terrorist and violent extremist content online requires engagement and action by different types of stakeholders.  

    • What are the different responsibilities of the different stakeholders to each other in developing solutions to the spread of terrorist and violent extremist content online?

    Responses: Different governments have responded in different ways to terrorist and violent extremist content online, from creating criminal liability for executives through legislation to joining voluntary declarations, such as the Christchurch Call. Industry has developed collective responses, as well as platform-specific approaches to dealing with violent extremist content. 

    • How have governments responded to the spread of terrorist and violent extremist content online?  How has the private sector responded?

    Rights: Laws and policies that regulate or moderate Internet content raise questions about freedom of expression and other human rights.  While international human rights law is only directly binding on states, under the UN Guiding Principles on Business and Human Rights, platforms have a corporate responsibility to respect human rights, including freedom of expression.  

    • What are the different human rights that are relevant to the discussion of terrorist and violent extremist content online, and why?

    Risks: As mentioned in the Rights section, Internet content regulation raises freedom of expression and other human rights concerns. Regulation may also have technical impacts.    

    • What are the potential risks to different human rights posed by terrorist and violent extremist content regulation and how are these risks being addressed?

    Speakers

    • Gerd Billen. State Secretary. Federal Ministry of Justice and Consumer Protection, Government of the Federal Republic of Germany. Germany.
      Government. Male. WEOG.

    • Dr. Sharri Clark. Senior Advisor for Cyber and Countering Violent Extremism, U.S. Department of State. United States.
      Government. Female. WEOG.

    • Paul Ash. Acting Director, National Security Policy Directorate, Department of the Prime Minister and Cabinet. New Zealand.
      Government. Male. WEOG.

    • Courtney Gregoire. Chief Digital Safety Officer, Microsoft Corporation. United States.
      Private Sector. Female. WEOG.

    • Brian Fishman. Policy Director, Counterterrorism, Facebook. United States.
      Private Sector. Male. WEOG.

    • Eunpil Choi. Chief Research Fellow, Government Relations & Policy Affairs Team, Kakao Corp. Korea.
      Private Sector. Female. Asia Pacific.

    • Professor Kyung Sin Park. Korea University Law School. Korea.
      Technical Community (Academia). Male. Asia Pacific.

    • Yudhanjaya Wijeratne. Team Lead - Algorithms for Policy, LIRNEasia. Sri Lanka.
      Civil Society. Male. Asia Pacific.

    • Edison Lanza. Special Rapporteur on Freedom of Expression, Inter-American Commission on Human Rights. Uruguay.
      Intergovernmental Organization. Male. GRULAC.

    Moderators

    • Moderator: Jordan Carter, InternetNZ, WEOG (New Zealand). Technical Community. Male

    • Remote participation moderator: MAG member Susan Chalmers. NTIA, WEOG (United States), Government, Female.  

    Agenda

    11:15 – 11:35 Introduction and opening statements (20m)

    • The Moderator explains the policy problem, provides an overview of the session agenda, and introduces the panellists

    • New Zealand PM Jacinda Ardern video address

    11:35 – 12:15     Responsibilities & Responses (40m)

    Professor Park (Academia) will introduce this section of the agenda, focusing on responsibilities and responses when addressing Terrorist and Violent Extremist Content online. The panellists will discuss the following:

    • What are the different responsibilities of the different stakeholders to each other in developing solutions to the problem of Terrorist and Violent Extremist Content online?

    • How have governments responded to the spread of Terrorist and Violent Extremist Content online? 

    • What is the private sector doing to address Terrorist and Violent Extremist Content online? 

    • How has civil society responded?

    12:15 – 12:40 Rights & Risks (25m)

    Edison Lanza will introduce this part of the discussion, focusing on rights and risks associated with addressing Terrorist and Violent Extremist Content online. 

    12:40 – 13:00 Audience Interaction (20m) Moderator receives interventions from the Floor & panellists respond.

    13:00 – 13:15 Looking Forward (15m)

    • What are your policy recommendations going forward for addressing Terrorist and Violent Extremist Content online?

    • What role can the IGF ecosystem play in making progress on this Internet policy issue?

    • Concluding remarks by Moderator

    Plan for in-room participant engagement/interaction?

    The moderator will provide details about how the audience can use the Speaking Queue for the on-site Q&A as well as for remote participation. The moderator will ask the audience to use the #contentgov and #IGF2019 hashtags.

    The moderator will encourage discussion among the speakers and take questions from the audience in between the different sections of the agenda. Questions from the Policy Questions list could be used to support the discussion. The floor will be opened for questions from the audience as well as remote participants. The moderator will engage with the audience and encourage them to ask questions, managing the flow of the discussion. 

    Remote moderator/Plan for online interaction?

    There will be a remote moderator who will encourage remote participants to ask questions and make comments, and who will help integrate the online conversation with the discussion in the room. The plan for the session will be posted prior to the event, and questions and comments will be gathered to enrich the on-site participation. The organizers will liaise with the IGF Secretariat to engage remote hubs and gather input prior to the event, in case the real-time options prove too difficult to handle.

    Connections with other sessions?

    From the 2019 program, the following sessions will be tackling aspects of terrorist and violent extremist content online:

    The session will also be linked to the Main Session on Technical and Operational Issues organized last year, which focused on Content Blocking and Filtering. That session covered the importance of definitions, due process, and technical implications (three or four of the policy questions listed here were covered in that session to a certain extent). Proposals & Report: https://www.intgovforum.org/content/igf-2018-technical-operational-topics

    From the 2017 IGF, the session "A Net of Rights: Human Rights Impact Assessments for the Future of the Internet" also covered similar concerns.

    Desired results/output? Possible next steps for the work?

    This main session aims to approach the issue of Terrorist and Violent Extremist Content online by addressing the different responsibilities, responses, rights and risks that governments, the private sector and civil society have to consider when designing and implementing policy approaches. The session will help identify the approaches and challenges stakeholders face when devising responses that actually solve the problem while also strengthening the rule of law, following due process, and being sufficiently consultative, inclusive and open. The main session will produce a session report documenting the contributions from the panellists, as well as the input from the audience, that could serve both as inspiration for the development of more holistic and integrated policy frameworks at national and regional levels, and as a contribution to strengthening the multi-disciplinary approach within the IGF. It is anticipated that a series of blog articles will be published following the session, from a variety of perspectives, to raise awareness among stakeholders about the challenges of addressing Terrorist and Violent Extremist Content online.

    Co-organizers (MAG members)

    • Sylvia Cadena. APNIC Foundation.
      GRULAC & WEOG (Colombia, Australia), Technical Community, Female

    • Susan Chalmers. NTIA.
      WEOG (United States), Government, Female

    • Jutta Croll. Stiftung Digitale Chancen.
      WEOG (Germany), Civil Society, Female

    1. Key Policy Questions and Expectations

    This main session focused  on the different responsibilities, responses, rights, and risks involved in policy approaches to dealing with terrorist and violent extremist content (TVEC) online.  Regulatory and non-regulatory approaches were considered, including how Internet platforms deal with TVEC uploaded to their services by end users. Panelists addressed various policy questions, including:

    • What are the different responsibilities of the different stakeholders to each other and to the broader public in developing strategies to fight the spread of TVEC online?
    • How have governments responded to the spread of TVEC online?  How has the private sector responded?
    • What are the different human rights that are relevant to the discussion of TVEC online, and why?
    • What are the potential risks to different human rights posed by TVEC regulation and how are these risks being addressed?

     

    2. Summary of Issues Discussed
    1. Different government policy approaches: New Zealand, Germany, and the United States
        • New Zealand
        • Germany
        • United States
    2. How tech companies are responding to TVEC: Kakao, Facebook, Microsoft and GIFCT
    3. Policy implementation: Challenges for resource-poor languages in the Global South
    4. Rights and Risks: The tension between TVEC regulation and Freedom of Expression

    Should TVEC be treated as Hate Speech? Does TVEC equal “hate crime”?

     

     

    NZ PM Ardern addressed the audience with four main messages: 1) respect for international law (human rights and counterterrorism frameworks); 2) a free, open, interoperable, global Internet to preserve the benefits of connectivity; 3) collaboration and consultation in a multistakeholder approach is key; 4) strengthening and engaging collaborative efforts.

     

    Park introduced the responsibilities and responses section of the agenda and then all panellists presented their initial remarks.

     

    Park: Although there is a proliferation of violent content across platforms, services and applications, content cannot be banned simply because it is violent; it can only be banned on the basis of the external harms it may cause. Understanding hate speech is fundamental to addressing TVEC online, as is understanding the impact of trying to counter it. "TVEC" is a "misnomer" and the discussion should instead be couched in the well-established analytical framework for "hate speech." "What we really mean by violent and extremist content is 'hate speech.'"

     

    The statement from Park summarizes two SCOTUS decisions by saying that “Content can be banned only for its external harms.” (see https://www.supremecourt.gov/opinions/10pdf/08-1448.pdf ; https://www.supremecourt.gov/opinions/09pdf/08-769.pdf)

     

    Park's statement drew disagreement from both the US representative and GIFCT/Facebook. Gerd Billen, State Secretary at the Federal Ministry of Justice and Consumer Protection, also disagreed with Park and explained that the German Network Enforcement Act (NetzDG) does not focus on hate speech but on hate crime, i.e. content that falls under the German Penal Code, such as Nazi symbols.

     

    Ash: Identifying stakeholders and concretely defining their roles and responsibilities is far from easy, as each stakeholder has a different understanding of what responsibility entails, uses a different vocabulary, and has different resources with which to engage effectively in the response.

     

    "We're working in difficult, uncharted territory, where work like legislation can have significant unintended consequence unless it's rooted in a collaboration of parties." It is essential to work with the companies, government partners, and a broad swath of civil society actors - from those concerned primarily with freedom of expression to those who work to protect the rights of victims. This collaboration led to the Christchurch Call, at the heart of which is respect for international law, counterterrorism law, and human rights law. It is a response to "someone murdering 51 people peacefully at worship and livestreaming that across the world."

     

     

    Billen: Extremists' ultimate goal is to destroy democracy and a pluralistic approach to society. Freedom of expression is at stake, as is the freedom to exercise individual rights. He highlighted how many tech companies reduce their response to TVEC to the application of community standards. Germany therefore responded with the Network Enforcement Act to increase transparency and accountability with reference to Germany's criminal law. Only content that violates Germany's criminal law can be required to be deleted or removed, since community standards, however valuable, are not approved by parliament and amount to private rules. It is important to protect the victims of these crimes, to invest more in digital literacy programs, and to understand the criminals and their networks better.

     

    Germany is working with France to deliver ideas to the European Union on this issue in the discussions on the upcoming Digital Services Act legislation.

     

    Clark: Building long-term resilience and responses to terrorist messaging, not only short-term content removal, is at the core of the US approach to tackling TVEC. The US Constitution and a strong commitment to freedom of expression, expressed through the First Amendment, inform the US approach, as do international obligations and a commitment to human rights. Guiding principles of the policy approach: 1) US law does not compel removal of content unless it clearly violates US law; some types of TVEC, such as extremist beliefs alone, are protected under the First Amendment. 2) The US approach encourages voluntary collaboration with tech companies, strengthening and expanding terms of service and community standards rather than designing new regulations. 3) The most effective means, from this point of view, is not censorship or repression but more speech that promotes tolerance, cultivates critical thinking skills, and raises awareness through a multistakeholder approach. In the US there is a line between hate speech and violent extremist content; only speech that calls for violence is not protected under the Constitution.

     

    It is important to note that some governments have used counterterrorism as a pretext to crush political dissent. Experience from working with former members of groups focused on racially or ethnically motivated terrorism is that they have said government censorship is one of the best recruitment tools, because it reinforces the group's narrative of oppression.

     

    Choi: User protection is at the center of the design and use of Kakao's products (social media, messaging and news), as part of the company's social responsibility. This human-centric design is shared among many other companies in South Korea concerned about Korean culture and values and about online safety, with a strong emphasis on encouraging users to report issues of concern and to acquire the digital skills needed to engage effectively in conversations and dialogue online. The company continues to adjust the design of its products to counteract bad behavior and protect user rights, and it encourages users to use reporting tools to help prevent the spread of harmful content. Kakao's policies are developed in conjunction with other Korean companies through KISO - the Korean Internet Self-Governance Organization (https://www.kiso.or.kr/%EA%B8%B0%EA%B5%AC%EC%86%8C%EA%B0%9C/), established in 2009.

     

    Fishman: Facebook works to coordinate efforts across different departments of the company (e.g. legal, engineering) and brings in independent experts to help develop policy. More than 350 employees globally are focused on TVEC; 10 million pieces of content were removed for TVEC alone in a six-month period.

     

    Facebook's efforts cover five main areas around dangerous organizations (terrorist, hate and criminal groups): 1) enforcing Community Standards and Terms of Service; 2) engaging with law enforcement in response to information requests or credible threats of violence; 3) supporting counter-speech; 4) looking after staff who deal with TVEC; 5) engaging industry partners beyond competition. The biggest challenge is the scale and how to take context into account, even using AI and machine learning. This is why the policy is global: building enforcement infrastructure for national-level legal frameworks in every country would be an extraordinarily difficult challenge.

     

    The GIFCT was established by tech companies (Facebook, Microsoft, Google and Twitter) to share best - and worst - practices, and evolved into sharing hashes of known terrorist content and coordinating responses. Now that it is an independent organization, the main priorities are: 1) an industry-led effort, with strong engagement with governments through an advisory structure (limited to those that are signatories of the Freedom Online Coalition and respect human rights); 2) good, effective training for smaller companies' platforms and online services to define their own terms of service and technology development, including AI; 3) continuing the collaboration to coordinate responses; 4) sharing capacity; and 5) sponsoring research. Terrorism is a strategy of the weak to provoke responses that are not conducive to long-term interests. As we think about the long term, it is more important to find strength in what we stand for rather than in what we stand against.
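
    As a rough illustration of the hash-sharing mechanism described above, the minimal Python sketch below shows how one consortium member might contribute a digest of removed content and another might screen new uploads against the shared list. The function names and the shared list are hypothetical, and real deployments are understood to exchange perceptual hashes that tolerate re-encoding rather than the plain cryptographic hashes used here to keep the sketch self-contained.

        import hashlib

        # Hypothetical shared list of digests of already-identified content,
        # standing in for the consortium hash database mentioned above.
        SHARED_HASH_LIST = set()

        def content_hash(data: bytes) -> str:
            # Hex digest of the raw bytes of an uploaded file. SHA-256 is an
            # exact hash, so any modified copy produces a different digest.
            return hashlib.sha256(data).hexdigest()

        def register_known_content(data: bytes) -> None:
            # A member platform contributes the digest of content it removed.
            SHARED_HASH_LIST.add(content_hash(data))

        def matches_known_content(upload: bytes) -> bool:
            # Another member screens an upload against the shared list.
            return content_hash(upload) in SHARED_HASH_LIST

        register_known_content(b"bytes of a removed video")
        print(matches_known_content(b"bytes of a removed video"))   # True
        print(matches_known_content(b"slightly re-encoded copy"))   # False

    The False result for the modified copy illustrates why exact hashing alone is insufficient and why more robust matching and human review are still needed.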

     

    Gregoire: highlighted that clear definitions of the roles and responsibilities of each stakeholder are as important as the responses we come up with. Those definitions are the basis for building the effective collaboration and partnerships that lead to concrete responses. For Microsoft, responsibilities vary, not only around its social media platform but also around the productivity tools the company provides.

     

    Microsoft organizes its work around advocacy, internal policy, tech and tools for enforcement of that policy, and partnerships with the broader ecosystem - e.g. Microsoft is a strong supporter of the Christchurch Call and its multistakeholder approach. The Christchurch Call is unique in articulating the roles and responsibilities of the different stakeholders. Governments' role is to counter the conditions that breed violent extremism, including lack of economic opportunity. The private sector needs to focus on enforcement, transparency, and knowledge sharing, while upholding human rights.

     

    Wijeratne: Responses to TVEC have to deal with the fact that people are dying while the conversations about how to deal with what is happening are taking place. Watchdog was a civic-tech, first-responder, civil society response. Given the scale and speed at which hate speech and harmful content grow in Sri Lanka, combined with the scarcity of linguistic and etymological resources available to understand how hate speech manifests in local languages, the enforcement of terms of service in non-English-speaking countries becomes almost impossible. The connection between local expertise and research capacity in the global South and the places where the data sets sit (where the companies are) needs to be supported; it is impossible to build technical systems without biases, so multidisciplinary teams should be analysing the datasets. It is also necessary to consider the technical aspects of turning those policies into concrete action.

     

    He estimated that a larger share of users engage in hate speech in Sri Lanka than in Germany, yet enforcement of policies in countries like Sri Lanka and elsewhere in the global South is relatively low. This is because the technical implementation of policies is not designed for "resource-poor" languages. What an engineer can do, in terms of building enforcement tools, in a well-researched language belonging to the West Germanic language tree is ten years ahead of what can be done in Sinhala or Tamil; algorithmic design problems are rooted in language. For example, removal policies for hate speech may focus on specific threats of future acts, but Sinhala does not have a future tense. Understanding when hate speech occurs in different countries requires analysts who understand the ethnicities involved in the conversation. Collaboration is needed between the parties who hold the data sets and local academics.
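
    To make the language-dependence point concrete, the toy Python sketch below shows an English-centric rule that flags explicit future-tense threats. The pattern and word list are invented purely for illustration and are not any platform's actual moderation rule; the point is that a rule or model keyed to English-style future markers has nothing to key on in a language without a grammatical future tense, or in text written in Sinhala or Tamil script.

        import re

        # Toy English-centric rule: flag explicit future-tense threats.
        # Purely illustrative; not a real moderation rule or classifier.
        FUTURE_THREAT = re.compile(r"\b(?:will|going to)\s+(?:attack|burn|kill)\b",
                                   re.IGNORECASE)

        def flags_threat(text: str) -> bool:
            return bool(FUTURE_THREAT.search(text))

        print(flags_threat("they will attack the shops tomorrow"))    # True
        print(flags_threat("attack on the shops happening tomorrow")) # False
        # The second sentence carries the same intent but no future-tense
        # marker, so the rule never fires - the same failure mode, amplified,
        # that under-resourced languages face with English-trained tooling.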

     

    Lanza introduced the rights and risks section.

     

    Lanza: TVEC moderation online is probably one of the most difficult areas of content regulation, due to the incompatibility of current policies and technical tools with the existing human rights framework. Protection of freedom of speech is falling to the tech companies, without enough understanding of what their responsibility is or of what is at stake for society, not just for the companies. Companies are recommended to assess their terms of service and community standards against the human rights framework, especially the responses they design (remedies) and the appeal mechanisms, in strong collaboration with local experts. Censorship is not an effective response to violent extremism. Content filtering policy has to meet the requirements of legality, proportionality, and necessity.

     

    It was mentioned that in Latin America the problem is not so much terrorism as incitement to violence by gangs. It is important that states refrain from applying speech restrictions in a broad manner - including words like "glorifying, justifying, or encouraging" - which results in broader criminalization of speech than allowed under international law. Companies should conduct human rights impact assessments of their content moderation policies; these policies should be necessary and proportionate; transparent and accessible appeal mechanisms should be provided; and the country context should be taken into account. Another suggestion was to develop international standards for content moderation policies.

     

    Panelists then offered their views on rights and risks.

     

    Clark: Regulations, and enforcement of conflicting regulations, can restrict innovation and commerce. She emphasized how difficult enforcement can be when definitions are so unclear and contradictory. Focusing on technical solutions takes attention off the actual perpetrators and may even violate human rights, as is the case with upload filters developed by companies. She stressed that voluntary collaboration is a better approach. Regulation (like the German Network Enforcement Act) may serve as inspiration for less democratic regimes to apply censorship and restrict freedom of speech. A response to this threat should not put the open Internet at risk.

     

    Park: Laws that impose liability on platforms for not removing content, oblige them to engage in general monitoring, and incentivize the use of upload filters or other forms of prior censorship conflict with the EU e-Commerce Directive and ignore the intermediary liability principles that form part of freedom of expression safeguards. When faced with mandatory takedowns, operators rush to delete content, as they may not have enough time to make an evidence-based decision. This can suppress counter-speech and take down lawful content prepared as part of awareness campaigns to counteract misinformation.

     

    Billen: The experience in Germany is that companies do not delete everything related to the complaints they receive, only about 20 to 25%. That shows they are deliberating and working out what is illegal and what has to be tolerated to preserve freedom of speech. Pressure from civil society and victims in Germany has led not only to content takedowns but also to the cancellation of extremists' platform accounts. As these extremists move to other platforms, they have not succeeded in taking their followers from the big platforms with them, which has the positive effect that fewer people are exposed to this type of content.

     

    Gregoire: Freedom of expression is important, as are the right to access information and the right to privacy. There are concerning trends in notice-and-takedown regulations, particularly laws with extraterritorial implications and requirements covering mirrored content, because context matters. How we address narrowly defined harms while upholding the global framework is something that requires deep thinking.

     

    Ash: This is hard, but not acting is a major risk. We have to put the victims at the center of any response about how to deal with this. There is a big risk to the Internet as a whole: the perception of those affected drives calls for additional protections, which might lead to losing the benefits the open and free Internet can bring.

     

    Wijeratne: Bad actors are not the only ones spreading TVEC online. In the hunt for bad actors it is important to acknowledge that the virality of this content is in most cases driven by terrified people trying to warn their loved ones. It is mathematically impossible to design systems without bias, so humans have to be brought in to deal with the false positives and false negatives. The datasets and protocols used to design these systems at the tech companies should be open to civil society for multidisciplinary interrogation, so that the companies get advice on local cultures and languages, human rights frameworks and legal frameworks.

     

    The moderator opened the floor for questions and comments.

     

    1. How/what is Facebook doing in the context of the war in Afghanistan? Fishman answered that their policies are global and do apply in the Afghan context. As their policies start from what they call "dangerous organizations", one of the issues they face is how to deal with content that such organizations might produce as part of peace negotiations.
    2. Are we close to a real-time global response for takedown of harmful content? Wijeratne answered that due diligence is really important when taking down content, as acting in haste risks causing more harm.
    3. Online content is only a portion of the TVEC that exists. Some of the responses required will conflict with the business models of the tech companies, so governments have a role in making sure effective action is taken. It is important to understand how people become radicalized and what motivates them in order to deal with the effects. Platforms should take responsibility just as publishers have to.
    4. There is a bias in terrorism research, and we are failing to understand the lessons learned from the last few decades. It is important to acknowledge that terrorism may have roots in state-sponsored terror. Some organizations tagged as terrorists in the past can negotiate peace, and their evolution is part of the process of growing as a society. Historical records should be preserved so that we can understand and learn from them. Fishman agreed on the need for a multistakeholder discussion - and agreement and real collaboration - for a historical record to be kept.
    5. Do you believe nations should be held accountable for the actions of companies and organizations within their borders for producing and promoting TVEC?
    6. How do you see traditional news organizations playing a role? Clark responded that a lot of radicalization takes place through traditional media. She highlighted the role of news organizations in providing clarification, correcting misinformation, and supporting the production of counter-speech that promotes tolerance. Wijeratne noted that blocking content online is not going to stop humans hating each other, as a lot of the harm to individuals caused during war is not mediated by technology, and that not everyone is acting in everyone's best interest. Lanza highlighted the need to protect journalists, researchers and their sources. He reminded the audience that government officials should also be held accountable for the impact of their words, and that they have a duty to respect the right to protest. He said that the legal framework of one country should not be the only test for TVEC responses; responses should be reviewed under international frameworks.

     

    Closing remarks

     

    Park: Although it is welcome progress to have principles of intermediary liability incorporated into national law and international frameworks, it is important not to take them too far and use them as an excuse for companies to avoid responsibility. Takedown of content should be the last resort. Mandatory takedowns may also hinder innovation and diversity in the market as new platforms addressing TVEC in a different way may not have enough resources to compete with the current platforms that dominate the market.

     

    Billen: We should not limit the conversation to TVEC online; we should also support research into the reasons why people become extremists and how to prevent it. It should not be only about the platforms' responsibilities.

     

    Choi: The private sector should take its responsibility to solve this problem seriously, and its response should be agile. Respect for law and human rights is key to doing business. Digital and media literacy is key for a safe Internet.

     

    Gregoire: The tools of opportunity that the Internet has provided can be weaponized. The IGF is a key space to seek common ground on the responsibilities of each stakeholder, to clearly articulate what problem we are trying to solve and what values and rights we are trying to preserve.

     

    Ash: This is hard, but a holistic approach that deals with the harms and with victims' rights is the right thing, and it has to be done together.

     

    Wijeratne: More research is required to fill the gaps. Civil society and the technical community should be encouraged and supported to develop an in-depth understanding of local contexts.

     

    Fishman: Ambiguity in law may cause conservative responses from companies - for example, platforms not giving academics and civil society access to datasets because they do not fully understand what the GDPR implications might be. Clear definitions around responsibilities are needed.

     

    Clark: What does success look like? What is it that we expect to happen? A balanced response between strong security and fundamental rights is needed. The IGF's role is key to discussing these issues and deepening the understanding of other types of harmful content across many sessions.

     

    Lanza: A rise in criminalization and censorship is not the best way forward. Arriving at clearer definitions of what content incites violence, and of each stakeholder's responsibilities, requires a multistakeholder approach to come up with global solutions.

     

    In closing, the moderator called for continuing the discussion in order to deepen understanding of the issue and of the reasons why there are different approaches to solving this problem.

    3. Policy Recommendations or Suggestions for the Way Forward

    Both during the video address by PM Ardern and in all of the panelists' interventions there was strong agreement that the Internet is a powerful force for good, but that terrorist and violent extremist content online at a global scale requires a multistakeholder, inclusive and concrete response, taking into account the risks, rights and responsibilities.

    There was also strong agreement that human rights need to be upheld: no extremist should have the right to negate and destroy the human rights of any other person, as the Christchurch terrorist did to his victims through the murders and the live-streaming.

    Legal regulation was considered one option, in particular to avoid companies bearing sole responsibility for deciding what should be deleted and what should not. Nonetheless, a balanced approach was demanded, since governments might misuse regulation to suppress free speech.

    4. Other Initiatives Addressing the Session Issues

    Links to:

    Christchurch Call

    GIFCT

    German law: Act to Improve Enforcement of the Law in Social Networks (unofficial translation)

    US law

    Kakao policy

    Microsoft policy

    Facebook policy 

    Watchdog

    OAS freedom of speech

    Australian law

    Korea law

    Manila Principles

    French law

    5. Making Progress for Tackled Issues

    The IGF's role was considered key to discussing the issues of TVEC and deepening the understanding of other types of harmful content. The IGF is also seen as a key space to seek common ground on the responsibilities of each stakeholder, to clearly articulate what problem we are trying to solve and what values and rights we are trying to preserve.

    Civil society and the technical community should be encouraged and supported to do more research for an in-depth understanding of local contexts.

     

    The IGF should be the platform to develop international standards for content moderation policies.

    6. Estimated Participation

    400 people, around 100 women on-site.

    7. Reflection to Gender Issues

    The session did not address gender issues in particular.

    8. Session Outputs

    Transcripts and video of the session are available.