Session
Organizer 1: Juliana Novaes, Youth@IGF
Organizer 2: Gustavo Paiva, Grupo de Estudos de Direito da Internet - GEDI@UFRN
Organizer 3: Luísa Côrtes Grego, UFMG
Speaker 1: Dajana Mulaj, Civil Society, Eastern European Group
Speaker 2: Rebecca Ryakitimbo, Civil Society, African Group
Speaker 3: Eduardo Magrani, Civil Society, Latin American and Caribbean Group (GRULAC)
Speaker 4: Martin Fischer, Civil Society, Western European and Others Group (WEOG)
Luísa Côrtes Grego, Civil Society, Latin American and Caribbean Group (GRULAC)
Juliana Novaes, Civil Society, Latin American and Caribbean Group (GRULAC)
Gustavo Paiva, Civil Society, Latin American and Caribbean Group (GRULAC)
Round Table - U-shape - 90 Min
The Policy Questions will be addressed in two groups -- Stage 1 will set the debate over the first third of the workshop, while Stage 2 will be more open-ended, allowing for a much greater degree of audience participation. The structure is explained in detail further in the proposal.
Stage 1
In what ways can online anonymity on platforms such as discussion forums be, and is it being, used for legitimate, lawful purposes compatible with human and constitutional rights?
To what extent are anonymous forums contributing to violent extremism, radicalization and harmful speech in societies across the globe?
To what extent is the harmful speech displayed in anonymous forums illegal, as opposed to merely distasteful, when considered from an international perspective spanning multiple countries and their regulations?
In a situation of anonymous harmful speech, how do the individual's or group's intentions and age affect our judgement? Does age play a factor in liability? Do exclusively satirical intentions or political debate trump what would otherwise be harmful content?
Stage 2
Are anonymous platforms otherwise compliant with national regulations, and do they function as regular private sector businesses?
What technical measures can be adopted by the stakeholder groups to minimize the negative impact of anonymous platforms and maximize their positive impact?
Do anonymous platforms require specific regulation? If so, how should State actors balance it so as not to intrude on lawful freedom of speech and online business models? How would it affect other platforms?
What non-regulatory means are available to State actors to respond effectively and proportionally to crises relating to anonymous spaces, keeping in mind a perspective of minimal interference with the lawful enjoyment of human rights?
GOAL 5: Gender Equality
GOAL 10: Reduced Inequalities
GOAL 16: Peace, Justice and Strong Institutions
Description: The Policy Questions will stand as the main pillars of the debate. They are divided into two groups.
Stage 1 Questions are brief and direct, meant to set the stage for the debate. Each Speaker will be given one question, tailored to their expertise and experience. After a 4-minute speech from each of them, we'll have our first opportunity for interventions, when the audience will be able to complement the discussion so far, offer their own answers to the policy questions, or challenge what has been stated.
Afterwards we'll have the Stage 2 Questions, which are more complex and offer a greater space for debate. They all have a strong component of practicality and try to raise debate about what each stakeholder group can do to improve the situation. Once again each Speaker will be given a question, tailored to their expertise.
Then comes the main moment of the workshop: 20 minutes of open debate in which the audience will be able to offer their own insight into the questions. We believe it's through this open debate, built upon the Speakers' initial points, that we'll be able to achieve the outcomes described below.
We have set aside 10 minutes for the Speakers to respond to the Open Debate, and a further 5 minutes for Concluding Remarks. This structure is satisfactory but, notably, flexible; should the debate prove particularly fruitful and intense, we can cut down on the Concluding Remarks or merge the Speaker Responses into the Open Debate itself.
We leave 5 minutes aside as a safeguard against delays and, should none occur, as extra time for discussion.
Introduction (4 minutes)
Stage 1 Questions (4 minutes for each question, total of 16 minutes)
First Audience Q&A (10 minutes)
Stage 2 Questions (5 minutes each, total of 20 minutes)
Open Debate Part 1 (20 minutes)
Speakers Respond to Open Debate (10 minutes)
Concluding Remarks (5 minutes)
Time for Delays: 5 minutes
Total Time: 85 minutes + 5 for Delays.
Expected Outcomes: This workshop aims to bring together people from different regions, stakeholder groups, and organizations for a face-to-face discussion about the impact that videogame communities and anonymous forums have on the dissemination of harmful speech among youth.
The session will address three main issues that are central to the discussion: (i) the impact of anonymous communities on young people's social insertion on the Internet, (ii) the role of these communities in the dissemination of hate speech online and offline, and (iii) how regulation and Internet policies can address these issues.
While it is difficult to measure the exact impact of these communities on the socialization of young people on the Internet, the current scenario indicates that many of their participants are young and that many young people involved in hate crimes are also part of these communities. This session therefore aims to broaden the debate around the topic, pointing out some of the questions that still need to be analyzed.
Speakers and participants will address the following trigger questions:
How is the Internet changing the nature of relations between youth, and how does their socialization through videogames and anonymous communities occur? How are these communities structured? How does their structure facilitate the dissemination of hate speech? What is its actual impact on offline and online hate attacks? How should Internet policies address it, given that anonymity is an essential tool for privacy and freedom of expression?
By addressing those issues, we hold the following as our expected outcomes:
a. Assessing the pros and cons associated with anonymous discussion forums and platforms;
b. Achieving a consensus about the issues that require attention from the stakeholder groups, in particular regarding harmful speech and radicalization;
c. Assessing how youth contributes to and is affected by those discussion forums;
d. Assessing possible responses and plans of action for how stakeholders can minimize negative impacts, and
e. Establishing how regulation can play a part in this debate, and how it may relate to platform regulation in general.
As described in the Workshop Session Description field, this proposal has as a central point two periods in which the audience will have, overall, 30 minutes to actively participate so they may effectively build toward a consensus. We have outlined a two-stage structure for this, in which the first period of the workshop deals with "setting the stage" and making sure all are up to date on the current state of affairs. The participants will have an opportunity to intervene then.
In the second period, after 20 minutes of Speakers answering pre-determined central questions, the participants will have a minimum of 20 minutes to engage with the questions and answers provided. There are 15 minutes scheduled afterwards for final Speaker remarks and a Conclusion, which can be reallocated to incentivize participation should the debate be particularly fruitful.
This is why we have selected the U-Table format, as it allows interested participants to engage directly via the microphones -- as opposed to requesting an opportunity from the moderator.
The structure of this roundtable is intended to foster an inclusive conversation and promote constructive exchanges between participants and speakers.
Relevance to Theme: While Safety and Resilience often refer to technical aspects of the network, they can also be applied to the physical safety of users and their psychological resilience, as stated in the Theme's description. This workshop aims at exploring this exact point, in the specific context of online anonymous forums and their possible relation with violent radicalization, harmful speech, political extremism and echo chambers.
As an inciting incident linking those forums to unlawful acts, we can point to the recent Christchurch shooting in New Zealand, which has so far been related to a specific online anonymous image board (8chan), on which the shooter announced his manifesto and shared the link to the shooting's livestream. As of the writing of this document there are 49 dead, painting a bleak picture of how online extremism can lead to loss of life.
We don't need, however, to look exclusively at violent crimes. Those anonymous image boards can be studied for their extreme content on its own, as well as for how they are a means to fully realize freedom of expression for those who would be harmed for their opinions. In this sense, anonymity coupled with freedom of expression seems to be a double-edged sword; it can save the life of the dissident who would be persecuted for his beliefs, just as it can cultivate an environment of non-accountability for one's harmful content.
We can draw comparisons between anonymity in this area and how it plays out in the WHOIS system. While in Europe WHOIS anonymity is seen as a matter of privacy, in less developed countries an activist's identity can be revealed with a simple online search, which can lead to both social and physical harm.
The question of how those image boards lead to radicalization and violence becomes imperative in spaces like the IGF, where the matter can be examined from many perspectives. In this proposal so far we've focused on how it impacts users' security and resilience, but there are direct implications for the Web's resilience as well -- in response to the Christchurch shooting, New Zealand authorities blocked two anonymous image boards (4chan and 8chan) and a video-sharing site (LiveLeak), and went on to ban possession of the shooting's video under Possession of Objectionable Material, with a possible jail time of 10 years.
In this sense, we can state that this workshop's connection to the Theme of Security, Safety, Stability and Resilience is twofold: primarily, it relates to users' integrity; secondarily, to how countries can respond to threats relating to anonymous forums in ways that do not harm the Internet's integrity.
Relevance to Internet Governance: In the enjoyment of the right to freedom of speech, anonymity can be an important shield defending individuals who, holding unpopular opinions, would be harmed by an intolerant majority. It can help ethnic and religious minorities who would have their integrity harmed by totalitarian governments or paramilitary groups. Similarly, it can protect those who hold unpopular, but otherwise harmless, political opinions, or who identify with, for example, sexual minorities with whom association could damage one's career. In all those circumstances, anonymity comes up as a facilitator of the full realization of one's freedom of speech.
Notwithstanding, anonymous forums are, in a way, aligned with the new wave of privacy-protecting regulations worldwide, in the sense that they ultimately waive any requirement to provide personal data before granting access to the platform.
Simultaneously, anonymity can enable a wide array of harmful behaviors, affecting both individuals and the world at large. By instilling a sentiment that one's actions cannot be traced back, it can become a breeding ground for harmful speech and extremism, which can lead to anonymous communities gradually turning into echo chambers and catalysts for further radicalization.
This phenomenon has implications for all of Internet Governance's stakeholder groups. The use of anonymity as a catalyst for both freedom of speech and hate speech is a matter of pertinence to civil society, which has a stake on both sides.
Bringing the issue to the discussion of elections, the popularity Donald Trump achieved among anonymous forum users is considered an important factor in his victory, and anonymity is a convenient tool for the dissemination of fake news by both interested individuals and hired actors. Coupled with those environments' potential to radicalize, as observed in the recent Christchurch shooting in New Zealand, which has been associated with an anonymous forum, the matter becomes urgent. Austria has announced that it intends to enforce identification of Austrian internet users on large platforms in an attempt to curb hate speech. Suffice it to say that governments, too, have a stake in this debate.
The private sector also has a stake in this matter. Microsoft's Tay AI initiative was derailed by an effort originating from an anonymous forum, which consisted of feeding the AI's algorithm with hateful speech. Those forums' tendency towards disruption can be an issue. Those websites are, however, ultimately platforms like any other; would an attempt at regulating them affect other social media businesses? This question by itself warrants bringing the private sector to the table, as it can ultimately impact other platforms as well.
When it comes to the technical community, anonymous forums are an environment whose principles hark back, at times, to a cyberoptimistic perspective and to the shedding of national identities, which by itself can be attractive to technical communities that would prefer to keep their members' identities unknown.
This discussion is particularly appropriate for the IGF as it benefits greatly from a global perspective. The very definition of hate speech is country-dependent, and even an approach based around harmful speech is heavily dependent on cultural differences. Looking beyond the initial issue of freedom of speech, we can assess its implications both for democracies, in how those forums can act as breeding grounds for malicious efforts, and for the safety of people who might be victimized by radical actors emboldened by anonymous echo chambers.
In this context, anonymous forums come up as a uniquely relevant type of platform, in that their specificities bring about a mix of old and new issues, given new relevance.
In order to promote an effective discussion on the proposed topics between the onsite and online audiences and to allow interventions, online participation will be facilitated as mentioned above, as well as via the Youth Observatory online discussion forums.
The opportunity for Q&A will also extend to remote participants, who will be given the opportunity to ask questions through the dedicated online forum.
All of the session organizers have abundant experience managing remote participation in the Youth Observatory and ISOC context and will have no trouble facilitating remote participation.
Proposed Additional Tools: In addition to the aforementioned fora, we will also promote a dedicated hashtag so that the panelists, audience members, and online participants can discuss the issues raised in real time on a more widely accessible medium.
A collaborative document will gather these records of comments and questions during and after the workshop, to be later integrated into the report. A variety of media can also serve as background material for this debate, based on previous workshops. Remote participation tools will ensure an inclusive, accessible, and global audience both via the IGF online participation tools and Youth Observatory online discussion forums.