Session
NRIs Collaborative Session: Technical aspects of content regulation
Theme: Trust
Policy Questions:
- What are the means of online content regulation?
- What are concrete examples of the multistakeholder response to content regulation?
- Can content regulation practices interfere with Internet identifiers? How?
- How do we set up standards for content regulation while preserving human rights and freedoms online?
Relevance to Theme and Internet Governance:
We are witnessing the rapid growth of inaccurate and hateful content online that is negatively shaping public opinion and impacting democratic processes, as well as individual people’s lives. This problem also has significant implications for various human rights, including freedom of expression, the right to privacy and the right to be informed, among others. Countries are applying different approaches to regulating online content, ranging from introducing standards to placing obligations on online platforms or interfering with Internet identifiers. Not rarely, these measures affect the human rights and freedoms of online users, as well as the overall trust between people and the Internet.
Description:
This session will focus on understanding the technical and policy aspects of local policies and standards for content restriction and moderation. It will draw on the experiences of the multistakeholder communities of the Brazil IGF, France IGF, German IGF, IGF-USA and North Macedonia IGF.
In this sense, the session will address issues related to content regulation, such as (i) policies and standards put in place by governments and local multistakeholder communities; (ii) ways in which content regulation may impact social interactions and people's rights; (iii) potential impacts content regulation may have on Internet identifiers and how to cope with these situations (a brief illustrative sketch follows this description); (iv) mechanisms used to enforce such measures, ranging from direct action by public authorities to specific actions in partnership with market players (e.g. ISPs, technical operators and social network platforms).
This session will also outline some good practices regarding the regulation of hate speech and disinformation.
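For illustration only, and not part of the session material: one common form of identifier-level interference is blocking at the DNS level, which can sometimes be observed by comparing the answers returned by the local resolver with those of an independent public resolver. The minimal Python sketch below assumes the third-party dnspython package; the domain and the reference resolver address (Google Public DNS, 8.8.8.8) are placeholders chosen purely as examples, and differing answers are only a hint of possible interference, not proof.

# Minimal sketch (assumes the third-party "dnspython" package: pip install dnspython).
# The domain and reference resolver below are illustrative placeholders.
import dns.exception
import dns.resolver

DOMAIN = "example.com"          # hypothetical domain to check
REFERENCE_RESOLVER = "8.8.8.8"  # example public resolver used for comparison

def lookup(domain, nameserver=None):
    """Return the set of A records for `domain`, optionally via a specific resolver."""
    resolver = dns.resolver.Resolver()
    if nameserver:
        resolver.nameservers = [nameserver]
    resolver.lifetime = 5.0  # overall timeout in seconds
    try:
        return {rr.to_text() for rr in resolver.resolve(domain, "A")}
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer, dns.exception.Timeout):
        return set()

local = lookup(DOMAIN)                          # system/ISP resolver
reference = lookup(DOMAIN, REFERENCE_RESOLVER)  # independent reference resolver

if local != reference:
    print(f"Answers differ for {DOMAIN}: possible DNS-level interference")
    print(f"  local resolver:     {sorted(local) or 'no answer'}")
    print(f"  reference resolver: {sorted(reference) or 'no answer'}")
else:
    print(f"Both resolvers agree on {DOMAIN}: {sorted(local)}")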
Format of the Session:
60-minute interactive roundtable discussion with introductory remarks and an open floor for questions and answers.
18:10-18:15 UTC | Moderators introduce the topic, organizers and speakers
18:15-18:40 UTC | Introductory remarks:
- Online content regulation for safeguarding democracy and fundamental rights: practices from the France IGF and Brazil IGF
- How do we set up standards for content regulation while preserving human rights and freedoms online? Can content regulation practices interfere with Internet identifiers? Examples from IGF-USA
- What are concrete examples of the multistakeholder response to content regulation? Learning from the North Macedonia IGF
18:40-19:00 UTC | Open discussion with participants
19:00-19:05 UTC | Technical content regulation and ways forward: action-oriented concluding commitments from the involved NRIs
19:05-19:10 UTC | Conclusion by the moderator and a summary of the key concepts discussed, presented by a rapporteur
Expected Outcomes:
Understanding specific challenges and examples of good practices at the local level, as well as strengthening networks of collaboration on the topic among the NRIs.
Discussion Facilitation:
The moderator will follow the agreed set of policy questions and allow for introductory, case-study remarks by the NRI speakers. This will be followed by engaging the other participants in an interactive discussion.
Online participation:
A dedicated online moderator will work alongside the onsite moderator. All participants will use the online speaking queue so that their requests for interventions are treated equally. All input presentations will be made available on the IGF website, and links will be shared via the online tool.
Co-Organizers:
- Brazil IGF
- France IGF
- German IGF
- IGF-USA
- North Macedonia IGF
Moderator: Mr. Peter Koch, German IGF
Rapporteur: Luiza Mesquita, Brazil IGF
Connection to SDGs:
ANNEX: Substantive inputs from the co-organizers:
Brazil IGF
Technical aspects of content regulation
The phenomenon of disinformation currently draws attention due to the massive dissemination of false news and its effects on society, especially during electoral periods. However, there are great challenges in developing actions that, on the one hand, safeguard fundamental rights such as freedom of expression, privacy and access to information, and, on the other hand, restore respect for cultural and intellectual diversity, both of which are relevant to democratic processes.
Considering those challenges, a recent publication <https://cgi.br/publicacao/relatorio-internet-desinformacao-e-democracia/> was launched as part of the Brazilian Internet Steering Committee's (CGI.br) work on Internet and Democracy, with a major focus on disinformation and its effects on democratic processes. The publication was produced by a Working Group formed by CGI.br Board Members, which also included several other Brazilian experts in the field from different national institutions. The full report was released in March 2020 and serves as a concrete best practice to be shared with other NRIs in the session. A brief overview of the document's contents will be provided, followed by a presentation of the multistakeholder process that led to it, as well as the lessons learned from the two-year process, which is still ongoing.
France IGF
Content regulation and tackling the online spread of dangerous content are drawing the attention of French policymakers. Online platforms are subject to an obligation of transparency (cf. article L. 111-7 of the French Consumer Code), and a number of legislative initiatives are being discussed. In 2018, France's parliament passed a law to tackle disinformation, which aims to better protect democracy against deliberately spread fake news and to empower judges to order the immediate removal (by legal injunction issued by an interim judge) of such content during election campaigns.
A bill on hate speech is also being discussed (pending resolution of the current crisis) and has received a lot of criticism. The draft legislation would require platforms to remove hateful content within 24 hours or face a fine. A number of technical measures are also being considered to block content. Such a short 24-hour deadline may lead to the use of automated content removal and to over-removal. Such legislation may also overlap with the future Digital Services Act negotiated at EU level.
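As a purely editorial illustration of the over-removal concern mentioned above (not drawn from the French bill or any platform's actual rules), the short Python sketch below shows how a deliberately naive keyword filter, of the kind that tight removal deadlines may encourage, flags lawful discussion alongside genuinely harmful posts. The blocklist and example posts are hypothetical.

# Minimal sketch: a deliberately naive keyword filter illustrating over-removal.
# The blocked terms and example posts are hypothetical.
BLOCKLIST = {"hate", "attack"}

def naive_filter(post):
    """Return True if keyword matching would auto-remove the post."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return bool(words & BLOCKLIST)

posts = [
    "We must attack this community",               # intended target of the rule
    "Researchers study hate speech online",        # lawful discussion, still flagged
    "The heart attack survivor shared her story",  # unrelated content, still flagged
]

for post in posts:
    action = "REMOVED" if naive_filter(post) else "kept"
    print(f"{action:7} | {post}")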
Report
Policy Questions:
- TBC
TBC
The panelists share a common view on content moderation: the big issues are disinformation and fake news online and the difficulty of dealing with them. There are some initiatives taking place in different spheres, but also many challenges, mainly technical and ethical issues regarding the use of technological and legal tools and the boundaries between the private and public sectors. Nevertheless, all the countries agree that the debate and good initiatives to tackle this problem cannot be postponed. The experiences that rely on multistakeholder solutions are worth highlighting.
TBC
TBC
TBC
Mr. Lucien Castex, Mr. Diogo Cortiz (Ceweb.br/PUC-SP), Ms. Melinda Clem, Mr. Anastas Mishev and Mr. Boro Jakimovski
TBC
TBC
TBC