IGF 2022 Reports

IGF 2022 Open Forum #79 CGI.br’s Gender and Diversity Agenda for ICTs

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:
The CGI.br Working Group on Gender, Race and Diversity is preparing a document on Brazil's main challenges for diversity and inclusion in ICT, to be published next year. Through a series of workshops, ten challenges were collected and presented in the Open Forum so that the public could comment on them. The challenges showed that, despite problems related to themes like surveillance, violence on the Internet, and the dangers of artificial intelligence to women and people of color, Brazil's most significant challenges still relate to women's low use of, and participation on, the Internet.
Calls to Action
Build an agenda of action to prevent the deepening of the gaps between women and men; it is important to make a list of our homework and address these issues. Continue with the publication about diversity and Internet governance in Brazil, as the process revealed a great demand for published materials on these themes.
Session Report

Organized by the Brazilian Internet Steering Committee (CGI.br), the Open Forum focused on providing an overview of the activities developed by CGI.br within the scope of the Gender and Diversity Agenda, a project implemented in 2021 to investigate challenges to promoting gender and ethnic-racial diversity in the IT sector and the Internet Governance ecosystem in Brazil.

The session was presented by Beatriz Barbosa, Laura Tresca and Gabriela Nardy, and divided into two parts: 1) the motivations, objectives and methodology of the initiative; 2) the results of the national workshops.

In the first part, Beatriz presented the guidelines for developing the workshops and how they were held and conducted, ensuring the regional and ethnic diversity of the women involved and leading the group to jointly construct the main challenges to promoting diversity in ICTs in Brazil. The second part focused on presenting the ten challenges identified in the workshops, how the debate took place and how it relates to the Brazilian scenario. The ten challenges are:

1. Produce data that includes information on gender, race and ethnicity regarding presence and participation in the technology sector

2. Develop public and private policies that promote diversity and equity in Internet access, use and development

3. Address gender and racial violence and different forms of oppression on social media/Internet platforms

4. Ensure access to information and the exercise of freedom of expression for women on the Internet

5. Promote policies with gender, equity and diversity perspective

6. Build capacities for girls and women regarding ICTs considering the intersectional perspective of class, age, race, sexuality and persons with disabilities

7. Support and foster civil society initiatives that promote diversity within the Internet ecosystem

8. Create a work environment that is favorable for women in Internet and technology companies

9. Ensure gender and racial diversity in Internet governance spaces

10. Promote the economic empowerment of women in the online environment

After the presentation, some comments highlighted how this initiative could inspire actions in other countries. The comments from the audience could be divided into three areas: policies that could contribute to reducing inequalities, education, and intersectional issues.

A participant from Uganda mentioned that, in his country, the digital world is creating even more gaps between women and men. The debate revolved around policies that could tackle these challenges not only in the digital environment, but also make women benefit equally from digital technologies. Another participant raised concerns about university courses and asked how STEM could be made more appealing to women. A comment from an online participant from Brazil noted that the first step to advancing on themes like facial recognition and transgender issues is producing material on the topic. Finally, another woman from Brazil talked about the importance of making women feel like they belong in tech and of providing a warm environment where they can feel safe to learn and work.

Complete report also available here

IGF 2022 WS #401 Strengthening African voices in global digital policy

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:
Are African voices heard in global digital policy? The short answer would be: not as much as they should be. It is not that they are non-existent, but they do not reflect the size of the continent in terms of population and the future digital opportunities that are ahead of Africa.
Calls to Action
African voices need to be heard in international digital policy processes not only to pursue African interests, but also to ensure a more stable, safe, and prosperous internet, because a stable and prosperous global digital space is contingent on Africa's meaningful participation. Strengthening African voices in global digital governance requires strengthening the region's representation in three specific tracks - intergovernmental (e.g., UN), multistakeholder (e.g., IGF) and non-governmental.
Session Report

Are African voices heard in global digital policy? The short answer would be not as much as they should. It is not that they are non-existent, but they do not reflect the size of the continent in terms of the population and future digital opportunities that are ahead of Africa. However, Africa has played a unique role in developing global internet public policy, starting from 2005 and the Tunis phase of the WSIS, to this year’s IGF, which is the third IGF on African soil.

Strengthening African voices in global digital governance requires strengthening the region’s representations in three specific tracks – intergovernmental (e.g., UN), multistakeholder (e.g., IGF) and nongovernmental. Enhancing the voice in the latter is perceived as challenging as African and other developing countries do not have equal presence in large companies and international civil society organisations, which would reflect, defend, and promote Africa’s voice within that track.

A recent report published by Diplo, ‘Stronger digital voices from Africa: Building African digital foreign policy and diplomacy’, provides a snapshot of Africa's digital diplomacy, drawing on lessons learned, good practices from Africa and beyond, and some of the underlying challenges to be addressed through ‘whole of government’ and ‘whole of society’ approaches.

The study shows that African countries are not really lagging behind more developed countries in formulating digital foreign policy, as the process is still in its inception. To date, fewer than 10 countries, including Switzerland, France, Australia, and Denmark, have comprehensive digital foreign policy strategies. There are also elements of digital foreign policy and diplomacy in specific national strategies of African countries on the questions of connectivity, cybersecurity, data, and other digital issues, which can form the basis of future comprehensive digital foreign policy approaches. However, most of these strategies are yet to be implemented.

Africa finds itself amid the so-called ‘digital cold war in the making’, which is shaping the environment in which Africa contributes to global digital policy. Africa, therefore, has to position itself smartly to maximise its development potential and avoid risks. To address the challenges and maximise its potential, Africa needs a holistic approach that activates all possible resources for representation. The sheer number and variety of digital policy issues require the involvement of all actors across and beyond the national spectrum. To this end, the speakers also highlighted the important role the diaspora plays in increasing Africa's impact on digital policy.

The study also provides an in-depth assessment of how Africa positions itself on a number of digital policy topics, from telecommunications infrastructure, data, AI, and cybersecurity, to development and sociocultural issues such as multilingualism and digital identity, focusing on eight countries. For instance, it was concluded that Africa is moving well on digital infrastructure with more and more cables being deployed. It is moving rather slowly on frontier technologies and issues such as cybersecurity and cybercrime and the digital economy, which are of paramount importance to Africa, although it has made significant progress in the past years (e.g., mobile banking). That said, it is doing rather well on sociocultural issues such as digital identity.

Finally, the study addressed the cooperation of African countries with major global actors, such as the EU, China, the USA, and India, but also international organisations and digital hubs, such as Geneva. One of the important findings is the lack of African representation in standardisation processes in Geneva which is reflected in the number of chairmanship positions in committees and working groups of standardisation organisations.   

African voices need to be heard not only to pursue African interests, but also to ensure a more stable, safe, and prosperous internet. Because a stable and prosperous global digital space is contingent on Africa’s meaningful participation. To achieve this, Africa needs to learn from other countries and actors, but it cannot simply replicate the solutions developed for other regions. Africans need to ensure they are involved in the design of their infrastructure. What Africa needs is an open infrastructure, as there is no single technology that can address all problems on the African continent.  

In developing an African approach to digital, it is essential to start from a solid base which could be found in the Tunisia and Geneva outcomes and G77 negotiations, rather than from scratch. 

Increasing research capacities and academic programmes in the field of diplomacy is equally important, and Diplo has played its part by recently providing training for Namibian and Rwandan diplomats. Other speakers also highlighted the need to mainstream (digital) literacy to create a critical mass of citizens consuming services and products on the internet value chain. This is where the importance of mentorship comes into play. The speakers reflected on community networks built around mentorship and grassroots movements.

The role of communities of practice has also been noted, as they can ensure a stronger representation of African interests in global digital discussions. While training is important, sustainable impact is created through institutions, within the African Union, national government, or universities.

The speakers also tackled the lack of ‘buy-in’ from African policymakers for digital transformation and technology. That said, progress has been made in engaging African parliamentarians through the creation of the African Parliamentary Network on Internet Governance (APNIG) during the African IGF in Malawi. There is hope that, through capacity development on local and relatable digital policy issues and greater engagement, digital will become a priority for policymakers across the region.

IGF 2022 Town Hall #37 Beyond the opacity excuse: AI transparency and communities

Updated:
Addressing Advanced Technologies, including AI
Session Report

Internet Governance Forum 2022 Addis Ababa, Ethiopia

 

Panel title: “Beyond the opacity excuse: experiences of AI transparency with affected communities”

Date: Dec 2, 2022

Time: 3 pm UTC+3

The moderators

  • Mariel Sousa (Policy Advisor, iRights.Lab, Germany)
  • José Renato Laranjeira de Pereira (German Chancellor Fellow, iRights.Lab & Co-Founder, Laboratory of Public Policy and Internet/LAPIN, Brazil)

The panellists

  • Nina da Hora (Brazil, Thoughtworks): https://www.linkedin.com/in/ninadahora/
    • Nina da Hora is a 27-year-old scientist in the making - as she identifies herself - and an anti-racist hacker. Nina holds a BA in Computer Science from PUC-Rio and researches Justice and Ethics in AI. She is also a columnist for MIT Technology Review Brazil and is part of the Security Advisory Board of Tik Tok Brasil and the transparency board for the 2022 Brazilian elections created by the Superior Electoral Court. She has recently joined the Thoughtworks team as a Domain Specialist to think about responsible technologies for the Brazilian industry.
  • Yen-Chia Hsu (Taiwan, University of Amsterdam): http://yenchiah.me/
    • Yen-Chia Hsu is an assistant professor in the MultiX group at the Informatics Institute, University of Amsterdam. His research is focused on studying how technology can support citizen participation, public engagement, citizen science, and community empowerment. Specifically, he co-designs, implements, deploys and evaluates interactive AI and visual analytics systems that empower communities, especially in addressing environmental and social issues. He received his PhD in Robotics in 2018 from the Robotics Institute at Carnegie Mellon University (CMU), where he conducted research on using technology to empower local communities in tackling air pollution. He received his Master's degree in tangible interaction design in 2012 from the School of Architecture at CMU, where he studied and built prototypes of interactive robots and wearable devices. Before CMU, he earned his dual Bachelor's degree in both architecture and computer science in 2010 at National Cheng Kung University, Taiwan. More information about him can be found on his website (http://yenchiah.me/).
  • Abeba Birhane (Ethiopia, University of Dublin): http://www.abebabirhane.com/
    • Abeba Birhane is a cognitive scientist researching human behaviour, social systems, and responsible and ethical Artificial Intelligence (AI). Her interdisciplinary research explores various broad themes in embodied cognitive science, machine learning, complexity science, and theories of decoloniality. Her work includes audits of computational models and large scale datasets. Birhane is a Senior Fellow in Trustworthy AI at Mozilla Foundation and an Adjunct Assistant professor at the school of computer science, University College Dublin.

The Panel

Introduction (Mariel & José / iRights.Lab & LAPIN):

Many have been highlighting the importance of transparency in AI systems for empowering affected communities and enhancing accountability, but we still have a long way to go before this principle becomes practical and effective. The idea of this panel is thus to look at how transparency can be put into practice with regard to communities affected by these systems. How should they be informed? What information is necessary and sufficient? Does this depend on the technology?

To advance on this debate, we have invited one cognitive scientist and two computer scientists to share their perspectives on the extent to which transparency in AI systems is meaningful in order to tackle and avoid negative outcomes towards society and the environment.

In her interventions, Abeba Birhane drew on her experience auditing AI systems, especially those with large datasets. Although she stressed the importance of improving transparency mechanisms as a means to help people better understand the impacts of these technologies, transparency should not be seen as the sole solution for addressing their negative outcomes. It is fundamental, however, to address the power asymmetries that they perpetuate. She also dove into the example of face recognition and, agreeing with Nina da Hora, said that she could not perceive any positive outcome of its deployment, as it has, in numerous cases, helped intensify racism. In this sense, transparency, even though feasible, is meaningless if there are no means for individuals to act and exert control over these systems; it must entail justice.

However, one important step in thinking about AI transparency is to go beyond the notion that we need to focus on understanding strictly the inner working of systems, their technicalities.

Instead, it is crucial that we also have sufficient information about who is developing, implementing and deciding in a broader manner about AI systems, in order to comprehend what the political, ethical and financial interests are that are driving the creation and adoption of these technologies. She also highlighted that an opportunity to better inform about key stakeholders and data used to create AI systems is to make this information accessible as open data/ open source. This is especially important for public data.

Yen-Chia Hsu, in his turn, mentioned his experience at the intersection of design and computer science. His work focuses on understanding how technology interacts with people, and his lab has a close connection with local communities. He builds systems in close contact with communities in order to understand their needs and contexts.

When asked what transparency looked like in his projects, he mentioned that, in a recent research project to develop a computer vision system for assessing air pollution, he held monthly meetings with communities from the affected region. There they discussed the system and also created videos to denounce the issue to the authorities. In this case, transparency was mainly about keeping people in the loop so they could understand the system, and about gathering their views on how to increase its effectiveness.

Nina da Hora shared her interest in the ethics of AI and how algorithms perpetuate colonialistic practices. She mentioned how, in her first project in a startup in Rio de Janeiro, she had problems with systems as they presented many flaws while conducting voice recognition and facial analysis of black people. This sparked her interest in AI ethics. In her most recent project, she assessed how face recognition systems have been impacting black communities in Brazil by being subject to a much higher rate of false positives, leading to unjust detainments.

With regard to transparency, she mentioned that there are many different ways, depending on the system and its application, to provide information about its functioning and deployment.

With that in mind, she mentioned that transparency needs to be provided in a way that the general public, but especially the groups most affected by these systems, including Black and Indigenous communities, understand their impact and are provided with tools to interact with them, sharing experiences and giving feedback.

Key takeaways:

Based on the above considerations of the panelists, we conclude that transparency in AI systems can be part of the solution, but it is not a silver bullet. More important than making models themselves transparent is to operationalise the provision of information about these technologies through (i) accessible information to affected communities on how AI impacts them; (ii) inclusion of underrepresented groups, such as Black and Indigenous people, in the debates; (iii) building trust between citizens and those who build AI models; and (iv) ensuring public data is open.

Next steps:

Following the panel, we see the need to further deepen the discussion about the practical implications of AI transparency for specific communities. An opportunity to do so could be through a follow-up panel bringing together representatives of affected communities, scholars and technicians involved in AI systems design. Valuable outcomes of such an event could include an assessment of the similarities and differences in affected communities' needs and challenges regarding information about AI systems. Recommendations to policy advisors and technicians on communities' requirements for transparent information and on opportunities to engage with AI systems would also be positive results.

IGF 2022 WS #352 Youth lenses on Meaningful Access and Universal Connectivity

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Connectivity is still a big issue in the global south. Rural areas especially lack basic infrastructure. Private companies run into issues of profitability; hence, alternative routes, such as community networks and IXPs, are required.

Calls to Action

Funding is a big hurdle for community networks and IXPs and needs additional effort, especially in Latin America. Meaningful connectivity doesn't always translate to meaningful access.

Session Report

Historical perspective: The adoption of technologies and infrastructure work is usually slow. For example, in the 1980s and 1990s, the Internet was seen as something fancy that would fade out like a trend after some time. However, thanks to the optimism of some people, it slowly made its way through schools and village halls, and finally into everyone's homes.

Baseline perspective: Meaningful connectivity requires Internet access at 4G-like speed, an appropriate device, unlimited broadband access, and daily use. A further issue is that the current metrics are outdated: for example, the ITU currently counts as an Internet user anyone who has used the Internet at least once in the past three months, which is not reflective of the current scenario.
Although the issue of access and connectivity is less pronounced in developed countries, it remains a big issue in other parts of the world: 2.7 billion people were still unconnected in 2022. Gender disparity also exists, and roughly three-quarters of people aged over 10 own a mobile device. Meaningful connectivity doesn't always translate to meaningful access. We need an Internet where there is equity with respect to gender, race and social class. To convert connectivity into meaningful access, we also need to look into non-Roman domain names. Significant work has been undertaken on internationalised domain names; however, widespread adoption remains unachieved due to critical bottlenecks in the DNS infrastructure and backward compatibility, among other issues. The technical capability, however, exists, and we should work towards a truly multilingual Internet infrastructure.
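The point about technical capability for non-Roman domain names can be illustrated with a small sketch: internationalised domain names travel over the ASCII-only DNS via a reversible Punycode mapping (IDNA). The example below uses Python's standard-library `idna` codec (which implements the older IDNA 2003 specification) and a hypothetical domain name chosen purely for demonstration.

```python
# Illustrative sketch (not from the session): how an internationalised
# domain name maps to the ASCII form the DNS actually resolves.
# "bücher.example" is a hypothetical name used only for demonstration.

unicode_name = "bücher.example"

# ToASCII is applied label by label: non-ASCII labels are converted to
# Punycode with the "xn--" prefix; pure-ASCII labels pass through unchanged.
ascii_name = unicode_name.encode("idna")
print(ascii_name)  # b'xn--bcher-kva.example'

# The mapping is reversible, so applications can display the Unicode form.
print(ascii_name.decode("idna"))  # 'bücher.example'
```

Because the resolver-facing form is plain ASCII, existing DNS infrastructure needs no change; the bottlenecks discussed in the session are in application support and backward compatibility, not in the mapping itself.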

Rural and developing world perspective: There is a big gap in connectivity and access between rural and urban areas. Primary issues include affordability, infrastructure, digital competence, language barriers, information overload, and the unavailability of local content. There is also a cultural context in some regions; in India, for example, rural women may want to buy a smartphone, but cultural norms do not support it. Aversion to the Internet is exacerbated by previous experiences with online scams and fake news.
Due to the issue of profitability, it is very difficult to serve rural areas with broadband connectivity. However, national broadband networks typically reach village boundaries; hence, if we find a way to solve this last-mile connectivity problem, we could improve connectivity in a lot of rural areas. Community networks have great potential to solve this. These community networks can also interconnect to attract additional funding: this could be achieved by creating a small IXP, administered by the community networks in the area and available also to small and local providers. Furthermore, they could work on developing IXPs that connect to national and regional providers/networks. However, starting any such initiative requires significant funding, which is currently very limited. So, to solve rural connectivity issues, funding for local initiatives is necessary.
In most developing countries, the amount of bandwidth allocated for daily use is limited, even though the subscriptions are advertised as being unlimited. There is also the issue of poor connectivity inside homes and thus to meaningfully access the Internet, a person has to go outside their homes. 

IGF 2022 Open Forum #6 African Union Open Forum

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Collaborative activities on capacity building across the continent are happening and should be encouraged. Awareness creation on Internet governance among parliamentarians, judicial officers and law enforcement officers should be promoted. More countries and regional NRIs are encouraged to use PRIDA training modules.

IGF 2022 Open Forum #101 Open Forum on Technical Standard-Setting and Human Rights

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

Effective and inclusive multistakeholder participation in technical standard-setting processes is critical to ensure adequate human rights considerations are taken into account before, during, and after the development of technical standards, including during the implementation stage.


In order to ensure inclusive and sustainable participation of stakeholders, including civil society, the various barriers (financial, cultural, knowledge-related) to accessing and meaningfully participating in technical standard-setting processes could be considered and addressed at standard-setting organizations, while ensuring this does not slow down the process or pose additional hurdles.

Calls to Action

Standard-setting organizations should consider methods to ensure inclusive, meaningful, and sustainable participation in, and access to, technical standard-setting processes for stakeholders, particularly civil society organizations and human rights experts who can provide expertise, so that adverse human rights risks are mitigated and addressed in standard-setting processes, including through dialogue and collaboration with civil society.

IGF 2022 Open Forum #4 Digital self-determination: a pillar of digital democracy

Updated:
Governing Data and Protecting Privacy
Session Report

The Swiss OFCOM organised an "Open Forum" on the topic of digital self-determination and trustworthy data spaces.

Political authorities at national and international level as well as civil society are developing new models of data governance in this respect. In Switzerland, we published a government report earlier this year on the concept of "digital self-determination" and "trusted data spaces". Our vision is a data society based on autonomy and freedom to manage one's own data. We want to restore trust in data technology and empower the actors in the digital space. To this end, Switzerland is developing a code of conduct for trusted data space providers.

The proliferation of national and unilateral approaches to data governance has led to an increasing fragmentation of regulation at the international level. This is accentuated by the fact that there is currently no international process for a holistic and cross-cutting discussion of data governance issues. This session aimed to address difficult data governance issues such as: How can we reinvent ourselves in the digital age as our digital footprints expand? What are the key elements needed to implement digital self-determination? How can we ensure that such efforts do not contribute to further regulatory fragmentation? What governance mechanisms are possible and how could they be deployed?

Magdalena Jóźwiak argued that personal data should be protected as a core value. Yet there is a dichotomy between private and public actors: the transparency of private companies rests only on voluntary disclosure. Many studies elevate the importance of data governance to a constitutional level. The inclusion of digital self-determination is a way to balance these trends.

Roger Dubach recalled the difficulty of moving from a national to a global discussion. At the national level, Switzerland is developing a voluntary code of conduct, with the objective of building trustworthy and human-centred data spaces. The discussion at national level should help inform the international level and vice versa.

Pari Esfandiari pointed out that the issue of data governance is very controversial, with a geopolitical concern. There are various ideological perspectives on data, from the American approach of maximising use, to the European approach of protecting it and the national approach of controlling it. Pari reminded the audience that a single, global data governance regime is as essential as it is unlikely to be achieved at the moment.

Marilia Maciel noted that data regulation has moved away from treating data as intellectual property to focus more on how data can be shared and used collaboratively; the Swiss proposal is in line with this. Little attention is paid to the development aspects of data in trade-related discussions. The issue of transparency arises in this context: trade negotiations are very opaque, and we only have an idea of what is being discussed, with no clear indication. Moreover, only governments participate, which excludes the idea of multistakeholder participation.

IGF 2022 IRPC Access & participation as enablers of digital human rights

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Barriers to access and participation relate to infrastructure; policy, legislation (e.g. taxation) and their implementation; the level of digital literacy; and content, especially the lack of language diversity online. The online and offline worlds are interconnected, and marginalised communities are disproportionally affected online. The challenges can go beyond access, since access can also be limited or used as a surveillance tool.


More needs to be done to ensure that online violence is tackled effectively. Content moderation is primarily used as a remedy but platforms need a concerted strategy that promotes transparency, investment, better working conditions, awareness raising on abuse reporting, and more cultural and language diversity for long-term solutions. Effective legislation and its implementation need to be the result of a constructive multistakeholder dialogue.

Calls to Action

A real commitment from all stakeholders to promote access and participation and ensure digital inclusion: more infrastructure, concerted policies and effective implementation, digital literacy, and content that is accessible to all. More engagement and constructive multistakeholder dialogue is needed to develop regulatory frameworks and effective solutions to promote digital inclusion, and to respect and uphold human rights online.


Online violence must be tackled effectively. Internet shutdowns are an obstacle to a free and open Internet and hinder the full enjoyment of digital human rights. Content moderation can only be an effective remedy if civil society, the technical community, private sector and governments work together to ensure that any implemented solutions are transparent, sustainable, localised, human-centered and rights-respecting.

Session Report

The session reflected on the importance of meaningful access and participation in the online environment for the full enjoyment of human rights. The panel discussion focused on:

  • Access and participation: addressing major challenges to access and participation online and reflecting on ways to promote empowerment and inclusion;

  • Online content moderation and the most pressing challenges in Africa;

  • Redress for human rights violations online.

 

The panel included:

  • Catherine Muya, Civil Society (Article 19), African Group - remote
  • Hon. Neema Lugangira MP, Government (Tanzania),  African Group - on-site
  • Roselyn Odoyo, Private Sector (Mozilla) - on-site
  • Victor Ndede, Civil Society (Amnesty International Kenya) - on-site
  • Yohannes Eneyew Ayalew, Technical Community, African Group - remote

and an intervention from IRPC Steering Committee member Santosh Sigdel.

 

The session was moderated on-site by IRPC co-chair Raashi Saxena and online by IRPC co-chair Minda Moreira.

 

Access and participation

Victor Ndede identified four major barriers to access and participation:

  1. Network infrastructure and policy.
  2. Taxation. Ever-increasing governmental taxes on digital devices and services pose affordability challenges; a reduction of mobile taxes would lead to better inclusion.
  3. Digital literacy. It is important to know how to use Internet-connected technologies to harness their full potential.
  4. Content, language barriers and the importance of localised content. The fact that content is mostly in English can limit participation online.

Roselyn Odoyo added that when it comes to access and participation, marginalised communities, e.g. refugees and LGBT people, are disproportionately affected, as the legal and political environment is still hostile to them. Access is not only difficult but can even be used to surveil and curtail their rights.

Roselyn pointed out the fact that human rights violations online have repercussions offline and vice-versa and therefore human rights defenders working with marginalised communities and digital rights groups should work together rather than in silos to better address these issues. She also highlighted the importance of including civil society in discussions on accessibility and participation.

Responding to the issue of taxation, Hon. Neema Lugangira pointed out that developing countries are losing out due to international tax regulations, that it is important to ensure that African countries benefit from the taxes generated by sales within their countries, and that it is only fair that tech companies pay taxes on income generated in the country. The discussion developed further with examples from Kenya, Tanzania and Congo, and there was some support both from the panel and the floor for the view that increasing taxes for users of digital services and devices could lead to accessibility challenges and hinder inclusion. Hon. Neema Lugangira highlighted the importance of all stakeholders working together so that valid arguments can be passed on to legislators to better look at these issues.

Santosh Sigdel (Internet Rights and Principles Coalition Steering Committee) highlighted the importance of documents such as the Charter of Human Rights and Principles for the Internet in the promotion of accessibility and participation. The Nepali translation of the Charter, he added, was a collaborative process. Translating the document into the local language builds capacity and gives communities tools to work on better laws and policies to address the issues that affect them directly. Sigdel also stressed the importance of the right to access in addressing digital inclusion and pointed out that the online world replicates what is already happening offline and, therefore, escalates vulnerabilities.

Online content moderation

Reflecting on online content and issues at the intersection of freedom of expression, online abuse and online hatred or incitement to violence, the panel reiterated the position that human rights should apply online as they do offline and agreed that online violence has repercussions offline.

Hon. Neema Lugangira looked at the fine line between freedom of expression and online abuse. She explained how the latter is used as a tool to silence some groups, especially women, and highlighted the dangers of self-censoring. She called for online discussions that are “focused on the agenda, not the gender” and for wider representation in social media platform teams so that cultural and language diversity are taken into account. Hon. Lugangira stressed the importance of legislation and regulatory frameworks to remove the grey areas from which hate speech and online violence flow. She also pointed out the need for all stakeholders to come together and called for different ways of engagement and cooperation among stakeholders.

Yohannes Eneyew Ayalew reflected on the profound impact of online hate speech in the context of the war in Northern Ethiopia and on the slow response of social media platforms to prevent the escalation of violence.

As Catherine Muya pointed out, content moderation should be a remedy to address issues such as hate speech; however, there was strong support for the view that more needs to be done to address the challenges of online content moderation and ensure it is a practical and effective remedy. Catherine highlighted the lack of transparency and coordination, directly linked to the failure of tech companies to truly invest in online content moderation and to develop effective measures: from adequate training and fair remuneration of online content moderators and more awareness of the tools available for abuse reporting, to the development of long-term solutions, accountability for those responsible, and support for victims of online abuse.

Questions and comments from the floor came from different stakeholders, from civil society to government and national human rights institutions. They highlighted the different experiences of taxation of digital devices and services in African countries and its impact on accessibility and participation online. They also stressed the crucial need to address online accessibility for all by designing and developing content for people with disabilities.

Participants also highlighted the importance of striking the right balance between business and human rights and the role of governments in upholding human rights online as offline, and pointed out the barriers that need to be overcome to promote accessibility and inclusion: from the physical barriers still preventing access in many communities to the lack of coordination and effective mechanisms, or the lack of trust among stakeholders.

The lack of access to the Internet and digital services due to national and regional shutdowns, particularly the current situation in Ethiopia's Tigray region, was another issue brought up during the Q&A, with participants and speakers highlighting the importance of a free and open Internet, and members of the IRPC SC referring to the Coalition’s Charter of Human Rights and Principles and the importance of Article 1, the Right to Access the Internet, as an enabler of all digital human rights.

 

Recommendations

At the end of the session, the panel put forward several recommendations to ensure the full enjoyment of digital human rights:

  • A real commitment from all stakeholders to promote access and participation and to ensure digital inclusion. This includes more infrastructure, concerted policies and effective implementation, ensuring digital literacy and content which is accessible to all.
  • More engagement and constructive multistakeholder dialogue to develop regulatory frameworks and effective solutions to promote digital inclusion, and to respect and uphold human rights online.
  • Online violence must be tackled effectively. Internet shutdowns are an obstacle to a free and open Internet and hinder the full enjoyment of digital human rights. Content moderation can only be an effective remedy if civil society, the technical community, the private sector and governments work together to ensure that any implemented solutions are transparent, sustainable, localised, human-centered and rights-respecting.
IGF 2022 WS #242 Lessons Learned from Capacity Building in the Global South

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Training programs and capacity building for the deployment of community networks in rural and remote areas have to be aligned with the needs of the territories, based on local knowledge, language and culture, and supported by a multistakeholder approach built on collaboration between different actors to exchange good practices and make them sustainable in the long term.

Calls to Action

To achieve meaningful connectivity in remote places and equal access for women, deeper collaboration with, and understanding of, the communities they work with has to be achieved by governments, institutions and social organizations in the development of community networks, treating these community projects as lifelong, locally based learning processes.

Session Report

Key Issues Raised

1. When people from rural and remote communities are trained to develop the technical skills that allow them to generate and operate local solutions for connectivity, such as community networks, the chances of those projects being sustainable in the long term increase.

2. The project of the National Schools of Community Networks was launched in 2020 with the main objective of developing training processes to build capacity for the development of community networks in 5 countries of the Global South: Brazil, Indonesia, Kenya, South Africa and Nigeria.

3. Each of these schools has its own program, different ways of executing it, and different pedagogical methods.

4. Participatory Action Research (PAR) is a research methodology that has been successfully employed in the development of the projects of the National Schools of Community Networks. Other popular-education tools used to train people have also had good results in bringing technology closer to people with no previous knowledge.

5. To help with the implementation of every school, a guide explaining the PAR methodology was launched with the objective of creating contextualized programs: https://www.apc.org/sites/default/files/FINAL_Technological_autonomy_as…

 

6. Training programs and capacity building for the deployment of community networks and other local solutions for meaningful connectivity have to be aligned with the needs of each territory, based on local knowledge, ways of learning, language and culture.

7. Efforts and specific strategies need to be made to increase the participation of women in these training programs.

8. The development of these training programs has to be supported by a multistakeholder approach based on collaboration between different actors to exchange good practices and make them sustainable in the long term.

9. Inclusive, responsible and sustainable digital transformation needs to be driven by policy strategies and regulatory frameworks, as well as by investment in initiatives and projects that boost local capacity, such as the National Schools of Community Networks.

 

Presentation summary

1. Carlos Baca Feldman, LocNet project (Mexico): the initial setting for the development of the National Schools of Community Networks was the publication of a guide explaining the Participatory Action Research methodology. This methodology was employed in the design of Techio Comunitario, a training program developed in Mexico, which was the origin of the design and implementation of the project of the National Schools of Community Networks in 5 countries of the Global South: Brazil, Indonesia, Nigeria, South Africa and Kenya.
One of the goals of this project was the development of an online repository to strengthen and develop community networks through the exchange of materials that can help people and organizations develop skills and knowledge.
2. Alessandra Lustrati, FCDO (UK): the UK Digital Access Programme (DAP) is focused on catalyzing inclusive, safe and secure digital transformation in 5 countries: Brazil, Indonesia, Kenya, South Africa and Nigeria. Supporting digital connectivity, as well as skills development in cybersecurity, digital entrepreneurship and innovation, is key to the development of alternative models of local solutions. There is real value in following three main principles:

  • First principle - scalability vs replicability: relatively small investments that are well targeted, context-specific and fully focused on building capacity can be very significant in terms of impact. Once the model is demonstrated, both the technological and organizational knowledge, adaptable to local conditions, can be disseminated effectively through the method of the National Schools of Community Networks, considered a positive proliferation of meaningful connectivity solutions on the ground.
  • Second principle - local ownership: As the model needs to be fit for context and well embedded in the local reality, it needs to take into account local needs and preferences, and what is viable or not in a particular location.
  • Third principle - sustainability: strengthening local capacity is essential for sustainability. Communities become autonomous in terms of know-how in establishing and managing their own telecoms networks, and in understanding the interaction with the broader market, or what is considered the connectivity value chain: developing a business and organizational model that enables communities to access and appropriate technological services in an efficient and affordable way.

3. Neo Magoro, Zenzeleni (South Africa): many rural areas do not have access to the Internet, the cost of connectivity in rural areas is higher than in the cities, there are high levels of unemployment and gender-based violence, and Internet data costs in South Africa are the highest in the world. For the development of the school, the organization Zenzeleni created a curriculum made by experts who understood the needs and realities of each community, based on four pillars: personal, social, technical and business development. A Learning Management System (LMS) was created, and all participants in the school were provided with cell phones to access the contents. Because of the diversity of participants, the use of home languages was reinforced and translators were employed. Another challenge was developing female participants' interest in interacting with technological devices, because at the beginning of the training they waited for the men to start using the devices. Peer-to-peer learning was therefore encouraged, and working in smaller groups led them to relax and share their own knowledge.

4. Harira Wakili, CITAD (Nigeria): connectivity issues are a challenge in Nigeria, and lack of access to digital education is a problem in most communities. To bridge the gender gap in access to education, CITAD worked on raising awareness of the importance of being connected and of developing community networks, and on enabling women to participate. To create the curriculum of the school, they looked for experts, focusing on technology and sustainability skills development. Volunteer mentors were also invited to support the participants because of their knowledge of the relevant issues in the communities. In the first school, there was not much participation from women, so for the second school a different strategy was adopted, focusing on women and elders from the community, which was successful, with 50% women's participation this time. To improve participation, women-only groups were created, because women felt intimidated by men; bringing a feminist approach to the Internet encouraged them to speak for themselves.

5. Akinyi Arose, Tunapanda Net (Kenya): connectivity built by and for the community was emphasized, and the focal areas of the school's contents were connectivity, using a human-centered design approach, and providing meaningful access to the community. The guide that was provided, and the PAR methodology at the building stage, helped them delimit the training needs in a co-creation process with community network members that truly speaks to their needs. For the next stage of development, a series of conversations and a survey were conducted to analyze and curate the training. Mapping out experts was also important, to provide information and training services in the key areas that gave rise to Communities of Practice: designing and deploying community networks; sustainability; and local content creation. Different stakeholders were invited to participate in the process to understand how to work collaboratively. Peer exchanges, virtual mentorship and bringing the concept of community to the training allowed them to work with community networks at different levels of development, from starting projects to more consolidated ones, and with what they are doing on the ground.
Other challenges communities face after training relate to access to infrastructure and equipment to deploy the networks, so one strategy was to figure out ways of mobilizing resources, especially for emerging networks, and, regarding volunteers and the sustainability of knowledge, how to maintain the knowledge delivered by the training.

6. Adriane Gama, PSA (Brazil): a co-created curriculum was developed, based on social and digital aspects, gender and youth concerns, using methodologies focused on Paulo Freire's perspective of popular and ludic education. The challenges faced were related to the pandemic and the impossibility of traveling, and also to the lack of connectivity in the territories of the Amazon, where PSA bases its work. It was important to look for partners to strengthen the community networks and to access funding, based on a sustainable economy and local needs, with local women's participation to be strengthened.

7. Gustaff H. Iskanda, Common Room (Indonesia): the pandemic outbreak revealed a huge gap in different areas, not only the digital divide but also the development gap. In Indonesia the challenge of the digital divide comes with a number of problems, from huge geographical challenges, similar to the Amazon, to the lack of electrical supply and devices. A prototype of the National Schools of Community Networks was developed in indigenous communities, based on Common Room's own experience and following the guide developed with APC. An Advisory Committee was set up and a curriculum was deployed focusing on software and hardware, with the integration of policy and regulation, technical capacity building and meaningful connectivity. A training-of-trainers programme was also launched and a handbook was published to make the contents easier and more accessible for people. Common Room works with an approach called 5L: low tech, low energy, low maintenance, low learning and local support. The nature of community networks is context-specific, and they can differ from one another, so their implementation needs to be based on research and observation of the needs and rights of the community. Meaningful access celebrates multiplicity at the microscale. Local communities have to find their own way to deploy and learn what a community network relevant to daily life means for them. A multistakeholder approach was developed, especially on policy and advocacy, with specific needs for long-term capacity building, digital literacy and a special license for community network deployment, including tax incentives, because most community networks are non-profit. A community network's strong foundation is its network of people, open knowledge and technology.

8. Josephine, Kenya ICT Network: there are many similarities in the challenges the schools have faced, such as access to technologies, language, devices and women's participation, as well as in how the schools have been adaptable and flexible, learning together with the communities, collaborating between different territories, and developing open-source technologies.

 

Participant Questions - Additions

Said, from the Public University of Debre, Ethiopia: 1. What are the target groups of your local capacity building? Are they public schools, private companies or institutions? 2. The Internet is becoming a place of violence and abuse, especially for women and children. What are your efforts to mitigate this problem and safeguard the connected community? 3. To Indonesia: most countries of the Global South have problems ensuring accessibility, especially of telecom or Internet infrastructure, because most telecom companies are not willing to go to rural areas. What efforts are your government, or institutions like yours, making to ensure that the Internet is accessible to all communities?
 

Talant Sultanov, Internet Society Kyrgyzstan chapter: there are some community networks in Kyrgyzstan, and we learned that local communities can do capacity building. The first thing people did when they had the Internet was send messages to the central government to say “We have no roads, no electricity”. The second thing people started using the Internet for was e-commerce, to promote local products from farmers. Also, local WiFi hotspots became safe areas for girls. The Community Networks Learning Repository mentioned by Carlos would be a really useful instrument.
 

Ashapur Rahman, Bangladesh School of Internet Governance: more cooperation is needed on the continent. But for capacity building, if we cannot connect the whole globe, maybe we cannot achieve our goal.

 

Reflection on Gender Issues

a. The number of participants in your session (or an estimate)
There were approximately 45 participants in Addis Ababa; room CR3 was almost full. Even though a hybrid format was proposed, just 4 people joined via Zoom. The panel itself was gender balanced: the moderator, the rapporteur and four panelists were women, out of a total of 8 participants.
b. An estimate on the percentage of women and gender-diverse people that
attended the session, to the extent possible

More than 50% of the attendees were women.
c. Whether the session engaged with gender as a topic. If so, please explain in
a short paragraph.

Even though the main topic of the session was not gender equality, many of the contents mentioned women's participation and the challenges related to it, because the project of the National Schools of Community Networks has this issue as one of its main subjects. Also, the participants were mostly women.

IGF 2022 WS #252 Building a safe & trustworthy digital world for all children

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

While the Internet offers many opportunities for learning, communication, creativity and entertainment, two-thirds of the world's children do not have Internet access at home.

,

Going online is essential for future generations to reap the benefits of digital transformation & support a sustainable future. But bringing children online requires more than expanding connectivity: it needs to respond to specific risks. Designing digital products, services, & programs for children must consider their specific needs, context & ability.

Calls to Action

It is imperative to ensure a safe and trustworthy environment for children engaging with new technologies and spending time online.

,

Developing such an environment requires multistakeholder cooperation to foster appropriate policy, legal and regulatory environments, responsible practices by all stakeholders online as well as ensuring relevant capacity building and skills development.

Session Report

Introduction

Connectivity has rapidly become one of the most defining features of our everyday lives, the way we study, work, do business, consume content or connect with our communities. A recent report suggests that the digital economy is worth US $11.5 trillion globally, equivalent to 15.5% of global GDP, and that it has grown two and a half times faster than global GDP over the past 15 years. Information and communication technologies (ICTs) are also transforming essential social services, such as education and health care, as well as the ways in which people interact with their governments.

In this context, it is imperative to ensure that future generations are equipped to reap the benefits this transformation can bring, that all children and young people are learning and acquiring the knowledge and skills they need to support a sustainable future.  But bringing children and young people online requires a lot more than expanding connectivity. While the Internet offers many opportunities for learning, communication, creativity and entertainment, it also opens up certain risks to vulnerable users such as children. It is imperative to ensure a safe and trustworthy environment for children engaging with new technologies and spending time online.

Key takeaways

The session presented a multidisciplinary standpoint on the policy, technical and skilling elements necessary to build a safe and trustworthy online environment for children. From the business community's unique role and perspective, the session delved into an exchange of ideas on how to best interact with policymakers to promote safe business practices in safely connecting children to the Internet. In particular, the discussions showcased solutions in the development of age-appropriate products and devices to enable trusted connectivity for children, and providing meaningful connectivity for children through the necessary skills and capabilities. Speakers mentioned initiatives such as:

  • Amazon Kids provides child-appropriate content experiences through curated and filtered services. The Amazon Kids+ subscription service offers entirely hand-picked, prescreened and filtered content, which parents can further curate. Amazon Kids is developed in partnership with experts in this space, such as the Family Online Safety Institute, The Digital Wellness Lab at Boston Children’s Hospital, and other groups, to leverage existing expertise as the platform is built.
  • Chicos.net has been working for 25 years to improve the well-being of children in Latin America. Their Historias para armar digital platform invites children from 8 to 11 years old to create their own stories with digital media, with the aim of stimulating the development of 21st century socioemotional and literacy skills. The initiative responds to the plurality of socioeconomic contexts in Latin America: it offers free resources that can be used with or without connectivity, proposing both online activities and analog activities that require no digital devices. Since its launch, Historias Para Armar has reached more than 20,000 teachers and 800,000 children throughout the region.
  • The African Girls Code project, based in South Africa, is working on empowering and enabling girls to enter the ICT and STEM-related industries. The organisation aims to ensure that women and girls in Africa are included in these industries in the future, through learning interventions, whether on the ground or at the policy and intergovernmental level, collaboration with corporate and private sector partners and civil society organisations, and by running a number of skills development programmes for youth and for women and girls on their campus.
  • DotAsia is supporting a new top-level domain, DotKids, which is currently being launched. They work closely with different parties to maintain the platform as a protective environment while allowing children to explore more freely.

Throughout the discussion, the role of education was referenced as a pivotal element in building children’s comprehension of the digital environment, their digital skills, and their capacity to understand and manage risks – as well as how that level of comprehension, skills and capacities should be taken into account when designing devices, services or policies for children online. Against this background, the work of the OECD 21st Century Children project and the OECD Typology of Risks were referenced: the former looks at how education systems play a key role in empowering children to navigate the digital environment safely and effectively, while the latter provides an overview of the risk landscape, outlining risk categories and their manifestations. The session also allowed for stocktaking of current work and research on connecting vulnerable groups, especially children with disabilities and/or from underprivileged socioeconomic backgrounds.

Speakers noted the importance of including children in the design process of products, services, programmes and even policies targeted at them. Furthermore, recognising the diversity of children from different contexts, cultures, languages and ways of thinking, and promoting representation of all kinds of groups, is important so that children can see themselves represented on the Internet and thus break the cycle of exclusion.

Speakers also stressed the need to forge partnerships and work towards reducing digital inequalities in education and accessibility in technologies, agreeing that it takes multiple stakeholder groups to bring forward an effective solution towards a safe and trustworthy digital world.

Call to action

In pursuit of a people-centric, sustainable digitalization, policymakers must improve their understanding of how ICTs work in practice, including knowledge of the ICT ecosystem, the roles of the various stakeholders and relevant policy issues. 

Policy and regulatory mechanisms in particular, should consider the value of the entire communications and digital services ecosystem. They should be non-discriminatory, technology-neutral, and supportive of innovative business models and the development of a wide range of technologies, standards, and system architectures. 

In that regard, developing a safe and inclusive digital environment should account for future generations, allowing them to reap the benefits of digital transformation and support a sustainable future. It is therefore imperative to ensure a safe and trustworthy environment for children engaging with new technologies and spending time online, considering their specific needs, context and knowledge.

Developing such an environment will require multistakeholder cooperation to foster appropriate policy, legal and regulatory environments, responsible practices by all stakeholders online as well as ensuring relevant capacity building and skills development.


IGF 2022 WS #517 New data on fairer access to health using the Internet

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

The intersection between health and Internet Governance is underappreciated, and more work needs to be done on bringing these areas closer together.

Calls to Action

Health is the most fundamental aspect of human existence, and Internet Governance actors need to be mindful of their role in enabling people to have access to it.

Session Report

The session focused on the draft report entitled "Health Online Indicators LAC ‘22-‘23: Data on access to health solutions using the Internet", produced by Governance Primer with funding from LACNIC, presenting its findings to a global audience in order to seek further input and evaluate what next steps should be pursued prior to publication of the document.

Mark W. Datysgeld (Governance Primer) and Ron Andruff (ONR Consulting) had an open debate with the audience in relation to the contents of the report, in a session that focused less on exposition and more on exchange of ideas.

The first point that was brought up is how, in spite of the global COVID-19 pandemic, some countries in LAC still lacked proper regulations around telemedicine, a concerning issue that reduces the safety both of practitioners and patients.

It was next noted that the ability to purchase medicines using the Internet is also limited and sometimes outright banned in some LAC countries. This is at times circumvented with the use of delivery apps, which leaves open questions such as liability and medicine safety.

The subject of medicine importation was debated, with an understanding that this practice is not even seen as a matter of discussion in many jurisdictions, when in fact the Internet could be leveraged to increase access and reduce inequalities in relation to medicines.

Finally, a dependency map was presented showing the potential routes of action to generate impact on processes where health and Internet Governance intersect, pointing out that there could be much more action than there currently is.

In discussion with the audience, it was brought up that such a study would be useful to the African region, seeing as, much like LAC, it is also made up of a patchwork of legislation that does not harmonize across jurisdictions.

It was then discussed that the IGF itself is a good avenue for such debates to take place, considering its open-ended and multistakeholder nature, but different fora also need to be sought so that the theme can be socialized and awareness of it increased.

The final report will be published in 2023, reflecting the discussions from this session.

IGF 2022 Open Forum #84 Digital Education and the Future of Women’s Work

Updated:
Connecting All People and Safeguarding Human Rights
Session Report

As part of the 17th UN Internet Governance Forum, held in Addis Ababa from November 28 to December 2, the BMZ Digital Transformation Centers (DTCs) together with the global project “Future of Work” organized an open forum on Digital Education and the Future of Women’s Work on November 29. The session invited experts from a broad range of countries and backgrounds to a panel discussion on the opportunities and challenges faced by women in the digital economy and the role of digital education in equitable access to IT-enabled jobs in the future.

Sabina Dewan from the JustJobs Network India and Prof. Kutoma Wakunuma, partner of the DTC Kenya, shared findings from their research on women’s experience in the digital economy, learnings on good practices, and policy implications for ensuring new digital opportunities benefit women equally. Their discussion was followed by practical experiences shared by Hannah Adams from Harambee Youth Accelerator in Rwanda, Salih Mahmod from Mosul Space in Iraq and Yayaha Amsatou from the SmartVillage project in Niger. All panelists highlighted the need to leverage women’s opportunities in the digital economy, while universally agreeing on the many obstacles female economic empowerment continues to face.

Participants off- and online engaged in the discussion and emphasized the importance for policymakers in the global south of ensuring requisite levels of basic education as an essential starting point for effective digital skills development and future-oriented competencies. At the same time, participants called for more inclusive policy reforms in education systems to ensure access for people with disabilities and other vulnerable groups.

In conclusion, the session identified several key takeaways from the panel discussion and the subsequent round of questions: While digitalization and new technologies are expected to provide new employment opportunities, their positive impact is often limited to select, highly educated groups with access to the internet and IT infrastructures. As a result, policies and regulations must consider the needs and interests of vulnerable groups such as women to guarantee equitable access to jobs and benefits in the digital economy.

Yet access to digital technologies is no guarantee of women’s economic empowerment and no end in itself. In fact, the lack of sufficient regulation threatens to exacerbate existing inequalities. There are no one-size-fits-all solutions; rather, policies need to cater to women’s needs and consider their country-specific context to empower women in the labour market as well as in their homes and communities.

Socio-cultural factors prevent women from realizing their full potential and limit equitable access to jobs in the digital economy, as social expectations and unpaid care work continue to bind women to traditional reproductive roles and routine jobs. Experts from India, Rwanda, Niger, Kenya and Iraq agreed that governments must prioritize job creation and commit to reducing the gender skills gap by creating education systems that promote digital skills for younger generations. For that, it is crucial that training programs are demand-oriented and offer clear pathways to jobs to ensure female participation. As shown in the case of Rwanda, female role models can further promote women’s participation in skills trainings and increase their participation in digital markets. Meanwhile, governments should also incentivize the hiring of women and support female entrepreneurs in the digital economy.

In addition, government action must include public awareness campaigns to make digitalization more accessible in society and implement curricula that focus on digital literacy to promote key digital competencies. For that, increased investment in education is needed, and gender-disaggregated data in national statistical systems must inform new regulations as well as the design of systems evaluating women’s reproductive roles. As demonstrated by the example of Niger, in rural areas where digital literacy rates and access to digital financial services are particularly low, more collaboration between the private and public sectors is needed, as well as better stakeholder engagement, especially including women, to promote women’s financial and digital literacy.

IGF 2022 WS #69 Governing Cross-Border Data Flows, Trade Agreements & Limits

Updated:
Key Takeaways:

The risk of not developing a minimal common international approach to governing cross-border data flows is high; bilateral or regional trade agreements are not adequate to address cross-border data flows.


The approaches adopted by China, the USA, India, ASEAN, and the EU towards cross-border data flows differ, both in their degree of stringency and in their priorities (e.g. national security, free trade, privacy). While this fragmentation increases barriers to trade, harmonising the approaches would risk disregarding national interests and different degrees of development.

Calls to Action

Comparability mechanisms could be a possible solution, but different degrees of development and digital capacity need to be brought into the equation.


Bilateral and regional agreements are important but not adequate; minimal global rules are needed to minimise costs to businesses.

Session Report

The workshop drew an audience of around 120 onsite and online participants. Six experts from China, Europe, India, Singapore and Africa shared their analyses.

Dr. Chin and Mr. Deng explained that China’s cross-border data governance approach is based on the values of sovereignty, personal data protection and national security. The Chinese authorities are trying to strike a balance between safeguarding national security and promoting international trade. The Data Security Law and the Personal Information Protection Law govern China’s cross-border data exports. The vast majority of data can be legally exported after meeting requirements such as security assessments, security certifications and standard contracts. However, it is prohibited to provide any data stored in China to any foreign authority or foreign judicial authority without approval from the Chinese Government; this is a challenge for the Chinese authorities to solve. The new Data Export Security Assessment Measures have provided some legal certainty by defining four categories of personal and important data that require a prior security assessment of the risk. This formulation moderately references the practice of general and national security exception provisions in many regional FTAs. On the other hand, China’s regulation of cross-border data transfers is still progressing rapidly; China focuses much more on outbound data, with few regulations in terms of inbound data transfers. China’s cross-border data rules will inevitably affect the direction of international cross-border data rules, so engaging with China to become part of a community crafting shared norms and rules for the future is inevitable.

Prof. Locknie Hsu explained ASEAN’s approach to governing cross-border data, which involves a set of principles, a set of frameworks, and a set of very useful tools to enable data flows to be smoother and facilitated. These include the ASEAN data management framework, the ASEAN cross-border flows mechanism (consisting of the ASEAN certification for cross-border flows and cross-border contractual clauses), and the ASEAN e-commerce agreement. For ASEAN, there are additional layers of data concern: consumer protection and cybersecurity.

Prof. Rolf H. Weber pointed out that within the 27 Member States of the European Union, cross-border data flows are not really restricted, or only to a minimal extent, since it is the practice of the European Union to maintain a free, liberal and open space for businesses and for civil society. There is an important distinction between personal and non-personal data. As far as non-personal data is concerned, there are not many impediments to cross-border data flows. As far as personal data is concerned, cross-border data flows are subject to compliance with relatively strict rules. As far as cybersecurity is concerned, the EU has a regulation on the security of important network infrastructure and a cybersecurity strategy, but to a large extent cybersecurity remains a national domain. Comparing the European approach with the Chinese approach, the rules in the European Union are less strict and more open. Cross-border data flows should be legal, free, and secure, and there should be more rules at the global level.

Dr. Mansi Kedia pointed out that India’s model is going to be somewhere in the middle, using bilateral arrangements, informal or formal arrangements, or a model at the global level that would address its concerns about national security and access to data in instances of cybercrime. India is much more conservative. India has seen a series of data localization norms that started very early on with the public sector and the financial and banking services sectors. This was followed by consultations and deliberations on a privacy bill, whose first and second versions had hard localization policies and requirements. This strong need for sovereign control of data is also reflected in India’s positions in several multilateral and bilateral settings. A recent softening of the stance reflects a recognition that cross-border data flows have economic costs and will help trade and digital businesses. But the government has argued in favor of data localization because of the dominance of big data and the monopolization that might take center stage in the days to come. The other economic argument is that data localization will force companies to set up data centers in India, which could lead to a domestically created digital economy, so that India won’t depend so much on foreign companies for digital services. India also wants broader jurisdiction over citizen data, including extraterritorial jurisdiction.

For the roundtable question:

What are the possibility of cooperation and collaboration and where are the challenges? 

Dr. Yik Chan Chin: First, there is a convergence on privacy protection as the foundation for data export, although how strongly privacy is protected varies between countries. Secondly, many countries also put national security among the preconditions and important exceptions in FTAs. But there is important divergence amongst national governments in how they define national security.

Prof. Locknie Hsu: National laws are fragmented in how they deal with data localization, and data transfer rules also differ. It is important to be mindful of businesses’ needs in compliance: understanding what the boundaries are, how to navigate the rules, and what they can and cannot do. The comparability mechanism can help bridge some of the gaps between international, regional and global rules.

Prof. Rolf H. Weber: We do have special instruments that can be applied if an equal level of protection is to be achieved.

Dr. Mansi Kedia:  We shouldn't worry about the fragmentation at this point.  Each country will work out a way to get to this when they're ready. 

Mr. Zhisong Deng: China is the second-largest economy in the world and exports a lot of goods and services; more dialogue among the major jurisdictions is needed to figure out how to promote free trade.

Ms. Linda: Africa has a general framework on trade, the Africa free trade agreement, which came into force in 2021. The power of big tech in Africa is a concern.

For the roundtable question:

What are the risks of not developing a non-national approach? Would it lead to a divide, or would bilateral trade agreements be enough to continue to use for both trade and digital sovereignty?

Dr. Yik Chan Chin: There is no global agreement, which is not an ideal situation, but there is a co-existence of different regional agreements and mechanisms. Ideally, we would want a digital agreement at the global level. At the moment, progress in the WTO e-commerce negotiations is slow; the hope is that a minimal framework develops out of these discussions.

Dr. Mansi Kedia: Harmonization of cross-border data flows is important, but other policies will need to address the problem of the digital divide before harmonization can be reached. Otherwise, it will completely break down digital economic engagement across different countries.

Prof. Rolf H. Weber: Minimum harmonization would certainly lower the administrative costs of internationally active entities.

Prof. Locknie Hsu: There is a question mark over what the scope of these exceptions should be, and, as mentioned before, over how to clarify what businesses need to know and what they can and cannot do.

Mr. Zhisong Deng: Without a common international approach, there are three risks: trade barriers; localization requirements that increase the burden on global enterprises; and a loss of maturity in the digital economy. We can start by reaching more bilateral or regional agreements. The core issue is that all countries should respect the national and security interests of other countries.

Ms. Linda: In Africa, most data protection laws were enacted post-GDPR and mirror the GDPR; this sometimes did not work because of different contexts, different budgets, and different political structures. We need different ways of looking at data governance, asking where the public interest lies and where national security lies. How do we achieve data sovereignty without data localization? We need to balance tools and agree on that as we proceed to the global conversations, because those differing points of view really have people in their own corners, deciding on their own data governance approaches.

IGF 2022 DC-CIV DC-CIV: Geopolitical neutrality of the Global Internet

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

Maintaining a generally neutral, connected and resilient core Internet infrastructure is of vital importance. Combating bad behaviors sometimes can be locally implemented (dropping DOS traffic, seizing abusive domain names, filtering “bad” content, detecting and filtering malware, spam and phishing).

,

The major problems today are at the application layer, whether this is misinformation, disinformation, CSAM, phishing, surveillance, etc. Finding ways to create incentives for good practices and to discourage bad ones is a challenge. When incentives don’t work, we need ways to hold badly acting parties accountable. This will require international cooperation in cases where harms are inflicted across jurisdictional boundaries.

Calls to Action

Stakeholders should study the process and suitability by which sanctions are (a) decided, whether as a result of UN processes or developed in a multistakeholder way, and whether sanctions are indeed the right response given that they might damage the Internet, and (b) implemented, or are indeed implementable.

,

Stakeholders should weigh the above against fierce opposition from some countries and stakeholders to any kind of damage to the Internet and its Core Values; sanctions should thus be limited, proportional, and carry minimal undesirable side effects, as well as being procedurally clear, multistakeholder, and transparent.

Session Report

 

MAIN REPORT

 

The Internet is a Network of Networks: global not only in a geographical sense, but in several shades of the term 'global', being free of cultural, ideological and political bias, and global in terms of the technologies that converge into it. On many occasions, the decision taken has been to keep geopolitics separate from the Internet, since mixing the two would turn the Internet into two or more 'Splinternets' in place of the unfathomably valued One Internet.

 

Yet, recently, some leading members of the Internet Community signed a common statement "Towards the Multistakeholder Imposition of Internet Sanctions" - opening the door to the Internet Community having some means to decide on whether sanctions such as disconnection from the Internet would be appropriate. This Statement and background can be found on: https://techpolicy.press/towards-the-multistakeholder-imposition-of-internet-sanctions/

 

This brings forward the question of whether the Internet should be part of a sanctions regime.

 

In the session, panellists were asked the following questions:

 

  • Is the Internet technical architecture and infrastructure as currently defined able to impose sanctions?
  • Is the Internet management/administration as currently defined able/willing to impose sanctions?
  • Indeed, should it impose sanctions, knowing these would break Core Internet Values?
  • What path could the Internet’s Governance take in the future:
    • Is the future that of assuming a technical mission, simply maintaining the geopolitical neutrality of the Internet, or will it expand to reflect ways of the Internet bridging real-world geopolitical and cultural divisions to make One World, or Two, Three or more?

Panellists answered as follows (some comments submitted in writing by the authors themselves, some paraphrased from their interventions):

 

(Vint Cerf, Google)

  • Maintaining a generally neutral, connected and resilient core Internet infrastructure is of vital importance. Combating bad behaviors sometimes can be locally implemented (dropping DOS traffic, seizing abusive domain names, filtering “bad” content, detecting and filtering malware, spam and phishing). 
  • The major problems today are at the application layer, whether this is misinformation, disinformation, CSAM, phishing, surveillance, etc. Finding ways to create incentives for good practices and to discourage bad ones is a challenge. When incentives don’t work, we need ways to hold badly acting parties accountable. This will require international cooperation in cases where harms are inflicted across jurisdictional boundaries. 
  • It is important that attempts to apply sanctions do so proportionately and follow the principle of subsidiarity. It is a mistake to apply sanctions at the wrong layer in the architecture. For example, shutting down the Internet to deal with bad behavior by some parties is an overreach that creates a lot of harm for those innocently relying on the operation of the network. 

 

—-------

(Bill Woodcock, Packet Clearing House)

  • The implementation of sanctions via Internet means currently faces two principal challenges: On the network side, network operators typically under-comply or over-comply, due to difficulties in appropriately scoping enforcement actions. On the governmental side, sanctions regimes are not typically published in a uniform, consistent, or machine-readable format, they’re not published in a single predictable location, and they’re not harmonized with other regimes. 
  • Many very specific implementation issues exist as well, starting with governments’ predilection for transliterating foreign-language or foreign-character-set names of sanctioned entities in diverse and inconsistent ways, rather than using the most-canonical form of each name, in its native language and character set. Network operator implementation has been occurring within the Sanctions.Net community since March of 2022, and governmental harmonization efforts have been occurring principally within the Digital directorate of the OECD.
  • Most conversation about Internet sanctions implementation has been positive and collaborative, since governments wish to see their sanctions regimes respected, and network operators wish to comply with the law and protect their customers. Dissenting voices have questioned the legitimacy of sanctions regimes from both the right and the left, principally fearing governmental overreach.

 

—--------

(Veronika Datzer, Advisor at German Parliament)

  • It is impossible to keep politics out of the internet, because politics is already in it; this process cannot be reversed. We therefore need political solutions, because the technical infrastructure of the internet must remain neutral.
  • Solutions to making the internet a peaceful place must not include internet sanctions as these impact all people and can have dramatic adverse consequences. They must be based on a multistakeholder model and co-create what it means to establish a peaceful internet, as such an understanding should not be imposed.
  • We need close cooperation between the technical community and the political community. 

 

—--------

(Iria Puyosa, Toda Institute)

  • The global multistakeholder governance ecosystem should center the protection of human rights to safeguard internet core values. Sanctions against States that violate international law may be necessary in cases of widespread human rights violations or credible allegations of crimes against humanity enabled by  State agencies' internet usage.  
  • The global internet may need to create a multistakeholder policy advisory body that provides guidelines on targeted sanctions that may be enforced if necessary. Nonetheless, sanctions must be targeted, specific, and proportional. Also, a robust and reliable due process must be established for making these decisions 
  • The establishment of rules and processes to define and enforce sanctions should not be decided by a small number of governments (such as those belonging to the OECD). The policy formulation process should involve countries from different regions of the world. Otherwise, the sanctions regime may be considered unilateral and provide an excuse for the "sovereign internet" model leading to the splinternet.
  • All countries are working on their own model of sovereignty and the Internet; some are taking an approach completely different from the model we are used to, the open, free Internet.

 

—------

(Bastiaan Goslings, RIPE NCC)

  • It is not in the mandate of technical organisations like RIPE, nor within the policies that determine how these organisations are run, to make decisions on sanctions. Policies are set by multistakeholder communities across an entire service region, which includes many jurisdictions. If there are sanctions, they need to be decided following due process and in a democratic fashion, demonstrating that the sanctions are proportionate to the goals to be achieved. Economic sanctions are set by the European Union.
  • RIPE has no authority to actually enforce what they are doing as it operates as a trusted technical organisation, a neutral authoritative entity in this case, but no enforcement power of any kind. Networks using the RIPE database operate on Trust.
  • Anyone can decide they do not trust this system and operate their own registry. From that perspective, it is a vulnerable system.

 

From the Floor

  • The way to protect the Internet is to isolate the Internet from politics completely and emphasize to the parties that are geopolitical that this is not a geopolitical space and politics has to be out of Internet governance in order to protect the Internet. (Sivasubramanian Muthusamy, ISOC Chennai)
  • As a national regulator, when it comes to technical issues or technical harm, I guess from a technical point of view it is easy to spot something going wrong or to spot harm, and to stop it one way or another. But nationalisation of the Internet is a concern: how can we stop that? Secondly, it is already too late to keep politics out of the Internet, as it is already used by politicians. (Hadia El Miniawi, Egypt Telecom Regulator)
  • It can't be Germany. It can't be the European Union. And it also can't be Google. We all have to be included in it. That is why the IGF is so important: it is multistakeholder, and its process is completely inclusive of all states and big companies, at least. (Veronika Datzer)
  • There is no such thing as a splintered Internet. Once it is splintered, it is not the Internet. We have to convey to governments and leaderships that we can't splinter the Internet. If you start imposing sovereign law, you will start removing functions of the Internet, totally or significantly. The Internet Society, myself and others have created frameworks where you can see how different policy proposals and treaties work with the Open-Ended Working Group or the Group of Governmental Experts in the UN. They show how the proposals can be commensurate with the Internet or not. With that knowledge, we have to work on a multistakeholder basis first. (Alejandro Pisanty, UNAM)
  • From the Global South perspective, what we need the international community to do is actually work on the basis of humility, empathy and solidarity rather than a punitive approach, which could be prone to bias or to political and economic agendas. Because whenever political and economic agendas roll the dice, the reality we have on the ground is that those who are the most vulnerable, least responsible and least involved in great-power competitions are the ones that suffer. …we now have the Sustainable Development Goals and these need to be the priority of the international community…. So we very much understand that there are some trends related to the use of the Internet for political, international security and geopolitical issues. But this trend must stop. And we must use the Internet to achieve the Sustainable Development Goals. (Tulio Andrade, Brazilian Ministry of Foreign Affairs)
  • The important issue of the geopolitical neutrality of the global Internet should be reflected through the Global Digital Compact. The first suggestion is the development of an internationally legally binding (?) for cybersecurity based on international law. The second is the establishment of a framework, rules and norms for accountable behavior of digital platforms and service providers in data security and content law. Another is defining a common region for the Internet as a peaceful and development-oriented environment for the public good, not as a new battlefield and militarized environment, through the signing of a global Declaration by all members. The last one: the internationalization of the Internet and its public core as a trust-building measure could help the global Internet be geopolitically neutral. (Amir Mokabberi, Tehran University)
  • We have to have an agenda to continuously help the public officials to give them knowledge and help them determine what they think from a public interest perspective they have to do. It is useful to distinguish the core of the Internet, the functionalities, the numbering, naming, routing systems as opposed to everything that happens on top of the Internet. (several participants)
  • What we need is a space where politics can take place forever without destroying the structure, the Internet itself. The Internet is not one monolith, as Bill emphasized; there are lots of networks. There is a great incentive for governments to recover the mantras we have had forever on the Internet side: connectivity is its own reward; you lose more than you gain when you lose connectivity. So let's start pushing for outcomes at the multilateral level that are compatible with what has been happening on the technical and multistakeholder sides for so many years. (Alejandro Pisanty, UNAM)

Summary by Olivier Crépin-Leblond - 28 December 2022

 

 

IGF 2022 Open Forum #108 Combatting Disinformation without Resorting to Online Censor

Updated:
Enabling Safety, Security and Accountability
Session Report

Combatting Disinformation without Resorting to Online Censorship – Open Forum organised by LATVIA

Date, time, venue:

30 NOV 2022, 10:50 UTC, Caucus Room 11

Moderator:

Viktors Makarovs, Special Envoy on Digital Affairs, Ministry of Foreign Affairs of the Republic of Latvia

Speakers:

Ms Anna Oosterlinck, Head of the UN team, Article 19; Mr Allan Cheboi, Senior Investigations Manager, Code for Africa; Mr Rihards Bambals, Head of Strategic Communications Coordination Department, the State Chancellery of Latvia; Mr Lutz Guellner, Head of Strategic Communication, Task Forces and Information Analysis Division, European External Action Service; Ms Melissa Fleming, Under Secretary General of the United Nations for Global Communications.

Main information about the panel:

Disinformation is a major threat to public safety, security and democratic stability. Governments around the world fight disinformation in different ways. Right now we see a particularly concerning trend for online censorship by some governments as a way to address real or presumed threats posed by disinformation. By applying censorship, governments take away or limit citizens’ freedom of speech and expression. Online censorship also often goes hand in hand with information manipulation.

The main panel discussion focused on identifying ways for governments, organizations and platforms to address disinformation without resorting to online censorship, bans or internet shutdowns. There was broad agreement that disinformation is most effectively addressed by means of holistic and multifaceted policies, and that this approach is also rights-compatible. Successful implementation of this approach must be based on a conceptual framework that identifies and defines the challenges. It starts with a critical examination of the true cause and context of disinformation and of the risks it presents to society at large. Free and independent media and better public communication are the best tools to fight disinformation. A free, open, safe and secure online environment is most resilient to disinformation.

Key points by each speaker:

Allan Cheboi: Misinformation is false information shared without intention to do harm, while disinformation is false information used to harm or influence other people’s decisions or thoughts. Disinformation is also used to gain power. In most cases, misinformation misrepresents facts, while disinformation is centred around a narrative. To address disinformation, we need to customize laws to include information monitoring, specifically for local disinformation attempting, for example, to influence the outcome of elections. We need to make substantive investments to cope with the challenge. Another alarming development is disinformation targeted at United Nations (UN) peacekeepers, which needs to be tackled swiftly and decisively.

Rihards Bambals: Disinformation is false or misleading content disseminated on purpose to mislead and to gain political benefit. Disinformation is a global man-made disaster that is hazardous and influences vulnerable people. Latvia addresses the challenge with centralized information environment monitoring capabilities covering both traditional and social media. Latvia’s strategy is based on three pillars: effective government communication, quality of independent journalism and media, and societal resilience. It is of utmost importance to invest in media and information literacy. Governments need to strengthen citizens’ capacity to think critically and to recognize and report disinformation cases. Some governments invest billions in spreading disinformation. One example is the Russian Federation’s massive disinformation campaign accompanying its military aggression in Ukraine.

Lutz Gullner: First, we need to define the problem and distinguish between misinformation and disinformation. Misinformation is false information shared with no intention; disinformation is based on a clear intention. Disinformation can be used as a way to gain economic benefits. There are five characteristic elements we need to look at when identifying disinformation: harmful, illegal, manipulative, intentional, coordinated. The European Union uses the ABC model, which stands for A – actor, B – behaviour, C – content. This is a technique for distinguishing disinformation and identifying actors that are trying to manipulate information. The approach allows governments to prevent censorship and look at given information in an objective manner.

Anna Oosterlinck: Disinformation must be seen in a wider context, including: reduced pluralism and reduced diversity of the information that we can access online; challenges connected to the digital transformation of media; and underlying social causes, including economic and social inequalities leading to mistrust and polarization. All these factors combined create an environment where disinformation can flourish. Disinformation has been addressed in some laws as restrictions on false statements of fact that can cause substantial harm, or in laws on election fraud, misleading advertising, or the sale of certain products.

We need to fight disinformation with a number of positive, holistic measures taken by a range of actors. To fight disinformation, we need a free and independent media environment and strong protection for journalists and media workers online and offline; comprehensive right-to-information laws must be implemented, including by complying with the principle of maximum disclosure of information and by proactively releasing information of public interest. Governments should not spread disinformation themselves; they need to ensure connectivity and access to a free Internet; invest in digital media and information literacy; adopt positive policy measures to combat online hate speech; and work with companies to make sure they respect human rights.

Melissa Fleming: Back in 2018, the UN found that disinformation and hate speech online played a significant role in stoking horrific atrocities against the Rohingya population. They pushed ordinary citizens to commit unspeakable acts. Similar stories have emerged in many other conflict settings; recently in Ethiopia, for example, Facebook posts spread hate and inspired attacks. In Ukraine, information is also being used as a weapon. Meanwhile, in Ukraine's neighbouring countries, we're seeing how the spreading of lies about refugees brings more suffering to the most vulnerable.

Free speech is not a free pass. In the era of mis- and disinformation, free speech is much more than the right to say whatever you want online. Platforms must face the fact that they are constantly being abused by malign actors; they must live up to their responsibility to protect human rights and save lives. The United Nations is constantly engaging with the platforms and advocating the need for them to do their due diligence on human rights and to review their business models against the UN Guiding Principles on Business and Human Rights. The platforms should offer a robust framework to reduce the dissemination of harmful falsehoods, as well as establish remedy mechanisms.

The UN “Verified Initiative” succeeded in getting accurate lifesaving information to communities around the world during the COVID-19 pandemic. The UN is also working to strengthen the capacity of social media users to identify and avoid falsehoods by promoting media and information literacy and by creating its own teaching tools. The UN has launched two free online digital literacy courses on mis- and disinformation in collaboration with Wikipedia. The courses are available in multiple languages and are being taken by students of disinformation all over the world, hopefully improving their ability to identify mis- and disinformation and not become part of the spreading cycle.

The UN also encourages governments to promote various measures to foster free flow of information, enhance media diversity and support independent public interest media as a means of countering disinformation.

Summary/Conclusions:

  • Disinformation is false information put in circulation to intentionally do harm or to gain political or economic benefits. 
  • Disinformation can and should be addressed without resorting to censorship.
  • We need to keep developing our conceptual framework on disinformation and related phenomena.
  • Policies to address disinformation should focus on fostering a free, open, safe and secure online environment and on strengthening free and independent media.
  • The online platforms need to improve their efforts to better address disinformation.
  • It is important to strengthen citizens’ ability to identify and counter disinformation by investing in digital and media literacy programmes.
IGF 2022 Town Hall #109 Jointly tackling disinformation and promoting human rights

Updated:
Key Takeaways:

There is no "silver bullet" to tackle disinformation. Addressing the issue requires multiple complementary measures, such as effective fact-checking, building digital literacy for all (including the most vulnerable), holding those who profit from disinformation accountable, and the involvement of all stakeholders.

Calls to Action


Session Report

The session brought together over 60 organisations for an open exchange of ideas, experiences, and lessons on how to address disinformation through a multi-stakeholder and human-centric approach. In particular, the debate focused on how Africa-Europe partnerships can help tackle the issue, in light of the AU-EU D4D Hub’s mandate to foster digital cooperation between both continents.

The panellists explained how fact-checking has grown dramatically in recent years, becoming one of the most common measures to tackle disinformation. Nevertheless, more than effective fact-checking is needed, they warned. Africa-Europe cooperation should adopt a comprehensive approach integrating multiple complementary measures, such as building digital literacy for all (including the most vulnerable), holding those who profit from disinformation accountable, and the involvement of all stakeholders in devising solutions.

Bringing all actors to the table

Simone Toussi, Project Officer for Francophone Africa at the Collaboration on International ICT Policy for East and Southern Africa (CIPESA) highlighted how “disinformation is a multi-faceted phenomenon that directly threatens democracy and human rights and affects all stakeholders in society”.

“Disinformation manifests in many ways, and can be perpetrated by a diversity of actors,” she added.

As such, she argued that countering fake narratives needs both online and offline efforts undertaken in a coordinated manner by governments, intergovernmental organisations, civil society, media, academia, and private sector. “Multi-stakeholder collaboration is crucial to bring together different views and understanding of the roles that each actor plays,” she said.

Toussi presented research findings showing that measures to tackle disinformation can be ineffective or inadequate when they consider only the point of view of a single stakeholder. For example, fact-checking is sometimes challenged by lack of access to information. Media and civil society participation can help ensure that governments treat information as a public good.

Engaging with the private sector… how?

The debate also touched on the essential role that technology companies play in keeping disinformation from spreading. Ongoing efforts by the private sector include partnering with civil society and fact-checkers — including through multi-stakeholder collaborations such as those proposed by Toussi.

Nevertheless, for Odanga Madung, journalist and Mozilla Foundation fellow, such measures are not enough. He argued that one of the major contributing factors to disinformation is that fake or misleading information is algorithmically amplified by big companies.

“Big companies and social media platforms profit from the spread of disinformation. It is part of their business model, which is a very serious problem,” he said.

For Madung, tackling disinformation requires strong regulations to protect users and their rights, addressing big technology companies’ dominance, encouraging competition, fostering new ideas on different business models, and decentralising the Internet.

Planting the seeds of change

Charlotte Carnehl, Operations Director at Lie Detectors, proposed further investments in training teachers and fostering exchanges between journalists and school-age kids: “Countering the corrosive effect of disinformation and polarisation on democracy requires empowering school kids and their teachers to tell facts from fake online.”

She argued that enabling journalists to visit schools to explain how professional journalism works is a win-win situation. It can help journalists to learn about how the younger generation accesses and consumes information, while teachers and children can gain practical skills in identifying fake or misleading information online.

“Everybody needs the skills to assess and critically think about information,” Carnehl said. “Kids are actually a high-risk group for disinformation because they are targeted on channels that can’t be monitored, and they are largely navigating them by themselves without their teachers or even their parents present.”

When questioned on the short-term impact of such measures by a member of the audience, Carnehl acknowledged that it’s a long-term investment, “like planting the seeds of a tree”. However, she argued that there are also some immediate positive effects for children.

Finally, Carnehl called for special attention to be paid to marginalised groups, such as rural populations. Civil society organisations could help ensure that everyone can access reliable information, she said.

IGF 2022 WS #341 Global youth engagement in IG: successes and opportunities

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

There are existing youth initiatives that put a lot of effort into bringing Internet governance closer to fellow young people. Newcomers to the IGF community could learn and benefit from joining them.


In recent years, youth-led and youth-oriented capacity-building programmes have developed strongly and grown in influence.

Calls to Action

There is a need to build platforms of cooperation between different youth initiatives in order to achieve common goals more efficiently.


There are still some gaps in the inclusiveness of the IGF (for example, language barriers) that young leaders should be aware of.

Session Report

Firstly, Emilia Zalewska opened the session and presented its objective, which was to present youth initiatives and opportunities existing in the IGF community.

Then, the first speaker, Nicolas Fummareli from the Youth Coalition on Internet Governance (YCIG), talked about the activities of his organisation, for example forming 5 working groups of young people from all over the world that prepared 13 session proposals for the IGF (11 were selected), conducting a series of preparatory webinars before the IGF, and involvement in regional NRIs’ initiatives.

He also encouraged the participants to take part in upcoming elections to the next YCIG’s Steering Committee.

The second speaker, Veronica Piccollo from the Youth Standing Group, explained the origins of her initiative, which began in Brazil. The objective was to foster the participation of young people from Latin America in Internet Governance. The initiative has now been recognised by the Internet Society as a standing group. It collaborates closely with the YCIG. One of the current activities of the Youth SG is creating a new edition of the Youth Atlas, which will map the involvement of young people in IG.

The third speaker was Athéna Vassilopoulus, representing Generation Connect Europe (GC), ITU. She said that her group aims to engage young people in the activities of the ITU. In its first year, the GC created a youth declaration. The next year it worked on preparing the Digital Youth Jam event. It has also participated in the youth summit in Kigali, Rwanda. GC is currently restructuring the group and will issue a call for new participants in January.

The fourth speaker, Piotr Słowiński from the NASK National Research Institute, described the institute’s role in promoting youth in IG and cybersecurity. This work started in 2020 and is ongoing. Participation in the Global Youth Summit in Katowice, prepared by Youth IGF Poland in cooperation with NASK, was massive and allowed youth and experts to connect.

The fifth and last speaker, Melaku Hailu from the Model African Union, said that their initiative is the second model to have created a Women, Gender and Youth Directory. They run simulations of the Peace and Security Council of the African Union. Their model addressed the SDGs through three pillars: social, economic, and environmental.

The speakers’ inputs were followed by a discussion with the participants of the session. They expressed the need to push youth engagement forward and to find more people motivated to engage. It was highlighted that more attention should be paid to advocacy work. There was also a comment that the IGF is conducted in English, which is a barrier to its inclusiveness.

IGF 2022 WS #420 Skills of tomorrow: youth on the cybersecurity job market

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

There are gaps between what the cybersecurity industry expects from graduates and what graduates can offer. These disparities especially concern women and youth.


The cybersecurity job market needs to be more open to newcomers — for example, by not requiring much experience for entry-level positions — and to invest in finding and training new talent.

Calls to Action

There is a need to increase collaboration between government, academia, and the private sector to equip young people with the skills necessary for the job market.


Children should be taught early about career opportunities in the cyber industry and have chances to learn some basic skills, for example at boot camps.

Session Report

The session began with 2 online polls conducted via the online tool Mentimeter. The moderator, Emilia Zalewska, asked the audience for their predictions on the following questions: What are the top two skills that employers in the cybersecurity sector say they look for? Which types of cybersecurity jobs are most difficult to fill? On the first question, the audience mostly answered correctly that the top two skills are "Problem solving and teamwork". However, on the second question, most respondents chose "Cloud backend engineer" as the job most difficult to fill, while the correct answer was "Cybersecurity manager".

After that part, the floor was taken by Teuntje Manders. She presented the audience with the ISC3 research project and its results, on which the 2 poll questions were based. The study involved surveying leaders in the cybersecurity industry to obtain data on what needs exist in this sector's job market.

This was followed by the next part of the session: a discussion among the speakers. Firstly, Nancy Njoki Wachira conveyed to the international community that she is very happy that these types of projects take place and that they will help young people pursue a career in cybersecurity.

Mohammad Ali Januhar commented in turn on the technical layer, related to how the industry works. He then reflected this on the market and its needs, in terms of employing young people.

Samaila Atsen Bako drew attention to universities and schools and whether they adequately prepare people to enter the cyber security profession.

Anna Rywczyńska spoke about the issue of access to development opportunities in this matter and the ease of access to information and training opportunities.

Samaila Atsen Bako also commented on the problem of the experience that business requires from candidates for cyber security positions.

Following these remarks, the floor was once again taken by Anna Rywczyńska. She referred to the insights as well as the issue of women in the industry.

Before the second round of discussion, the moderator invited questions. There were questions from the audience:

1. What can we do to address the cultural barrier to accessing the cybersecurity job market?
The questioner argued that, at least from an African perspective, parents of young people do not share their enthusiasm for working in the cyber industry. The question was addressed by all panellists. It was pointed out that the biggest problem is reaching out to young people to promote knowledge of the industry.

2. Another question was asked by an online participant. After also pointing out the cultural aspect and his own family experience, the questioner posed this hypothesis: if you acquire the right skills, will you earn enough? He therefore asked the provocative question of whether it is not the young who should support their younger siblings in choosing the right path in terms of working in the field of new technologies. The question also alluded to the question from the audience which referred to cultural issues.

3. Many people, including those in governing circles, do not understand cybersecurity, so how do we require them to support knowledge in this area?

4. Where are the platforms for sharing knowledge, for acquiring knowledge about cybersecurity?

Some members of the audience presented their statements as input to the discussion:
1. Schools do not teach practical IT issues. We don't learn at school how to deal with online threats, for example.
2. Creating additional communities in schools and universities helps to understand the complexity of cyber security and also to get non-technical people interested in the subject.
3. It is necessary to help educational establishments outline the needs of young people so that they can develop their interest in cyber security.

After a round of question collection, the panellists began to respond to the questions and statements from the audience. Anna Rywczyńska and Mohammad Ali Januhar tried to address the issues that were flagged up in the room. Samaila Atsen Bako and Nancy Njoki Wachira complemented the replies with their observations: parents want to protect their children, so they don't envision their children's careers as technology experts, because they themselves don't know how to guide their children in this world so as not to make them addicted to technology or cause them to use it for bad things. This causes a cultural blockage. There is a need to create an environment that promotes and facilitates access to knowledge on cybersecurity.

IGF 2022 WS #292 connectivity at the critical time: during and after crises

Updated:
Connecting All People and Safeguarding Human Rights
Session Report

 


The session was moderated by Innocent Adriko, who gave background information and an introduction to the session.

Ethan Mudavanhu started by acknowledging that everyone has a part to play when it comes to preparedness before and after a crisis. He said civil society’s role might be to provide strategies on how to minimize damage to critical infrastructure. He also mentioned the need to define the roles of each stakeholder around national emergency telecommunications plans and to integrate those plans into climate change and adaptation policy priorities.

He also mentioned that the government has a role in regulating future technologies for emergency preparedness and the resilience of internet connections. He added that this could be achieved by envisioning satellite technology and the Internet of Things (IoT) as part of the emergency system, to be used as search-and-rescue alternatives.

He again mentioned that partnership among the government, the private sector, and civil society is important to create better solutions and activation protocols. He gave examples of partnerships that helped to mitigate damage and ensure connectivity. One example was in Australia, where the Stand program, a disaster satellite service funded by the government, was the first to be rolled out to strengthen telecommunications against natural disasters. He urged Africa to consider a combined effort of terrestrial and extraterrestrial connectivity going forward.

Shah Zahidur Rayamajhi noted that connectivity issues might be looked at by national and local authorities, as each country has National Disaster Management Organizations. These disaster management organizations include ICT providers, ISPs, humanitarian organizations, and civil society organizations. He also noted the need to evaluate connectivity options relevant to the population affected by a disaster.

He also mentioned internet connectivity solutions for the affected population and designing solutions depending on each context, considering different needs such as human, socio-cultural, economic, and affordability factors. He added that connectivity solutions bundling internet, voice, SMS, and other data services are understood to be the best solution. He noted the need to deploy VSATs and Wi-Fi equipment through local partners or other stakeholders who may help in the deployment and service restoration process.

He highlighted the need to have portable connections and to ensure the privacy and protection of data services provided to the affected populations. He again mentioned that after the emergency, the restored service has to be maintained and operated for a certain period, because the telecom service provider might be affected and unable to provide services to the community. He suggested that community networks can be deployed to support connectivity efforts during and after crises.

Caleb Kwabena Ayitey Kuphe, on his side, started by defining critical infrastructure as a system and an asset, whether physical or virtual, in our digital world. In terms of crisis, there is a need to understand the three elements of critical infrastructure: the physical, the cyber, and the human. We must also understand the effectiveness of critical infrastructure for ensuring the effective functioning of the economy, as that is an essential factor in determining the location of economic activities or the sectors that can develop in a country.

He highlighted that developing countries need to understand the framework and plan for the infrastructure system and know how to protect the critical infrastructure sector by understanding the risks involved and knowing where the vulnerability lies. He also added that collaboration between stakeholders like governments, the private sector, and civil society is important. He said the government and the private sector can help with some resources like routers, switches, computers, et cetera while civil society, academia, and the technology community can also collaborate to provide training to deploy the telecommunications infrastructure.

He again noted that developing countries must be able to resource personnel who understand emergency response. He referred to the collaboration between the Internet Society Ghana chapter and NetHope, which trained people in disaster management, and stated his belief that when we come together and provide resources for such people, we will be able to manage critical infrastructures. He concluded by stating the need for developing countries to double their current level of investment in emergency response projects.

Eileen Kwiponya started by noting that disasters can strike at any time, giving Covid-19 as an example of an emergency that can happen anywhere and at any time. She said the pandemic brought the world to a standstill, which resulted in the closure of companies, some of which had to let their employees work from home. She highlighted that those who worked from home needed laptops and internet connectivity, and partnerships between stakeholders helped to solicit resources to enable the continuity of work and daily life. She added that schools had to continue running, and the government worked with organizations to solicit resources such as laptops and to provide internet connectivity in schools to enable students to learn.

She also noted that the ITU plays a critical role in disaster risk reduction and management by supporting its member states through the design of national emergency telecommunication plans, with the goal that by 2023 all countries should have such a plan as part of their national disaster risk reduction strategies. She added that low-income countries are left out when it comes to having a national emergency telecommunication plan, and it will be difficult for a country to manage a disaster without one. She stated that it is important for governments and local stakeholders to come together and develop emergency telecommunication plans to enable them to respond to disasters without needing to seek help from outside.

She noted that civil society can also contribute by creating awareness and providing capacity building through training communities in emergency response for them to be able to respond to emergencies when they strike. She highlighted that funding is a critical aspect when managing a disaster.

Ernestina Lamiorkor Tawia noted some of the basic things that the telecom sector does during a crisis. She highlighted that during a crisis, the telecom sector helps people stay safe, connected, and informed by notifying them of occurring disasters and of where disasters have hit. This, she said, helps to save lives.

She acknowledged the need for backup for critical infrastructure, as communities should not depend solely on telecom companies for their communications, since no one knows what will happen to the infrastructure during a disaster. She noted the need for long-lasting backup batteries and generators to power communications infrastructure in case a telecom infrastructure fails due to a power outage.

IGF 2022 DCPR Platform responsibilities in times of conflict

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

Although there may be convergences between different national legal regimes regarding platform governance, national regimes will inevitably diverge, fostering Internet fragmentation. Speakers highlighted the necessity of a multistakeholder agenda to enable the development of shared guidelines and procedures towards meaningful and interoperable transparency, for improving platform governance and law enforcement, and also to ensure users' rights.


One of the main points brought up by the speakers was the obstacle that fragmentation poses to the regulation of digital spaces, something that was addressed from the infrastructural to the rhetorical level.

Calls to Action

More transnational and national collaboration among independent regulators and authorities to foster platform observability;


Platforms should cooperate by providing continuous data access and information to researchers and independent investigators, not only by publishing transparency reports;

Session Report

The session aimed at discussing the impacts of platform regulation on internet fragmentation and how common standards and interoperability could help to mitigate it. The speakers stressed that to improve platform governance and enforce users' rights – especially regarding content moderation and possible impacts on fundamental rights – it is necessary to develop a multistakeholder agenda on shared guidelines and procedures towards meaningful and interoperable transparency.

The panel began with Prof. Luca Belli presenting the session and highlighting the impacts that both private ordering by platforms and national regulation have on internet usage. Prof. Belli also emphasized the great role that platforms and their regulation – or the lack of it – play in respecting human rights. Those remarks were followed by Yasmin Curzi's statement of the objective of the DCPR session, which is to explore how platform regulations are affecting internet fragmentation worldwide. She stated that "such regulations may be causing negative externalities for users and for enforcement". Some examples are data concentration, censorship, and conflicts of jurisdiction, among others. The speakers were encouraged to discuss possible guidelines to orient policymakers in creating a more trustworthy and inclusive digital environment that may be able to foster user control and interoperability.

In this regard, one of the main points brought up by the speakers was the obstacle that fragmentation poses specifically to the regulation of digital spaces, something that was addressed from the infrastructural to the rhetorical level. From the first moments of the discussion, two points can be highlighted: accessibility to information and the actual ability to use it. Oli Bird, from Ofcom, and Vittorio Bertola, from Open-Xchange, commented on the internet's infrastructural fragmentation, which can be observed from software programming to hardware operation, for example. Bertola highlighted that although states can contribute to fragmentation with shutdowns and similar measures, the big internet platforms can also disrupt national regulations on content. Oli Bird, on the other hand, brought examples of user experience on mobile devices to point out the multiple possibilities of the digital world, both in terms of ideation and use. He added that, although there may be convergences between different national legal regimes regarding platform governance, "national regimes will inevitably diverge", fostering internet fragmentation. This point was also highlighted by Yasmin Curzi, who recalled the infrastructural issues of access and their implications for regulation in the unfolding of the possibilities of usage.

On the rhetorical level, some of the concerns pointed out by both Prof. Luca Belli and Prof. Rolf Weber were around determining a "common semantics" and methodologies for a jurisdiction capable of encompassing regulation that serves both local and global levels. On this matter, Prof. Weber also highlighted that open standards are important tools to promote interoperability. Adding to the discussion, Prof. Nicolas Suzor, from the Oversight Board, talked about some of the challenges of the current way platforms use automatic classification systems: marginalized populations face errors more frequently due to the lack of data regarding them.

In response to Nicolas Suzor, Emma Llansó added that the reluctance of platforms to make information available is part of the big picture. The researcher, from the Center for Democracy & Technology and the Action Coalition on Meaningful Transparency, stressed the importance of independent research on online services to inform public policymaking. She also pointed to the entry into force of the Digital Services Act in the EU as a means by which researchers will gain access to data from platforms, which will be useful for policymaking, especially regarding transparency. In his final remarks, Luca Belli noted that the quality and relevance of the information in reporting activity is more important than raw data.

IGF 2022 Open Forum #25 Explore the road of intelligent social governance in advance

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

AI has a continuous impact on human society, and human beings need to meet the transformation to an intelligent society with governance innovation. Building a humanistic intelligent society requires the full cooperation of government, enterprises, social organizations and academia. The formation of a people-oriented intelligent society governance (ISG) system is an important issue in exploring the intelligent society.

,

Evidence-based governance has become a new concept and a new direction for the international community and academia. Research on evidence-based intelligent society governance (ISG) based on social experiments can provide a new positivist framework for ISG. China has the advantage of in-depth application and extensive scenarios of AI, providing practical experience and cutting-edge exploration for global AI development.

Calls to Action

Facing the global opportunities and challenges arising from AI, exploring the road of intelligent society governance (ISG) requires a new cooperation framework and governance model. Panelists proposed that it is necessary to respect the differentiated institutional and cultural backgrounds of various countries and enhance the diversity and inclusiveness of ISG while conducting international comparison and cooperation in ISG research.

,

Panelists further called on the international community to promote innovation through governance and share the future through cooperation by jointly initiating a global academic alliance, establishing an information sharing mechanism, as well as setting up an international cooperation fund, and strengthening discipline construction and talent cultivation in the field of ISG.

Session Report

Moderator: Dr Zhang Fang
Panelists: Zhang Peng / Prof. Su Jun / Prof. John E. Hopcroft / Dr Zhang Xiao / Prof. Simon Marvin / Prof. Huang Cui / Dr. Wang Yingchun
Rapporteur: Lian Xiangpeng / Tu Shengming / Zhang Yu / Ren Tianpei

This open forum session aimed to discuss research on intelligent society governance and a feasible path to realizing people-oriented intelligent society governance. In today’s era, digital technology represented by artificial intelligence, as the leading force of the global technological and industrial revolution, is increasingly integrated into the whole process of economic and social development in all fields, profoundly changing modes of production, lifestyles and social governance. The forum was organized by the Information Development Bureau of the Cyberspace Administration of China, and jointly hosted by the Institute of Intelligent Society Governance of Tsinghua University and the Center for Science, Technology & Education Policy of Tsinghua University. Seven experts from China, the United States and the United Kingdom discussed the opportunities and challenges of intelligent society governance and international cooperation, bringing together research topics such as AI social experiments, algorithmic governance, data governance, and urban governance. They agreed that intelligent society governance needs to adhere to the concept of people-orientation, to promote evidence-based intelligent society governance research, to strengthen cooperation on a global scale, to build an academic community on intelligent society governance with shared resources and information, and to respect the diversity of governance cultures.

(1) Adhere to the concept of people-oriented intelligent society governance, and jointly build a humanistic intelligent society
The development of intelligent technology is based on embedding in human society. Humans are the initiators of the intelligent technology revolution, the main bearers of the new challenges of intelligent society, and the explorers of the risks of the intelligent society. The ability to promote people’s well-being and realize technology for good is the key to whether AI technology can serve the development of human society. 
In order to achieve a comprehensive balance of technical rationality and value rationality, intelligent society governance should establish people-oriented values, ensure the core position of people in the process of intelligent technology development, application and governance, and make the construction of a humanistic intelligent society as a goal of governance. Su Jun believes that a humanistic intelligent society is a people-oriented society with highly developed science and technology, widely used intelligent technology, a comprehensive balance of technical rationality and value rationality, a harmonious coexistence of people, environment and technology, an open, inclusive and harmonious society, and a rich humanism. Humanistic intelligent society governance needs to achieve a deep integration of social values and technology.
Wang Yingchun highlighted that social values and rules should be deeply integrated with technology and applications, so that values can be embedded in technology and applications can reflect rules, and the two aspects can boost each other and promote each other. Data governance is one of the core aspects of intelligent society governance. Huang Cui mentioned that there is an urgent need to establish a new global data governance framework, build a more inclusive, people-oriented and warm global data governance system, and effectively protect personal data and data related to intellectual property rights and national security intelligence, in order to promote the global digital economy and social development.
Panelists agreed that the concept of humanistic intelligent society governance is in the fundamental interest of human society and can guide the development of AI technologies in the direction that benefits human society.

(2) Conducting social experiments on AI and promoting evidence-based research on intelligent society governance
Artificial intelligence will be an important part of future human society. Exploring the path of intelligent society governance requires not only a focus on the development of AI technology itself, but also a systematic and comprehensive assessment of the ongoing social, economic, cultural, and political impacts of AI technology. As John E. Hopcroft argued, reducing AI social risks by strengthening AI governance is critical to ensuring the beneficial and safe nature of AI systems. In the face of systemic changes and future scenarios of intelligent society, Su Jun proposed to adopt evidence-based research methods such as “long-period, multidisciplinary, and wide-field AI social experiments” to address the various issues, risks, and problems brought about by AI technologies. This will further promote the application and development of AI technologies and enhance the effectiveness of governance of intelligent society.
Simon Marvin suggested that in the post-smart city era more attention should be paid to the integration of AI technologies in urban infrastructure construction and urban planning, and that regulatory reform and innovation should be promoted to better serve the development of AI technologies and social experiments. In order to give full play to China’s advantages of wide, deep and diverse AI application scenarios, and to deeply explore the impact of AI technology on people, organizations and society, China has actively organized and carried out AI social experiments in multiple fields across the country in recent years, which can provide practical experience and real feedback for the development of global AI technology. Moreover, Zhang Xiao believed that AI social experiments are important in exploring AI new paths in AI governance and promoting reform of the AI global governance system.

(3) Actively promoting international exchange and cooperation in intelligent society governance research, and building a community with a shared future for mankind in the intelligent era
Panelists agreed that it is necessary to explore the path of intelligent society governance under the concept of building a community with a shared future for mankind, strengthen international cooperation in research on intelligent society governance, promote the inclusive sharing of intelligent society, and realize the development of differences and win-win cooperation among countries. As suggested by John E. Hopcroft, it is necessary to strengthen international cooperation among governments, enterprises, industry associations and other diversified subjects around the world, to guarantee the right to speak for all parties involved in AI governance, to continuously improve laws and regulations on AI governance, to accelerate the development of procedures and standards for AI applications, and also to create more valuable activities for the marginalized groups in the AI system to improve their quality of life.
Finally, Panelists unanimously called for a new cooperation framework and governance model to build an international cooperation ecology for intelligent society governance. They advocated:

  • initiating the establishment of a global academic alliance for intelligent society governance;
  • setting up an international cooperation fund for intelligent society governance;
  • building a data sharing platform for AI social experiments;
  • strengthening the capacity building of intelligent society governance;
  • and training professionals through training programmes, workshops, and academic conferences.

 

IGF 2022 WS #70 Fighting the creators and spreaders of untruths online

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Pre-emptive actions (e.g. pre-bunking, digital literacy initiatives) are needed to protect people from the risks and harms of false and misleading content online.

,

There is no silver bullet to stop untruths online. A cocktail of approaches is needed (education, media literacy, resources incl. technologies like ML, collaboration among fact-checkers, international collaboration).

Calls to Action

Governments and the multistakeholder community need to pool resources (monetary, knowledge) to fight the creators and spreaders of untruths online.

Session Report

This workshop explored the different types of untruths online – disinformation, misinformation, propaganda, contextual deception, and satire – and innovative ways to reduce the negative effects they have on people and society. Molly Lesher opened the workshop and set the scene, focusing on the OECD’s work in this area, including the  “Disentangling untruths online” Going Digital Toolkit note. She highlighted that while the dissemination of falsehoods is not new, the Internet has reshaped and amplified the ability to create and perpetuate content in ways that we are only just beginning to understand.

She recalled why access to accurate information is important, including in the context of the fundamental rights to freedom of speech, the right to choose leaders in free, fair, and regular elections, and the right to privacy, all of which are essential for healthy democratic societies. She highlighted that false, inaccurate, and misleading information often assumes different forms based on the context, source, intent, and purpose, and that it is important to distinguish between the various types of untrue information to help policymakers design well-targeted policies and facilitate measurement efforts to improve the evidence base in this important area.

Sander Van der Linden then discussed his work studying false and misleading content online: how, from a psychological perspective, misinformation infects our minds, how it spreads across social networks, and the ways in which people can protect themselves and others from its negative effects. He discussed how people can build up “immunity” through “prebunking” – that is, by exposing them to a weakened dose of misinformation to enable them to identify and fend off its manipulative tactics.

Julie Inman Grant discussed the Australian eSafety Commission’s work on addressing online risks and harms facing adults and children from the circulation of untruths online. She highlighted some practical approaches, programmes, and initiatives to address online harms, as well as differences in the impacts on and interventions for children versus adults and men versus women.

Rehobot Ayalew then gave remarks from the perspective of a seasoned fact checker who fights against untruths online daily. She underscored the importance of fact-checking, as well as its modalities and limits (e.g., scalability). She also highlighted the complications that non-anglophone countries face in combatting untruths online, and the mental health burden of having to research some of the more graphic and disturbing false and misleading content.

Pablo Fernández discussed Chequeado’s experience with the Chequeabot AI tool that facilitates factchecking in Spanish. He discussed how to find a balance between human intervention and digital technologies in the fight against untruths online, as well as how Chequeado usefully co-operates with the global fact checking community.

Mark Uhrbach then spoke about Statistics Canada’s efforts to measure misinformation so far, what surveys might be able to help us measure, and why surveys alone can’t measure everything, so we need to use some less traditional methods to fill in the gaps.

The panellists took several rounds of questions from the audience onsite and online. A key point from the discussion is that governments and the multistakeholder community need to pool resources (monetary, knowledge) to fight the creators and spreaders of untruths online. Another important point is that pre-emptive actions (e.g., pre-bunking, digital literacy initiatives) are needed to protect people from the risks and harms of false and misleading content online.

IGF 2022 DCCOS Translating data & laws into action for digital child rights

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

1. The collection of data using tested methodologies that enable comparison is essential to ensure child rights in the digital environment. This can directly influence policy at the national level. 2. Education remains essential as a preventive measure but cannot replace the need for proactive measures by online service providers.

Calls to Action

1. Governments must provide funding for such data collection and analysis, as outlined in existing legal frameworks. 2. Industry must respond in a coordinated way and adopt a safety-by-design approach.

Session Report

SUMMARY REPORT: The round-table discussed how to ensure the safety, security, and privacy of children in digital environments through data, legal, regulatory, policy, and technology analysis in order to build a more robust prevention strategy and enhance a more efficient response mechanism. 

The session explored how evidence-based analysis from the Disrupting Harm project, combined with international laws such as General Comment #25 made a positive impact on children's digital lives. Furthermore, participants have discussed how and to what extent new regulations from certain countries and regions have informed actions in other parts of the world. The important role of digital service providers to act upon evidence, promote regulations, and ensure safety across all technologies they deploy globally has been further explored in the discussion. 

Amy Crocker from ECPAT International opened the discussion and commented that as the digital world develops, expands and diversifies, it is becoming not only necessary but also a legal and moral obligation that we address the diverse positive and negative implications of the digital world for children’s rights.  

Prof. Sonia Livingstone from the London School of Economics and Political Science highlighted the key link between data and legislation, saying that ‘data’ has taken on a new meaning in recent years as the focus has shifted to the data that is collected. Initiatives such as General Comment #25 (GC 25) to the UN Convention on the Rights of the Child represent a milestone in child protection in the digital environment. In the GC 25, the Committee explains how States parties should implement the Convention in relation to the digital environment and provides guidance on relevant legislative, policy, and other measures to be followed by States to ensure full compliance with their obligations under the Convention and the Optional Protocols. Implementing robust data collection is essential, including research “with and by children”, to inform policymakers and legislators. The principle of "non-discrimination" is also key. The Committee establishes that "all children" must have equal and effective access to the digital environment, and "all their rights" must be respected by policymakers and stakeholders. Privacy and a high standard of protection must be ensured. Gaps exist, such as the difficulty of collecting useful data and evidence from the Global South; the delicate balance between privacy and data collection; the soft-law nature of the Comment; and the difficulty of fully understanding the "interest of the child" by policymakers and legislators. 

Interventions from the floor: 

  • To a question about whether the GC 25 also addresses the need for better data on the phenomenon and of what works to tackle it, Prof. Livingstone asserted that the GC25 does oblige evidence-based policy and action. 

  • Rodrigo Lopez from the Brazilian youth committee asked whether the GC 25 addresses targeted advertising to children, an issue in Brazil that has been considered abusive for years but still happens. The response was that it does, and that the Committee has increasingly recognised the commercial risks to children and needs to find a way to deliberate on the tipping point between commercial use and commercial exploitation, perhaps by identifying some criteria and building on the CRC Committee assertion that States parties should prohibit by law any targeting based on inferred characteristics.  

Serena Tommasino from End Violence’s Safe Online team affirmed that data and evidence-based information have been a useful tool to advocate for children’s rights. Disrupting Harm (DH), funded by Safe Online, is a large-scale research project producing unique national, regional and global insights into online child sexual exploitation and abuse (OCSEA) in Southern and Eastern Africa and Southeast Asia (Kenya, Uganda, Thailand, Tanzania, Ethiopia, Philippines, Viet Nam, Namibia, Indonesia, Malaysia, Cambodia, Mozambique, and South Africa). They are now investing in implementation in these countries, and research will begin in 10 more countries. Data scarcity and poor quality have limited our understanding of OCSEA. Generating high-quality evidence is fundamental to developing a coherent methodology that might be replicable in different countries. A recent event in Brussels, Safe digital futures, saw experts discuss how to improve the data ecosystem and devise a joint roadmap to inform policy and stakeholders. 

Rangsima Deesawade, Regional Coordinator for South East Asia ECPAT International highlighted that a reliable and standard data collection methodology at regional and global level leads to comparability. By aggregating data extracted in different countries following the same methodology, it would be possible to have a scientific comparison of information on OCSEA. She pointed out that some of the findings were new even for ECPAT, such as in relation to gender. The data showed that there were similar rates of victimisation between girls and boys, with interesting variances across the countries. This type of information can help change persistent narratives.  

Interventions from the floor:  

  • Dave Miles, Director of Safety Policy for Meta, pointed out that Meta is using clear standards to design online/onsite guidelines and codes of conduct. Standards such as the GC 25 and age-appropriate design code are influencing the way Meta designs solutions. Larger platforms can influence others in the sector. Age assurance, child-friendly products, and empowering the trust and safety community should be the main concerns of the industry sector. 

  • Jennifer Chung, Dot Kids, raised the point that when we look at enabling activities, harm does not always start with clear criminal activity, making it hard to know where to intervene and at what stage. Jutta Croll responded that Stiftung Digitale Chancen is advocating for proper age verification so that platforms know who their users are and can adapt support for them.  

  • Jutta Croll from Stiftung Digitale Chancen raised the importance of improving national risk assessment mechanisms and presented a model they have co-developed. Current risk assessment addresses harm after the fact. Pre-emptive, comprehensive, and concomitant assessment of potential threats to children must be implemented across platforms, where anticipatory conduct might otherwise lead to criminal offences. Precautionary measures such as age verification may fit this scope.  

Interventions from the floor: 

  • Jonathan Andrew from the Danish Institute for Human Rights wondered what the response has been from lawmakers to this risk assessment tool. When the draft CSA Regulation in the EU came out, most were still considering what risk assessment really means. He pointed out that education is key because risk starts before a criminal act, but precautionary measures by service providers are also needed. The role and mandate of national human rights instruments is key.  

Conclusions 

  • The GC25 is a legally binding and comprehensive framework to ensure digital child rights. 

  • Education is key but cannot replace proactive measures by online service providers. 

  • Common data collection methodologies ensure comparability and drive change. 

  • It is key to tackle tensions between children's data privacy and their protection.

  • Governments must provide funding for data collection and analysis. 

  • Industry must adopt a safety-by-design approach. 

IGF 2022 WS #370 Addressing the gap in measuring the harm of cyberattacks

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

Developing both quantitative and qualitative measurements, as well as general indicators and sector-specific indicators, is a key part of advancing the harm methodology. The methodological framework should consider different kinds of harm inflicted by cyberattacks and include issues of re-victimization and redress. It is recommended to link the discussions on the methodological framework for measuring harm to the accountability framework.

Calls to Action

Addressing the harm stemming from cyberattacks is a collective responsibility. There is a need for multistakeholder initiatives that can break existing silos between different communities and experts to meaningfully advance the harm methodology. Outreach is important to the wider community, especially cybersecurity experts, policymakers, economists, and mathematicians who can meaningfully contribute to developing the harm methodology.

Session Report

Recent years have seen a growing number, scale, and impact of cyberattacks. State and non-state actors increasingly exploit vulnerabilities in cyberspace for financial profit or to gain an advantage over their adversaries. The CyberPeace Institute has been recording cases of cyberattacks related to the healthcare sector and in connection to the war in Ukraine. From June 2020 to November 2022, the Institute aggregated a total of 501 incidents affecting 43 countries around the world as part of the Cyber Incident Tracer (CIT) #HEALTH – a platform that records and analyses data on cyberattacks in the healthcare sector and, importantly, their impact. Similarly, to this date, the Institute’s Cyber Attacks in Times of Conflict Platform #Ukraine has featured 834 cyberattacks and operations in 35 targeted countries. While this is only a fraction of the full scale of the threat landscape, these platforms attempt to bridge the current gap in the understanding of the harm to people stemming from cyberattacks.

This workshop posed some key questions for developing the harm methodology, including how “harm” should be defined in cyberspace, what categories and indicators can effectively help to measure harm from cyberattacks, and how a methodological framework for harm caused to people can improve policymaking and ensure greater accountability for cyberattacks. While a significant effort has been devoted to documenting cyberattacks and understanding their economic impact, there is a remaining gap in understanding the damage they cause to societies. The harm methodology introduced during the session attempts to close this gap and proposes a novel approach to the selection of indicators of harm and the assessment of harm in cyberspace. This framework aims to contribute to the efforts of comprehensively measuring the harm to people from malicious cyber behavior and thereby advance policymaking and decision-making through an informed and human-centric approach.

The panelists spoke about the lack of data related to cyberattacks and the harm generated by such attacks. It was proposed that currently, we see the impact on records, facilities, and the economy, among others, but such an assessment is too narrow. There is a need to develop quantitative and qualitative measurements for societal harm, particularly in regard to the impact on vulnerable people and possible re-victimization – both online and offline. Harm originating in cyberspace can be represented in many ways, and it is important to have an impact assessment as part of related policies and legislation. When discussing the state of play, the panelists noted that some attacks are already high on the agenda, including ransomware and spyware, but more efforts are needed to understand different kinds and degrees of harm. Research conducted on this topic remains insufficient, but some important contributions have already been published, including on the taxonomy of cyber harm. Conversations are important for advancing the initial methodological framework, especially concerning the number of indicators, quantifying numerical values, and qualitatively documenting and tracking the results. The CyberPeace Institute welcomes contributions from stakeholders. 

A methodological framework for cyber harm can improve policymaking and ensure greater accountability for cyberattacks with a human-centric approach. Given the complex landscape of cyberspace, policymakers need to understand the impact of cyberattacks in order to be able to base policies, strategies, and legislation on empirical assessments. By extension, it is key to measure not only the economic impact of cyberattacks but also the harm they cause to people. The panelists remarked on the critical need to consider redress for those who were affected. The effects of cyberattacks are often localized, meaning that many people experience harm to some extent, and we need new safeguards and effective remedies. Furthermore, it is important to link the discussions on the framework for measuring harm to the accountability framework. There are remaining silos in areas relevant to developing the harm methodology, including cyber insurance, law enforcement, and education, but building this framework should be a collective investment. Entities across the stakeholder communities need to cooperate and test the proposed approaches in different sectors, as the indicators can vary. 

In conclusion, this workshop contributed to raising awareness about the methodology for measuring harm from cyberattacks. Such a framework has the potential to inform policymaking and decision-making and help prioritize resilience efforts based on areas with high levels of societal harm caused by cyberattacks. It was outlined that a follow-up should include outreach to the wider community, specifically to cybersecurity experts, policymakers, economists, and mathematicians. The input from the participants at the workshop has been gathered and analyzed to further inform and improve the methodology.

IGF 2022 Open Forum #71 From regional to global: EU approach to digital transformation

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

When regulating digital platforms, a whole society approach is needed. The public needs to be fully informed about the regulatory process and the governments must put the citizens in charge of safeguarding transparency.

,

There is no single policy, no single legislation or no single actor to secure an open, free and secure digital future for all, but the prerequisite for any massive transformation, like the digital transformation, is the structured dialogue among all stakeholders based on shared principles.

Calls to Action

1. Ensure regulation work in practice - aim for effective implementation and clear guidance. 2. Digital developments require agile governance. Massive digital transformation needs structured dialogue and shared principles. 3. Policymakers should ensure public information and transparency.

Session Report

The EU is impacting global regulatory trends in the digital sphere. EU regulations are shaping the global marketplace, influencing governments in their policymaking decisions, and to a great extent setting the regulatory standard for global companies. Most regulations are applicable also to non-EU companies operating in the EU and hence have immediate effect. This is a huge opportunity but also a huge responsibility for the EU to put in place and effectively implement its regulation for real market outcomes. The objective is, and should remain, the safeguarding of a free, open and interoperable Internet, ensuring a human-centric digital transformation serving citizens and society. The EU value-based approach is challenged in the current geopolitical landscape of tech war and digital authoritarianism.

The DSA Package is about greater responsibility (DSA) and market contestability (DMA). It aims at addressing a global problem and represents a global opportunity. The DSA Package’s implementation within the EU will have a global effect due to the experience gained and the building up of auditing and enforcement capabilities. In jurisdictions where the Package is not applicable, it would nevertheless set a threshold for elementary safety provisions, since jurisdictions would not accept lower standards than those set, e.g., in the DSA. The DSA should allow the balancing of content takedown against the safeguarding of freedom of expression (freedom to speak, seek and receive information). When regulating digital platforms, a whole-society approach is needed.

To create a safer and more open digital space, protect European citizens from illegal and harmful content and conduct online, and create a level playing field for business, it is essential that key elements of the ecosystem are put in place before or together with regulation (free space for journalists, independent auditors, an effective system of notifiers, among others). This may ensure the accountability of gatekeeper platforms. The approach may depend on the country or the society. The overall vision of a human-centric approach to digitalization may result in an articulated, complex regulation, as in the EU with its many member states and different societies, or, as in Japan with a very homogeneous society, in a more agile, multistakeholder governance approach.

The Data Act will impact IoT products’ design, which will have spill-over effects also on non-EU markets. It is also about ensuring an interoperable and well-functioning cloud market. Fundamental values should be attached to data flows, which affect privacy, national security, and intellectual property. It is important to build interoperability among the various approaches to data governance across jurisdictions. This is one aspect examined by the Data Free Flow with Trust initiative.

The EU’s aim is to have trustworthy and human-centric AI to avoid algorithms’ possible harmful effects. AI policies also depend on a given country’s democratic values. While other parts of the world may have different approaches to regulation, the effect of and interest in framing digital developments have been increasingly recognised by governments worldwide. AI raises questions on innovation and regulation, and the approach to these two may be altered by a possible Beijing effect.

There is no single policy, piece of legislation or actor that can secure an open, free and secure digital future for all; the prerequisite for any massive transformation, like the digital transformation, is structured dialogue among all stakeholders based on shared principles. One lesson learned from the several EU regulatory processes (the Artificial Intelligence Act, the Data Act, the Digital Services Act) is that even if the EU’s legislative process may be the most transparent in the world, it is still difficult for citizens to navigate the extensive information material. There is a need to secure not only the participation of citizens in designing regulation but also public supervision of the whole process, whose quality matters as much as its outcome. For the DSA/DMA, to secure the maximum level of transparency, the idea was to embed transparency in every aspect of the regulation, whether for content moderation, journalism, audits or trusted flaggers, and to introduce the cardinal principle that there is no general monitoring obligation.

IGF 2022 WS #440 Declaration for the Future of the Internet

Updated:
Connecting All People and Safeguarding Human Rights
Session Report

Internet Governance Forum 2022

[Workshop # 440] Declaration for the ­­Future of the Internet

Addis Ababa, 30 November 2022

Speakers

  • Pearse O'Donohue, Government, Western European and Others Group (WEOG) 
  • Timothy Wu, US Government, Western European and Others Group (WEOG)
  • ­­Marietje Schaake, Technical Community, Western European and Others Group (WEOG)
  • Anriette Esterhuysen, Civil Society, African Group

Moderators

  • Grace Githaiga, Civil Society, African Group
  • Sonia Toro [Online], Private Sector, African Group

 

Organisers

  • Julia VAN BEST, European Commission
  • Esteve Sanz, European Commission
  • Anca Andreescu, Stantec
  • Sonia Toro, Stantec
  • Noha Fathy, Independent

Rapporteur

  • Tom Mackenzie, Stantec / ITEMS International

 

Discussion: The DFI and how to keep the Internet open, trusted, interoperable

This workshop was proposed to discuss the principles of openness, interoperability and resilience of the Internet, and policies that are needed in order to safeguard it.

The two questions on the table were:

  • How to ensure that the Internet remains open, global, and interoperable, in line with universal values and fundamental rights?
  • How can governments, private entities, civil society, and the technical community translate the principles of the DFI into concrete policies and actions and work together to promote this vision globally?

Connection with previous IGF outcomes

The workshop was organised as a follow-up to IGF 2021 and earlier IGF meetings on the ‘Economic and Social Inclusion and Human Rights’, ‘Universal Access and Meaningful Connectivity’, ‘Inclusive Internet governance ecosystems and digital cooperation’ and ‘Trust, Security, and Stability.’

The purpose of the workshop was to have a multi-stakeholder discussion on how to preserve an open, global, interoperable, reliable, and secure Internet, and how this is a key objective in the drive to achieve sustainable development and digital inclusion. This further includes providing meaningful and sustainable Internet access to everyone and safeguarding Internet openness to promote democracy and human rights.

The discussion was also intended to tackle the open internet policies and actions that are needed to promote the trust, security, stability, and interoperability of the Internet including a human-centric approach. This cannot be achieved without considering the sustainability of the internet governance ecosystem that hinges on well-structured coordination and consolidation among the different stakeholders in order to promote a positive vision for the future of the Internet.

While an Internet that imperils fundamental freedoms and human rights online threatens the achievement of almost all the SDGs, there is a direct link between the proposed workshop and the following SDGs:

  • Decent Work and Economic Growth: a free, open, global, interoperable, reliable, and secure Internet creates new working opportunities and contributes to growth. It would also provide career opportunities, support the emergence of new businesses, extend distribution channels to remote areas, increase employment in higher-skill occupations, and create new jobs for less-educated workers.
  • Industry, Innovation, and Infrastructure: a global and open Internet promotes innovation, also contributing to industrial development and infrastructure building/roll-out. It also has a positive impact on economic growth and social well-being which are important for the peace and happiness of individuals and societies at large. It further allows for cultural exchanges and the open exchange of knowledge and creativity which could greatly influence a lasting peace.
  • Reducing Inequality: A global and open Internet contributes to reducing inequalities between those who have studied and those who have not, urban centers and rural areas, developed and less developed regions, men and women, and ultimately between rich and poor.
  • Peace, Justice, and Strong Institutions: a global and open Internet ensures transparency, rule of law, democratic societies and processes, reliable institutions capable of regulation but also respective fundamental rights.
  • Partnerships for the goals: the session would allow structuring the effort of stakeholders around a global, open, and human-centric Internet to build/further partnerships around accelerating the achievements of the SDGs. In this context, the DFI is open to the broadest group of countries from all geographies and development levels, who actively support a similar future of the Internet and want to re-affirm the commitment to protecting and respecting human rights online.

Declaration for the Future of the Internet

Many internet stakeholders grapple with complicated questions that relate to Internet safety and openness. This includes how to expand Internet access while keeping the Internet safe from illegal content and dangerous goods; how to fight against disinformation while protecting fundamental rights, i.e. freedom of expression and freedom of information; and/or how to keep the digital space contestable, open for innovation and inclusive.

At a time when the negative developments of the Internet – including high market concentration and abuses of market power, diminishing pluralism and data privacy, and increasing disinformation, harassment, and censorship – are justifiably and decisively being addressed we must not forget and give up on the great benefits a well-functioning internet can add to our societies and economies.

Against this backdrop, the Declaration for the Future of the Internet (DFI) was published on 28 April 2022, rallying over 60 countries around an affirmative, positive agenda for a free, open, global, interoperable, reliable, and secure Internet. The DFI sets out shared fundamental principles that re-emphasise the great positive potential of the Internet. A well-functioning global Internet will reinforce democracies, promote social cohesion, and protect universal rights while allowing for digitally spurred economic growth and development.

To this end, the workshop discussed how stakeholders can ensure that the Internet remains widely accessible, open, human-centric, and in line with universal values and fundamental rights. In the same vein, discussion focused on how these stakeholders can translate the principles of the DFI into concrete policies and actions and how countries and stakeholders can work together to promote this vision globally.

Reaffirming the principles of Openness and the multi-stakeholder governance model

The Declaration for the Future of the Internet can be seen as an attempt to reaffirm the basic principles of openness, resilience and interoperability on which the Internet was founded. These principles are common currency within the multi‑stakeholder community. However, there is a need to reaffirm them at a time when geopolitical tensions threaten the integrity of the Internet and some analysts have raised the prospect of its splintering.

The objective of the DFI was to produce a declaration including a set of principles that national governments around the world could sign up to. At the same time, the DFI allows countries to state that there are certain things they can and cannot do to ensure the Internet is and remains an open, interoperable, trusted space which respects the individual, including their integrity (physical and online), their personal data and their identity.

The key issue at stake is trust. All citizens as well as businesses should be able to trust that they are safe when they are online, that their data is secure and that their transactions are confidential. This will lead to a trusted environment and further innovation. It also means that the data economy can thrive and become something in which individuals can place their trust.

Relationship between the DFI and the IGF

Some critics have identified possible tensions between the stated aims and principles of the DFI and the mission of the IGF. However, these are exaggerated. On the contrary, the DFI can be seen as a timely and useful expression of the principles of openness and inclusion that the IGF has upheld since its inception. The IGF has shown that it is able to adapt to address the questions of the day, ensuring that both the multi‑stakeholder community and the Internet Governance Forum, in which so much has been invested, will continue to play an increasingly central role in the governance of the Internet.

Key Take Away 1: The DFI states important principles in favour of the Open Internet. But what now?

The DFI can be viewed as a response to what the US and partner states have seen as alarming patterns of behaviour by certain nation states with regard to Internet governance, and the technical standard-setting processes on which the stability and interoperability of the Internet rely. The stated goal of the DFI is to reaffirm basic norms, and to restate basic principles that have long been taken for granted. These notably concern how nation states are expected to comport themselves when it comes to governance issues and the management of critical internet resources.

A key aspect of the DFI is respect for the Internet’s multi-stakeholder governance processes, and the notion that one of the Internet’s founding principles was that it could not be controlled by a single country. However, there are growing concerns that certain nation states are seeking to increase their power or leverage at the expense of the multi-stakeholder governance process, particularly on its technical side.

What next? With some 70 signatories overall, only a small fraction of them from the Global South, significant work still needs to be done to achieve buy-in for the values, standards and governance principles put forward in the DFI. This should be an objective in the run-up to upcoming multi-lateral, multi-stakeholder processes, e.g. the UN Global Digital Compact (September 2023), which is expected to “outline shared principles for an open, free and secure digital future for all”.

The DFI can serve to smooth the path for the principles of the Open Internet to be encapsulated in whatever comes out of the GDC (involving 170+ countries), but a more inclusive, transparent process may need to be put in place to ensure that all stakeholders have an opportunity to engage.

Key Take Away 2: Risk that a DFI might be perceived as exclusive (either you’re in or you’re out). A roadmap for more inclusive, multi-stakeholder consultation may be needed to ensure wider international buy-in.

The DFI has been promoted as reflecting global aspirations to build an inclusive rights-respecting Open Internet, and using the potential of the Open Internet to foster economic and social development.  

However, the DFI was ostensibly drafted by actors of the Internet ecosystem in the US and Europe. It appears not to have been the product of a broader consultative process involving ecosystem actors from other regions of the world. This may have led to perceptions around the world that it is a declaration to which like-minded actors or like-minded states can subscribe. It may also have led to the perception that “either you are in, or you are not”, an unintentional deviation from the path of inclusive dialogue on which the future of the Open Internet undoubtedly relies.

To ensure greater buy-in to the principles and standards of the Open Internet, a wider consultative process which takes into account, more explicitly, the concerns of countries in the Global South could be sought. This might involve fixing what some might see as a procedural weakness and allowing non-state actors to become signatories of the DFI.   

Concern was expressed during the workshop about the perception that the internet may be being used as a political football amid global geopolitical tension and conflict. This will harm the Internet, and its potential as a platform for upholding peace will be compromised. In this sense, the declaration should perhaps be viewed as a starting point rather than an end point. Instead of being perceived as a declaration prepared by well-identified actors in the multi-stakeholder (but not necessarily internationally representative) internet ecosystem, to which other countries should simply come to the table and “sign on”, it might be better perceived as an invitation to join in the process of building a global consensus.

There are lessons that can be drawn from the NETmundial process in 2014 in São Paulo, Brazil. This resulted in a powerful and simple document. However, it never quite made it into the multi‑lateral space and therefore never went any further. This may call for a re-appraisal of the multi-stakeholder model of internet governance: how it is defined; how it works in practice; and how individual stakeholders (including states) that take part in the system can be held accountable.

Key Take Away 3: Greater support for DFI in Global South will result from enhanced cooperation between development agencies to deploy internet infrastructure, and promote connectivity

Conspicuously, the DFI lacks signatories from the Global South (only two countries in Africa). Before it is reasonable to expect strong buy-in for a rights-based Internet built on the principles of openness and multi-stakeholder Internet governance, it is vital to ensure the deployment of internet infrastructure globally, and the delivery of reliable, affordable, physical access to the Internet to the majority of the world's population. This can be achieved through better coordination between development agencies, and better engagement with countries regarding local needs and practices.

The future of the open, interoperable Internet will be assured insofar as civil society and the non-governmental multi stakeholder community are empowered to comment on the deployment of internet infrastructure, and Internet based services, freely and without intimidation.

Multi-stakeholder actors should feel free to positively advise governments regarding the practical implementation of the principles set forth in the DFI. This will also serve to reinforce accountability. 

Video & Session Transcript

Video: https://youtu.be/KY3N-0NaVBw

Transcript: https://www.intgovforum.org/en/content/igf-2022-day-2-ws-440-declaration-for-the-future-of-the-internet-%E2%80%93-raw

IGF 2022 WS #217 Joint efforts to build a responsible & sustainable Metaverse

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

The Metaverse is a concept on which all parties have yet to unite, because it is a collection of scientific ideas and technologies potentially emerging in the next few years, such as Web 2.5/3, blockchain, and AI. Its subsequent evolution, application scenarios, and corresponding specifications in various industries and fields are still unclear and need to be built jointly by multi-stakeholder groups.

Calls to Action

A standard development organization could be set up in conjunction with governments, technical communities, the private sector, civil society, and other multi-stakeholder groups to provide a stable environment for its members to discuss, define, and compile metaverse technical specifications and reports.

Session Report

On November 30th, IGF 2022 workshop #217, "Joint efforts to build a responsible & sustainable Metaverse", was held. Five experts from different fields and countries presented their views on the development of the Metaverse, which could serve as an open and fair foundation for the future world. Specifically, they discussed the Metaverse's key challenges, governance issues, and what policy framework might support its healthy development.

 

Xiaofeng Tao, professor at Beijing University of Posts and Telecommunications (BUPT), the vice chair of Consultative Committee on UN Information Technology (CCIT), China Association for Science and Technology (CAST), chaired the workshop.

 

Professor Gong Ke, the chair of CCIT/CAST and past president of the World Federation of Engineering Organizations (WFEO), made an opening speech on building an accountable and sustainable Metaverse. He introduced the core values and principles that could and should apply to Metaverse development for the good of humankind and the planet.

 

In this workshop, five speakers presented their views on the topic "Joint efforts to build a responsible & sustainable Metaverse", and the details are below.

 

Horst Kremers, the General Secretary of CODATA-Germany, presented the critical elements of a digital information strategy. He introduced basic management principles and challenges for the information management of international legal instruments. The demand for coherence and mutual synergies across UN declarations and other UN instrument texts is urgent.

 

Li Yan, the Vice President of the Singapore Blockchain Association, explained what Web 2.5 is and argued that Web2 and Web3 seem to be converging towards Web 2.5. In his view, the most significant risks are regulation and a shortage of non-technical talent familiar with political economy.

 

Wen Sun, professor at School of Cybersecurity, Northwestern Polytechnical University, shared her excellent ideas and thoughts about the challenges and potential solutions to build a trusted Metaverse. One of the promising solutions is Blockchain technology and other advanced technologies such as Digital Twin and Federated Learning. These new emerging technologies can ensure data integrity, privacy, and security and implement seamless and secured data sharing.

 

Daisy Selematsela from the University of the Witwatersrand and Lazarus Matizirofa from the University of Pretoria pointed to the gap between academic research and the knowledge needed by policymakers, and described bridging that gap by providing policymakers with access to relevant research.

 

Ricardo Israel Robles Pelayo, professor at Universidad Anahuac online, UNIR México and EBC, discussed legal education and information and communications technologies such as digital books, AI, and big data. He concluded by reminding us of the role of the Metaverse in legal education: the Metaverse is the link between academic theory and professional practice.

 

After all five speakers finished their presentations, the experts began an open discussion. Several key questions were discussed, such as the main challenges, both technical and legal, of building an ecological, responsible, and sustainable Metaverse; cooperation among multi-stakeholders; and how to build a policy framework. The experts noted that the more important thing is to identify what the issues in the Metaverse are, rather than how to apply Metaverse-related technologies in education.

 

Finally, the participants agreed that building a legal framework is challenging, because 1) innovation in new technologies is needed, and 2) the potential risks brought by these new technologies must be identified. Hence, rather than the technologies themselves, ethical principles might be a good starting point for building a legal framework. Broader participation and effort should be encouraged, not only from tech engineers and scientists but also from multi-stakeholder communities, to join in the development of the Metaverse.

IGF 2022 WS #491 The future of Interplanetary networks-A talk with Vint Cerf

Updated:
Connecting All People and Safeguarding Human Rights
Session Report

Report Session 491: The future of Interplanetary networks

  • There is potential for the interplanetary network (IPN) to solve problems with the diverse internet network infrastructure on the ground. It is not only about how to transport data from Earth to the different planets but also how to communicate between them. This implies developing new ways of packaging information and ensuring it will be transmitted quickly, similarly to what we do on Earth.
  • The main developments started in 1998, following the Mars Pathfinder mission (1997), out of the necessity of creating communication between Earth and Mars.
  • Twenty-five years on (2023), the question is how the IPN can work as dependably as TCP/IP does on Earth.
  • The solution found lies in the Bundle protocols, which answer the constantly changing positions of the planets: store data in the network, transmit it towards the right planet, and then figure out how to get it there. It is a two-way process (a 28.5 kb/s direct-to-Earth radio link // delayed communication).
  • Some technical aspects to take into account: holding data, orbiters, deep-space networks, and transmission; low-latency links where TCP/IP still applies; DTN implementations such as IBR-DTN and ION; DTN over LEO; and near-term terrestrial and cloud testing.
  • But not everything is about the technical aspects; there will also be legal aspects, some of which are part of the Artemis Accords on how jurisdiction would work in an extraplanetary context (the Artemis missions to the Moon).
  • This can be considered a possible beginning of the commercialization of the Internet beyond LEO: private operation in space.
  • At this point, Vint asked himself, and invited the public to think about, how we create new institutions in order to cooperate in a way that does not threaten space in the same way as terrestrial space.
  • An additional registry for the IPN? No; it would be better to have an independent Internet on the other planets, with distinct and separate IP addresses for the local uses we have on the Moon and other planets.
  • A solar relay would transmit no matter where you are in the solar system internet. Congestion control relies on storing models: a store-and-forward network.
  • All standards in the IPN are open, and the community usually rejects things that have patents and restrictions on them; open source is a way of driving innovation.
  • Veronica addressed the problem of using commercial technology in order to develop a new one: how could this translate into the future of the IPN?
  • Nowadays, research is built on top of previous work, like a pyramid. In order to scale the pyramid and place a new block on top, you have to get the permission of each person who placed a block below that supports yours.
  • The problem is that when a block is a patent, you have to pay handsomely to lay your block on top of it, and in the ICT industry there is a huge number of patents.
  • Some patents get included in standards (SEP, standard essential patents).
    • Bottleneck, because all the market operators have to pay to implement that standard, and it gives the patent holder the power to foreclose the market and stifle innovation.
  • There are several reasons why a patent becomes a SEP:
    • it can be market-driven;
    • the standardization process is not very transparent. When private companies join a standard-setting organization (SSO), they should be obliged to disclose their patents (to allow the SSO to find an alternative to proprietary technical norms) and, in case no alternative can be found, they should commit to granting FRAND royalties;
    • not all SSOs provide for mandatory disclosure, or they provide no penalties for violations.
  • The technical community has to take these elements into consideration if it wants interplanetary networks to be developed on open standards.
  • Greater gender balance in the development of IPN-related technologies could also help; one example of the benefits of diversity cited in the session is federated learning, developed by a woman, which combines different AI models and aggregates them into one.
  • Leading to the conclusions: one is to promote alliances between commercial and other actors to reduce costs, a form of multistakeholderism involving government and the private sector.
  • It was also asked how the development of the IPN might help with possible extraterrestrial life.
  • The IPN should serve humanity, not governments.
  • LEO can possibly overcome shutdowns; we say possibly because the signals can still be detected.
  • Possible takeaway: one concerns the legal and commercial aspects of the development and deployment of technologies related to the Interplanetary Network (IPN), and how regulation could ensure that what we do in space does not become a threat, as has happened and is happening now on Earth. Also, how these frameworks will allow easy collaboration between the private sector and interested States to design, develop, and implement the technologies related to the creation of one or several IPNs.
  • The other takeaway concerns how these technologies can help solve problems of interconnection on Earth and in space: how their combination, implementation, and application could complement each other and work in a way that guarantees basic human rights and constant connection across cultural, political, and other sociological divides. This means developing technical standards that can operate as independently of human intervention as possible, while having a humanity-first perspective embedded.
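The store-and-forward principle behind the Bundle protocols discussed above can be sketched in a few lines of Python. This is a minimal illustration of the idea only, not any real DTN implementation: the node names, the static route table, and the contact-window sets are hypothetical.

```python
# Sketch of DTN-style store-and-forward delivery: each node holds
# ("stores") a bundle until a contact window to the next hop opens,
# then "forwards" it, tolerating long delays and disrupted links.
from collections import deque

class Node:
    def __init__(self, name):
        self.name = name
        self.stored = deque()  # bundles held until a contact opens

    def receive(self, bundle):
        if bundle["dest"] == self.name:
            bundle["delivered"] = True   # bundle reached its destination
        else:
            self.stored.append(bundle)   # store until a usable link appears

    def forward(self, links):
        # Forward every stored bundle whose next hop is currently
        # reachable; keep the rest in storage until a later contact.
        kept = deque()
        while self.stored:
            b = self.stored.popleft()
            nxt = b["route"].get(self.name)  # static next-hop lookup
            if nxt is not None and (self.name, nxt.name) in links:
                nxt.receive(b)
            else:
                kept.append(b)
        self.stored = kept

earth, orbiter, mars = Node("earth"), Node("orbiter"), Node("mars")
bundle = {"dest": "mars", "delivered": False,
          "route": {"earth": orbiter, "orbiter": mars}}

earth.receive(bundle)
earth.forward(set())                    # no contact yet: bundle stays on Earth
assert not bundle["delivered"]

earth.forward({("earth", "orbiter")})   # Earth-orbiter contact window opens
orbiter.forward({("orbiter", "mars")})  # later, the orbiter-Mars window opens
assert bundle["delivered"]
```

Unlike TCP/IP, which assumes an end-to-end path exists at send time, nothing here requires Earth and Mars to be reachable simultaneously: the orbiter simply holds the bundle until its own contact window opens.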
IGF 2022 DC-SIDS A Global Platform for SIDS Collaboration: The 1st SIDS IGF

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

For Small Island Developing States (SIDS) to flourish in the internet economy, there is a need to invest in infrastructure that is resilient to disasters and adverse climate change in the SIDS regions.


Sustainable funding is required to achieve resilient internet infrastructure as well as digital technology and e-commerce which have immense potential to support the participation of SIDS in international and regional internet markets/economies.

Calls to Action

Collaboration among all SIDS is critical to share best practices and develop solutions to challenges faced. The SIDS IGF process and associated tools and platforms are absolutely necessary ingredients to ensure this happens.


The DC-SIDS calls upon the United Nations system, regional developmental agencies and all stakeholders interested in and concerned about the sustainability and survival of SIDS in the global digital economy to engage with us so that we can develop a framework for sustainable funding and problem solving.

Session Report

After welcoming all participants to the DC-SIDS IGF 2022 Round Table, the onsite moderator (Tracy Hackshaw) invited introductory messages from participants from the CTU, Fiji and Mauritius.

The remote moderator, Maureen Hilyard, was then invited to lead the next phase of the roundtable by welcoming everyone, introducing herself and thanking the CTU (Caribbean Telecommunications Union) and the Pacific-based participants for their involvement, leadership and commitment to the DC-SIDS.

Maureen introduced a couple of key takeaways from the inaugural SIDS IGF held on August 25-26, 2022. The first was the development of a framework and protocols for researching SIDS internet governance issues, which could be used to inform DC-SIDS presentations such as this one at the global IGF and would ensure that all regions get an opportunity to be involved in the discussions.

The second came from the presentation by the Honourable Simon Kofe, Minister of Justice, Communications and Foreign Affairs of Tuvalu, in the Pacific. He spoke of how the pandemic had pushed Tuvalu's government towards prioritising and driving internet connectivity so that it could stay connected with its isolated outer island communities through virtual meetings during the lockdowns. He applauded the SIDS IGF idea not only for bringing island communities together to share initiatives, challenges and achievements, but also for the fruitful and constructive dialogue that is one of its main goals. He also described his country's future project, aimed at securing Tuvalu's future against the violent impacts of climate change it is already starting to experience: initiatives to create a digital nation by preserving its government infrastructure, geography, people and culture, which it believes is important in the event that its land is eventually lost to climate change and sea-level rise. He stressed that maintaining the integrity of what they aim to achieve, while going through this transformational phase, takes a values-based approach: fusing the positive commonalities that arise out of ICT development with the unique cultural values of their community-based society, so that ICT strengthens rather than undermines those values and can contribute to climate-resilient development and the adoption of innovation to combat climate change impacts.

Leticia Messiah, from the Solomon Islands and current Board Chair of the Pacific Islands Chapter of the Internet Society (PICISOC), shared some of her takeaways from the SIDS IGF. Internet connectivity is crucial for economic development, access to information and opportunities, and service delivery. Digital transformation and digital readiness are under way in the small island developing states, and work is ongoing on e-health and a digital health information roadmap. Cybersecurity and online child protection policies are coming into place, along with accessible online services, inclusive access to devices, and ongoing online safety, digital literacy and awareness activities conducted by various ICT societies. There is a need to increase skill sets, create workforce space in the Pacific for digital health, build cybersecurity capacity, and have Pacific voices heard, since there is no representative seat on the IGF Leadership Panel. Universal and meaningful connectivity is defined as the possibility for everyone to enjoy a safe, satisfying, enriching, productive and affordable online experience.

Several activities took place this year. One was the Pacific Hackathon, which aims to find ICT-based solutions to a stated problem and connects young innovative technologists in the Pacific. There was also a "Girls in ICT" event that aims to create an environment that empowers and encourages girls and women to pursue careers in STEM fields, enabling both health and technology companies to realise the benefits of greater female participation in the ICT sector and other STEM sectors through collaboration and partnership. It also aims to build this network, increase female participation in developing ICT-based solutions in the region, and bring more female participants to future hackathon events.

Lastly, she highlighted that civil society organisations are the voices of local communities for those looking to invest in national or regional projects in SIDS, and called for full partnerships to collaborate, cooperate and work together in a multi-stakeholder approach to advance internet governance globally, in the SIDS, and especially in the Pacific island countries.

Selu Kauvaka, board member of PICISOC and current president of Tonga Women in ICT, presented what they are currently doing in Tonga. She discussed Tonga's challenges, the way forward and, above all, their vision: to allow women and girls to enter the industry without fear of discrimination and to give them the opportunity to take part in the industry's transformation. One of their main roles is reaching out to the grassroots level, helping people understand more about how the internet works, how to be safe online, and how to stay up to date with technology.

Dalsie Green Baniala presented a project to connect the most remote areas, overseen by the government of Vanuatu and implemented by the ITU, based in South Maliko, in the central islands of Vanuatu.

Rodney Taylor, Secretary General of the Caribbean Telecommunications Union (CTU), discussed the first Small Island Developing States Internet Governance Forum, held in partnership with the Pacific Islands and other stakeholders. The discussions were quite engaging, and the entire IGF was a three-in-one event, as it included the 18th Caribbean IGF (the CIGF has been running for 18 years). An Internet governance policy framework was developed to guide regional governments on internet issues in four core areas, including infrastructure, security and exchange points. The Youth IGF was held for the first time, planned by young people for young people in the region. It is important to collaborate, consult and discuss issues such as the Global Digital Compact, the internet and the environment, the energy consumption of large data centres and cloud services, and matters like cryptocurrencies and Bitcoin mining. There is a need to look for sustainable and renewable sources of energy and to deal with e-waste in a sustainable way; this is all part of the climate change discussion. The region's voices need to be amplified within international ICT policy development processes such as the ITU, and the SIDS IGF gives that opportunity by strengthening the numbers and raising those voices within these organisations.

What can we do next and how can we collaborate? Build disaster mitigation, preparedness, response and recovery capacity, and look at strategies to make infrastructure much more resilient, especially in the context of Internet governance. Submit proposals for next year so the Caribbean and the Pacific become active and have their voices heard. Be more visible, include youth, and bring more people to the IGF. Language is also often a barrier across the regions, so looking at common languages other than English is important.

IGF 2022 Town Hall #81 Digital Rights Learning Exchange

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Promoting an open and democratic Internet will take everyone, not just well-established professionals and elites, but also people from different sectors and geographies. Hence, it is crucial to support entry-level digital advocates and provide them with opportunities to make an impact.

Calls to Action

Connect with digital rights advocates from the Digital Rights Learning Exchange program for collaborations on Internet advocacy projects.

Session Report

 

The session aimed at discussing challenges faced by digital rights advocates and spotlighting the Digital Rights Learning Exchange (DRLX) program that Digital Grassroots held in cooperation with the Open Internet for Democracy Initiative. During the opening of the session, the onsite moderator and Digital Grassroots co-founder, Uffa Modey, introduced the objectives of the program, highlighting the importance of providing capacity-building for digital rights advocates coming from underrepresented communities. As part of the program overview, one of the program leads and Digital Grassroots founder, Esther Mwema, presented the program's core components and highlights from participants' feedback, mentioning that the majority of program alumni found it particularly beneficial to work with other advocates from different regions on developing a campaign and to learn from each other over the course of the program.

Sarah Moulton, the deputy director for the National Democratic Institute, discussed how the Digital Rights Learning Exchange came to life as a learning space for emerging digital rights activists starting their advocacy work to acquire basic skills across such areas as stakeholder mapping, communications, digital safety, and security. The panelist also emphasized the lack of foundational programs that can help budding digital advocates start and lead their advocacy projects and the need to support activists at the entry level. 

After the program overview, the word was given to a DRLX alumna Fatou Sarr, who shared her experience of participating in the program. Fatou acknowledged the importance of the participatory and interactive nature of the program, where participants have space to engage with hands-on learning materials and build connections with fellow participants and advocates from different backgrounds.

During the open floor, the program leads were asked about the commonalities of the participants in the program cohorts. It was noted that coming from different countries, the participants were able to find common ground for cross-country collaboration and take a regional perspective on the hypothetical advocacy campaigns that they were working on during the program. Another feature of the program spotlighted by the program leads was participants' interest in particular thematic areas, such as access and affordability, freedom of expression, and internet shutdowns. The panelists also covered the issue of program sustainability, stressing the significance of building networks between the participants and hosting organizations and offering alumni different pathways to engage after the program, i.e. as project mentors and guest speakers.

The session was attended by 12 online participants, who had an opportunity to engage in discussion with onsite panelists. The participants came from different backgrounds, including individuals interested in digital rights training programs, civil society representatives working on digital issues, and IT specialists. Questions were asked on the Zoom platform and on the social media channels of Digital Grassroots by participants following the session live stream on YouTube.

 

IGF 2022 WS #475 Balancing Digital Sovereignty and the Splinternet

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

There is a need to define more precisely what internet fragmentation is, but we are currently far from this consensus.

Calls to Action

There is a need for more dialogue among stakeholders with diverse positions, especially to define if there is such a thing as fragmentation on higher layers of the Internet

Session Report

There is great difficulty in discussing digital sovereignty and Internet fragmentation today due to the lack of common grammar and concepts among the different actors that address the issue. This makes debating the relationship between these two concepts even more difficult, which is why it is necessary to maintain a continuous dialogue between the various stakeholders involved to try to achieve solutions and cooperation.

There is a perception that people talk about fragmentation at different layers of the internet architecture. The most worrying kind of fragmentation happens at the transport layer. It is still not an immediate major threat, and previous attempts have failed, but such acts, like those from China, may even redesign core internet protocols. If these ideas and actions spread, in the medium term this could threaten the Internet as we know it today. Governments should be careful to avoid creating legislation that imposes obligations on Internet actors that may create profound barriers to access, and this concern requires appropriate assessments. Some experts argue that digital sovereignty will inevitably lead to Internet fragmentation, because the traditional sovereignty exerted by states within their borders does not apply well to cyberspace.

However, one should not ignore that states have legitimate interests in regulating in ways that do not interfere with the adequate functioning of the Internet, especially regarding concerns arising from the Global South or those that take into great consideration the protection of fundamental human rights that may conflict with technological and economic development. People should not confuse these core values with features that can be hindered by regulations or generate costs for companies, since this is just a shift in priorities. Market actors clearly prefer the idea of no regulation, but this behaviour may harm other stakeholders.

There is no clear consensus on whether we can define content issues as aspects of internet fragmentation. It should depend on the specifics of what happened, such as censorship acts. This is particularly important because there is a legitimate discussion at the application (or content) layer, debating whether such regulation is problematic, leading to dangers such as communication and discourse control by governments, or necessary to protect citizens, national and individual security, or even democracy.

The session stressed the need for continued dialogue between actors not only from different sectors, but also from different regions, to align, bit by bit, the concepts that are being used in the global debate.

IGF 2022 WS #502 Platform regulation: perspectives from the Global South

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

There is consensus among stakeholders that regulating digital platforms is necessary since, despite being private companies, they have a growing importance in the public sphere, influencing democracies, human rights, and people's lives. Regulation must follow international human rights parameters, and the debate on the subject must involve governments, private companies, civil society, and the technical and academic community.


The debate must move forward regarding how regulation should be carried out. First, while some agents think that existing rules, such as consumer protection and personal data protection, are sufficient, others believe that new regulations are needed to advance on issues such as content moderation, transparency, and competition. Secondly, it is necessary to debate the coexistence of global norms and parameters and local legislation.

Calls to Action

National States must formulate antitrust and anti-concentration policies for digital spaces, with mechanisms to guarantee plurality and diversity, aiming at the constitution of a balanced digital ecosystem that respects freedom of expression, privacy, and personal data protection.


Global platforms must have representatives in the countries of the Global South and consider local laws and regulations in their operating policies.

Session Report

The workshop "Platform regulation: perspectives from the Global South" brought together representatives of different sectors. Speakers agreed on the need for the digital environment to be regulated but differed about the scope and nature of regulation. They also agreed that new laws must not contradict human rights and must respect international standards.

For Raúl Echeberría, executive director of the Latin American Internet Association, regulation is not the only answer to every problem. Different policies can impact regions differently, so we need other principles and measures, considering variations in each society. In his words, we need innovative public policies, international instruments, multistakeholder agreements, and good practices for companies and society.

Echeberría says that companies are not against regulations, but that in the digital world there are already many regulations to which platforms are subject, such as consumer protection and personal data protection laws. Thus, he thinks the focus of the discussion should be how to improve the scope of regulation.

He also believes that the sectors must agree on some aspects. He says there are two groups of thought: one that thinks platforms should do much more to moderate content, avoiding hate speech, online violence, etc., and the group that believes that there should be no moderation.

To resolve the fake news issue, Echeberría says we need to avoid criminalizing platforms and focus on the responsibility of other sectors, for example, governments, as the digital world is a representation of the real world.

Brazilian federal deputy Orlando Silva, who is a rapporteur for the Internet Freedom, Responsibility and Transparency Bill (PL 2630/2020), a benchmark in Latin America, brings another point of view. Silva believes few rules protect human rights and the public interest in the digital environment. He highlighted the importance of regulating the internet because it cannot continue as a territory where the rules only serve to promote the profit of big techs.

For the deputy, the rules must exist to protect freedom of expression; digital platforms can not operate according to their private criteria but according to publicly agreed standards. Bringing the example of the presidential elections in Brazil, Silva said that the moderation of content that platforms do without public regulation puts freedom of expression, privacy, free expression of conscience, and democracy at risk.

Silva also says that, despite the need for national laws adapted to the reality of each country, principles, concepts, and parameters are necessary on a global, international level. He also stressed that the presence of civil society in the debates for the definition of these global parameters is fundamental. He believes that the regulations to be made should focus above all on the transparency obligations of digital platforms.

Agreeing with Raul, the deputy thinks that the quality of the public debate is not just a problem for the platforms but that public leaders have to be more responsible for the content they post and be sanctioned since they are accounts of public interest. He also thinks that the German experience of combating hate speech, based on "regulated self-regulation," should be a reference for the debate.

Representing civil society, Lillian Nalwoga, president of the Internet Society's Uganda Chapter and Policy Officer at the Collaboration on International ICT Policy in East and Southern Africa (CIPESA), said that in Africa, investments in digital platforms are increasing. Many people who connect to the internet do so through social media such as Facebook and Whatsapp, even before using search engines such as Google.

In this context, she said many platforms allow human rights violations and disinformation while also moderating content, so it is necessary to talk about regulation. On the other hand, some governments are going to extremes that violate freedom of expression, such as blocking and shutting down social media.

Nalwoga also said it is essential to talk about jurisdiction because today, when there is a problem, people in Africa have to resort to the country where the platform is located. So it would be necessary for each of these platforms to have local offices in each of the nations in which they operate.

Lillian also said that civil society is trying to understand how regulations in the United States and Europe will impact Africa and what regional approach civil society can take to ensure that law is in line with international human rights standards.

She also said that in Africa, they are talking not only about content moderation, disinformation, and hate speech, but other aspects, such as the taxation of platforms, which already happens in Europe.

Representing the technical community, Paulo Victor Melo, researcher at the Institute of Communication at Universidade Nova de Lisboa and assistant professor at the Faculty of Design, Technology and Communication/European University, said that if we talk about regulating digital platforms, we must first understand the support structure through which these platforms operate, which some authors call "digital colonialism." For Melo, the colonialist logic is visible, for example, in technovigilance policies, which segregate the public space and reinforce punitive practices in public safety. It is also present in the illegal extraction of minerals in the Amazon, mainly through the invasion of protected indigenous lands, and in the production of an industry of slavery, death, and environmental destruction, for example in the Congo, where the ores used to manufacture digital devices such as smartphones and computers are extracted.

For Melo, from the point of view of the Global South, regulation must not serve, on the one hand, authoritarian government projects: fascist governments use digital platforms as echo chambers to promote disinformation and hate speech. On the other hand, it must prevent platforms from continuing to determine the "rules of the game," as this has allowed hate speech, especially against minorities. Despite being private companies, digital platforms operate in the digital public space, which is not separated from the public space, people, and territories.

The professor highlighted that in countries with little diversity and plurality in the media, it is essential to have legislation to encourage competition, avoid monopolies and oligopolies of the large digital platforms, and encourage balance in the digital ecosystem. He also says that regulation needs to consider that the digital environment is dynamic, as well as the specificities of each context, for example in countries where internet connectivity is low and where there is no digital literacy. Finally, the researcher believes that no democratic regulation can occur without the active participation of the territories and of the communities that inhabit them and are affected by platform actions, from mining to hate speech.

IGF 2022 WS #458 Do Diverging Platform Regulations Risk an Open Internet?

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

The elaboration and enforcement of global standards may pave the way for greater alignment in regulating digital platforms. Their feasibility and desirability, however, remain much contested. The development of standards, and more generally deliberations on platform regulations, must be done against the democratic context of each country and region, in addition to their respective political, socio-cultural, legal and historical backgrounds.


In addition, platform regulations bear great importance in shaping the power balance between governments, ‘big tech’ companies, civil society and everyday users of platforms. Greater resources must be dedicated to promoting and facilitating honest and inclusive multi-stakeholder discussions; protecting digital platforms as an open and neutral civic space; and ultimately fostering a healthy digital ecosystem for all.

Session Report

Background

The last few years have seen a plethora of new laws and proposals which would regulate online platforms and other internet intermediaries. In the absence of any international frameworks or consensus on how to govern intermediaries, many of these laws and proposals diverge widely. This is leading to potential impacts on intermediaries’ ability to operate globally, barriers to entry for new intermediaries in already concentrated markets, and risks to freedom of expression and other human rights.

Chatham House, along with Global Partners Digital, seeks to better understand the regulatory landscape around digital platform governance, and how this varies between regions. To this end, Chatham House convened a workshop at the 2022 Internet Governance Forum, bringing together experts and practitioners from Latin America, Africa, Europe, South Asia and South-East Asia to discuss and better understand the various regulatory approaches, and to extract common themes between them. The discussion sought to communicate and share this understanding widely and highlight areas which could benefit from further investigation and exploration.

The discussion focused on several questions, including: What forms of online platform regulation are emerging in different parts of the world, and in what ways do they diverge? What is the local democratic context in each region? What does a human rights-based approach to regulation mean, and is it measurable through reviewing legislation? What risks does policy divergence pose to an open and interoperable internet, as well as to human rights, and how can these risks be mitigated? What opportunities are there for encouraging harmonisation and consensus?

Key Regulatory Trends Across Regions

The European region is often perceived as pioneering the regulatory landscape surrounding digital platform governance. Legislation both at the European Union (EU) level (e.g., the Digital Services Act, DSA) and at the national level (e.g., Germany’s Network Enforcement Act) often serves as an example shaping regulations in other countries. Within the context of the EU, the adoption of the DSA is hailed as a particular success; it requires, among other things, platforms to have clear terms of service and redress systems in place for users and to publish transparency reports, and each member state to appoint a national independent regulator as the Digital Services Coordinator, likely fostering greater collaboration and information sharing across countries. Yet the success of the DSA will highly depend on implementation and enforcement, particularly in the light of human rights. Furthermore, beyond the EU bloc, concerns arise surrounding ‘outliers’ (in particular Belarus, Russia and Turkey) with regard to their non-alignment with the Act, in addition to vaguely worded restrictions on politicised content types (e.g., those offensive to public morality) and potential criminal sanctions on individual platform employees.

In Latin America, there is no established regulatory body producing regulations; this results in legislation varying from country to country, and there seems, at the moment, to be no appetite for alignment. Despite this fragmented regulatory landscape, one common approach across countries is to consider major platforms as holding great influence over social discussions; yet user experiences and harm (e.g., misinformation and abuse) are often overlooked. In this sense, there is a pattern in regulatory approaches where instruments now focus more on harm on social media platforms than on the bigger picture of internet regulation. With regard to the protection of freedom of expression, the Inter-American human rights system helps safeguard this right across the region.

Regulatory instruments in Africa have also shifted their focus in recent years: the main concern has changed from ICT access to heavily politicised legislation. The most common approach consists of exercising control over the platforms and, subsequently, their users, which provides greater power and protection for states and their respective governing regimes. This approach is in contrast with standards whose overarching aim is to provide and guarantee protection for all. Such control is, for example, reflected in the increased requirement, over the past 18 months, for platforms to formally register, thus paving the way for risks related to licensing, to these platforms’ accessibility to the people, and to proactive requirements by states in the realm of content moderation. In addition, issues surrounding non-compliance with human rights norms in the ‘offline’/real world bear influence online, as seen, for example, through the prevalence and normalisation of emergency laws and their effect on online platform governance.

The South Asian regulatory landscape is, at the moment, highly dynamic and evolves very quickly. It comprises not only legislation directly governing digital platforms, but also legislation indirectly affecting these platforms, their users’ activities, and the power of states and governments over them. In India, the dominant approach is characterised by a general sense of distrust of non-Indian platforms, while greater protection is provided for national platforms for fear of external influence on the civic space. Echoing, to a certain extent, the approach adopted in African countries, two draft bills were flagged as raising questions surrounding the government’s power and control over platforms: the Indian Telecommunication Bill, which establishes a licensing requirement and thus raises questions over an open and free internet; and the Digital Personal Data Protection Bill, which will expand the government’s surveillance powers. Pakistan’s regulatory landscape is also heavily focused on control, given that digital platform governance is framed around criminal law, with a particular focus on exercising control over dissidents. Concerns also arise with regard to the mandate conferred on regulators to interpret constitutional provisions, as they may often overstep the role of judges. Nevertheless, in both countries, multistakeholder advocacy efforts at preserving human rights and an open, free internet bear strength and influence over the regulatory landscape.

Commonalities and Question Marks

  1. In the absence of a supra-national regulatory body (akin to the European Union), the alignment and eventual harmonisation, within a region, of regulations governing digital platforms remain a challenge. Whether such harmonisation is, at all, desirable remains, however, debatable: in the light of countries’ respective priorities, legislative landscape and regulators’ varying mandates, the adoption of global standards working for all constitutes a challenge. This fragmented landscape makes it difficult for digital platforms to navigate different, and sometimes competing regulatory instruments across countries; especially with regards to enforcement and implementation. 
  2. Concerns arise surrounding the increase in regulatory tools conferring the responsibility to moderate and respond to online content on the platforms themselves (in contrast with an independent regulatory body); oftentimes threatening, if not ‘hostage-taking’, platform employees with risks of individual criminal liability, while paving the way for deteriorating compliance with human rights norms. 
  3. Digital platforms remain, at times, the last civic space available and accessible to all. This is particularly in the light of licensing requirements and other restrictions surrounding other forms of media (e.g., the radio, television broadcasting, etc.); thus, in certain countries, these platforms ought to be maintained as the ‘last fortress’ to enable open, democratic and participatory civic engagement.

Risk Mitigation & Solutions

  1. Discussions and deliberations surrounding the regulation of digital platforms, as well as the eventual establishment of international standards and other soft law must be inclusive and of multistakeholder nature. There is a particular desire for governments to demonstrate greater political will in engaging and including civil society in shaping the regulatory landscape surrounding digital platforms. 
  2. Stakeholders with significant resources must facilitate and pave the way for inclusive and multistakeholder discussions and fora, in addition to leveraging these resources to improve the general understanding, across stakeholders, on the dynamics and trends surrounding platform regulation.  
  3. Governance deliberations and analyses must take into account local democratic contexts. These include, for example, local laws and customs, socio-political realities on the ground, human rights approaches, as well as the power relationships between the state and the people. 
  4. There is a need for digital platforms, in particular those bearing great presence over the population (e.g., Meta), to acknowledge the important role and influence they have; exercise responsibility in their approach to content moderation while preserving and safeguarding human rights norms; and show and exercise greater equity in the way they engage with users across regions. 
IGF 2022 WS #411 Move Fast and Fix Policy! Advocacy in an era of rapid change

Updated:
Connecting All People and Safeguarding Human Rights
Session Report

 

On Friday, December 2, the Center for International Media Assistance (CIMA), the Center for International Private Enterprise (CIPE), and the National Democratic Institute (NDI) hosted a hybrid roundtable discussion, “Move Fast and Fix Policy! Advocacy in an era of rapid change.” 

The objective of the session was to discuss processes and mechanisms for diverse stakeholder groups to provide input on proposed digital policies. The session aimed to explore different modalities to provide this meaningful input. Participants heard from two alumni of the Open Internet for Democracy Leaders Program, Paola Galvez (University of Oxford) and Catherine Wanjiru Muya (Article 19), as well as Mira Milosevic (Global Forum for Media Development) and Constance Bommelaer (Project Liberty’s McCourt Institute). Daniel O’Maley (CIMA) moderated. 

 

Key takeaways

This insightful one-hour discussion focused on the need for inclusive, participatory, and long-term approaches in the development and implementation of policies that impact the digital space. From this session, two main takeaways emerged:

 

Firstly, speakers agreed that successful policy change in an era of rapid digital transformation requires building a foundation of trust between all involved stakeholders. Recommendations for accomplishing this include establishing ongoing, meaningful engagement opportunities (avoiding one-off meetings and events), identifying neutral mechanisms and spaces in which to collaborate, and ensuring participants are informed and have access to the information they need ahead of time in order to provide meaningful inputs.

 

Secondly, when working on policy change initiatives, it is important to conduct a regional mapping of experts and stakeholders. Civil society in particular tends to be treated as monolithic, when in fact there is typically a diversity of opinions and perspectives among that stakeholder group.

 

Identifying the problem: knowledge gaps and disconnected legislation

Mira Milosevic, who leads an international network of journalism support and media development organizations, noted that amid accelerating change in the digital policy landscape, the media sector is facing a huge gap in capacity and knowledge. She pointed to a lack of experience, evidence, research, and understanding of different policies and decisions. 

 

Despite these changes in the landscape, laws are being brought forward and collective action is contributing to policy spaces. And although some legislation can benefit users with regard to privacy, Catherine Muya highlighted that legislation can be cumbersome for editors or, in countries with repressive governments, harmful for news outlets, especially freelancers without support.

 

During the question and answer portion, virtual and in-person audiences asked about the issue of speed in developing legislation. Throughout the conversation speakers addressed the process of engagement, but when course correction and feedback loops are factored in, how can stakeholders arrive at a timely solution without letting policies fall behind? Milosevic acknowledged that there are no mechanisms in place to act quickly, given that there needs to be more transparency from key actors such as platforms and big companies. The session left audiences with food for thought.

 

Tackling policy gaps through multi stakeholder approaches

Digital policy takes form in various countries with different governance structures, and there is no one-size-fits-all approach. Paola Galvez, a Peruvian lawyer, said that just because teams are working on understanding legislation does not mean they have adequate knowledge of the digital ecosystem or embrace multistakeholderism. “Building trust and having meaningful relationships with all stakeholders is key. I didn’t realize that at first, but when we tried to approach stakeholders to do roundtables and workshops, that’s very practical, but we need to take a step back and figure out how I am building trust with these stakeholders.”
Reflecting on existing fora

On the subject of building “trust,” Milosevic argued that to help build trust over time, there needs to be a space for different stakeholders to come together on a permanent basis, while respecting the confidentiality and safety of the issues discussed. Ad hoc and last-minute engagements are “not enough.”
Constance Bommelaer, whose organization ensures that digital governance is prioritized in development of new tech and embedded in the next generation of the web, pointed to the IGF as a multistakeholder example where representatives of civil society, consumer societies, youth audiences, and a diverse set of countries come together. She noted that it’s important to not only participate but also support and create impartial platforms such as the IGF where learning can occur. Catherine Muya quickly jumped in after this to agree that when contributions come from both civil society and the private sector, the government can take issues more seriously. She pointed to the time when a strong community of artists came together in Kenya to educate legislators on the impact of a copyright bill. 
Participating with a hybrid audience

In terms of participation, the session drew an online audience from a wide range of countries outside of Africa, including India, the Philippines, North Macedonia, and Armenia. Questions both online and in person came from individuals at organizations that work with the private sector, civil society, and technology education. Guy Berger, former director of Policies and Strategies regarding Communication and Information at UNESCO, live-tweeted the event to his network and highlighted Bommelaer’s comments on transparency and compliance for existing regulation and Muya’s comments on the need for evidence-based advocacy to speed up policymaking. In-person and online participants also noted that the title of the session intrigued them.
IGF 2022 WS #254 Trustworthy data flows: building towards common principles

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

Cross-border data flows underpin today’s business, government & societal functions. Processing & transfer of personal data are integral to these, making trust a vital element for sustainable growth. Trust is eroding over concerns that government demands to access data may conflict with universal human rights & freedoms or cause conflicts with domestic laws when access transcends borders.


High-level principles & safeguards on government access are a much-needed foundation towards scalable measures and global dialogues.

Calls to Action

Principles for trusted government access must be based on international human rights law, which may demand protections that some countries do not currently have in place. (see more on these principles in the ICC White Paper on Trusted Government Access)


Such principles must lead to effective multilateral and multistakeholder collaboration to foster interoperable approaches and legal certainty to enable data to be exchanged and used in a trusted manner, thereby aiming for high privacy standards.

Session Report
Introduction

Global data flows are at the heart of the world’s economy and well-being. This became evident with the COVID-19 pandemic lockdowns in 2020, when companies of all sizes, across all sectors around the world enabled remote work and transitioned their businesses to online-first or online-only. Data transfers are estimated to contribute $2.8 trillion to global GDP—a share that exceeds the global trade in goods and is expected to grow to $11 trillion by 2025.

Companies rely on these flows to conduct day-to-day business with customers, partners, and suppliers; innovate in their business and operations; detect cyber threats and intrusion patterns; and compete more effectively – in sectors as diverse as agriculture, healthcare, manufacturing, banking and shipping. In 2021, the G7 also emphasized the importance of cross-border transfers, noting that the “ability to move and protect data across borders is essential for economic growth and innovation.”

However, trust in international data flows is being eroded over concerns that government demands to access data for criminal and national security purposes may conflict with universal human rights and freedoms, including privacy rights, or may conflict with other national laws when such data transcends borders. These concerns have led to uncertainty that may discourage individuals’, businesses’, and even governments’ participation in a global economy, negatively impacting inclusive and sustainable economic growth. 

Key takeaways

Participants in the session agreed that governments need to access personal data to protect public safety and national security, but warned that access without safeguards inevitably leads to abuse, violations of individuals’ fundamental rights, and a loss of trust in data flows.

The session highlighted the need for cooperation between governments and stakeholders, including business and multilateral organizations, to develop interoperable policy frameworks that would facilitate cross-border data flows and enable data to be exchanged and used in a trusted manner, thereby aiming for high privacy standards. The OECD effort to define principles and safeguards for government access to personal data held by the private sector was referenced as an example that could provide a firm foundation for data free flow with trust.

The conversation also highlighted various factors that impact trust when governments request access to private sector data. For example, legal barriers to data transfers can arise from differences in laws governing government access and discrepancies in safeguards when data transcends borders. In addition, companies that receive government requests for data they hold must decide (a) whether the demand is lawful; (b) whether any cross-border demand presents a conflict of law between jurisdictions in which they operate; (c) how much data they are compelled to disclose; and (d) what information about their responses to these demands may be disclosed to customers and the public. These concerns also contribute significantly to the public sector’s reluctance to deploy digital technologies broadly, for fear of exposing public sector data to third-party governments that may demand access.

Call to action

The session called for the creation of commonly agreed shared principles for trusted government access to ensure that government access to personal data is consistent with the protection of individual rights and the rule of law.

Such principles must lead to effective multilateral and multistakeholder collaboration to foster interoperable approaches and legal certainty to enable data to be exchanged and used in a trusted manner, thereby aiming for high privacy standards.

Policymakers should support open cross-border data flows, while also ensuring that users have adequate privacy, security, and IP protections and that those protections are implemented in a manner that is transparent, non-discriminatory, and not a disguised restriction on trade.


IGF 2022 WS #253 Towards Cyber Development Goals: Implementing Global Norms

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

Tension between the need to advance digital transformation versus the lack of a strong cybersecurity posture poses risks to achieving the SDGs and a safe, secure online environment. While doing more to increase the resilience of digital infrastructure is necessary, it is not sufficient. Translating existing international agreements into feasible actions is long overdue.


The international community should explore practical ways to mainstream cybersecurity capacity building into broader digital development efforts.

Calls to Action

To promote safe digital transformation during the Decade of Action and beyond, the international multistakeholder community should come together to agree and adopt Cyber Development Goals (CDGs) to mainstream cybersecurity into the development agenda.


Complementing the SDGs, CDGs can help define global benchmarks and practical activities to support countries in implementing universally endorsed UN norms, mobilize the UN Development System and stakeholders worldwide to achieve concrete goals, and facilitate coordination.

Session Report
Introduction

Accelerating digital transformation is essential to achieving the Sustainable Development Goals (SDGs). A secure, trusted, and inclusive digital infrastructure is the backbone of today’s economic and social development. With just over half of the world’s population connected to the Internet, closing the digital divide is essential to reducing inequalities and socioeconomic gaps between those with access to digital services and those without.

Digital transformation and the expansion of the digital ecosystem also come with increased cybersecurity risks, especially in low- and middle-income countries that may lack adequate cyber resilience against constantly evolving digital threats. This tension between the need to close digital divides and advance digital transformation versus the lack of a strong cybersecurity posture is a risk to achieving the SDGs and a threat to a safe, secure, and rights-respecting online environment.

Against this background, the workshop discussed how the international community could explore practical ways to mainstream cybersecurity capacity building (CCB) into broader digital development efforts, to empower and protect societies from the increased cybersecurity risks associated with digital transformation.

Key takeaways

Through an engaging and dynamic conversation, the panelists and audience members debated the idea of developing Cybersecurity Development Goals (CDGs): a set of aspirational and feasible goals to rally the international community to collaborate in closing digital divides, bolstering resilience by fostering access to digital transformation, and enabling the implementation of international law and norms to curtail malicious cyber activities.

The conversation confirmed the pressing need to mainstream cybersecurity into digital development to support a safe digital transformation and thus a better and more sustainable future for all. The idea of coalescing around shared goals received much support, highlighting the need to bring together various policy silos that work on cyber issues, as well as those working on development issues, to create a shared language and to build on the existing work, in particular the various capacity building initiatives.

All participants strongly underlined that any process to develop such shared goals must fundamentally hinge on trust and inclusiveness, and that its success will depend on the ability to translate goals, norms and international frameworks into understandable and practical action through a whole-of-government and whole-of-society approach.

Call to action

The session aimed at promoting a safe digital transformation during the Decade of Action and beyond. In particular, it called for the international multistakeholder community to come together to agree and adopt Cyber Development Goals (CDGs) to mainstream cybersecurity into the development agenda.

Complementing the SDGs, CDGs can help define global benchmarks and practical activities to support countries in implementing universally endorsed UN norms, mobilize the UN Development System and stakeholders worldwide to achieve concrete goals, and facilitate coordination.


IGF 2022 IS3C General Meeting: Recommendations to Make the Internet More Secure and Safer

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

The findings of the IoT and the Education and Skills research are best practices, in line with IS3C's goals. Key findings are applicable to all stakeholders. To go from theory to practice, the outcomes need to be acknowledged and deployed. Gender balance is still a problem: an extra effort is needed to reach female respondents and key persons' opinions. Our report is here: https://is3coalition.org/docs/study-report-is3c-cybersecurity-skills-ga…

Calls to Action

Outreach to all stakeholders to distribute IS3C's findings. Set up hubs and teams to discuss deployment of outcomes. Maintain relationships with related stakeholders, as a resource for future work. Reach out to female respondents and key persons from developing countries to improve the balance of the research. Cybersecurity and its related issues are long-term work; each day lost is another day lost.

Session Report

IS3C held its general meeting at the IGF in Addis Ababa, where it presented the results of work carried out in 2022 and looked ahead to the near future.

Working Group 1: Security by Design, subgroup on Internet of Things

Research has confirmed that there is a large gap between the theory of security and the daily practice of IoT security. The working group focuses on identifying the solutions needed to close this gap. The first results will be reviewed in December and January, after which the final report will be published in the winter of 2023. The IGF open process of consultation with stakeholders worldwide will be announced soon.

The WG's research focused on (a) a review of current security-related IoT initiatives and practices worldwide, and (b) the development of a coherent package of global recommendations and guidance for embedding security by design in the development of IoT devices and applications. The report will include the outcomes of research questions shared globally. One output of the research is a compilation of all the security best practices that could be collected from the documents. These best practices are divided into four categories: Privacy and Exposure; Update; Non-technical; and Operation/Community.

Attention is also given to the consumer side: what do consumers need to know about IoT security by design when they deal with an IoT-enabled device? Current labelling schemes have been compared to ascertain this. Once consumer knowledge is upgraded and consumers are fully equipped to use a device securely and are aware of their rights, focus shifts to the manufacturers of IoT devices and tools. Manufacturers will feel more strongly the obligation to make sure a device is in good condition and safe to use, as well as a growing responsibility to deliver security updates to the devices they manufacture.

A call to action was launched by chair Nicolas Fiumarelli to all stakeholders to participate in the open consultation process for the draft report.
Working Group 2: Education and Skills

A major factor undermining the development of a common culture of cybersecurity is that students graduating from tertiary ICT-related educational programmes often lack the skills that business and society as a whole need in order to understand the benefits of security-related Internet standards and ICT best practices. For ICT security to be better understood, it has to be integrated into tertiary ICT educational curricula, at all levels. This may result in the structural development of ICT(-related) products and services that include cyber security Internet standards and ICT best practices. The coalition’s Working Group 2 has therefore identified the following goals:

  • To detect and resolve cyber security skill gaps in tertiary ICT education curricula;
  • To encourage tertiary educational institutions to include in their ICT curricula the essential skills, knowledge and understanding of security-related Internet standards and ICT best practices, building on current best practices, in order to bring tertiary education in line with emerging workforce requirements;
  • To strengthen collaboration between educational decision-takers and policy makers in governments and industry in order to align tertiary ICT curricula with the requirements of our cyber future;
  • To ensure effective collaboration between key stakeholders in order to keep tertiary ICT educational materials in step with new technologies and standards and prevent new skills gaps from developing;
  • To make cyber security education more interesting to young people, and especially to women;
  • To make cyber security education part of life-long learning programmes.

The research used two methodologies: first, interviews with cybersecurity experts in multiple countries; and second, a questionnaire that was extensively distributed through internet governance fora. This resulted in input from 66 countries across all regions of the world.

The results show the gap between what people learn in formal education and what the cybersecurity industry needs. While the technical skills can be learned in formal education, employers also need soft skills, for example creativity and critical thinking. Respondents said there is a need for collaboration between education and industry, to ensure that knowledge becomes more compatible with employers’ demands. There is also a need for constant knowledge sharing from experts in cybersecurity.

The report 'Closing the gap between the needs of the cybersecurity industry and the skills of tertiary education graduates' was formally presented by WG 2 representative Teuntje Manders to MAG chair Paul Mitchell and to the research's sponsors, Mieke van Heesewijk of SIDN Fonds and Julia Piechna of NASK. It can be found here: https://is3coalition.org/docs/study-report-is3c-cybersecurity-skills-gap/, on IS3C’s website.

The working group on Data Governance and security was not able to present but will present its report in the winter of 2023.
Global Digital Compact

IS3C has launched a special working group for its response to the Global Digital Compact. Dr. Allison Wylde leads this body of work, which will reflect the outcomes and work underway within IS3C that ought to become part of the GDC, to ensure a more secure and safer Internet, and thus a more secure and safer world.
The future

Two working groups will start their work in 2023: one on procurement and supply chain management, and one on a prioritisation list for procuring secure-by-design ICTs. Others are in the process of formulating their mission statements: post-quantum encryption; a working group aiming to offer a roadmap for anticipatory governance strategies for emerging technologies, initially focusing on AI and quantum technology; consumer protection and advocacy; and, finally, a working group focusing on the barriers preventing deployment of three standards: DNSSEC, RPKI and IPv6.

Everyone is invited to join and/or support the upcoming body of work that IS3C endeavours to undertake in 2023.

IGF 2022 Open Forum #48 Internet Society Open Forum "Protecting the Internet"

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

Internet fragmentation is a theme for good reason: this year we have seen an increasing number of government decisions driven by geopolitics that are raising greater concerns that could lead to the splinternet.


We all (stakeholders) have a responsibility to protect the Internet from fragmentation.

Calls to Action

Join the Internet Society movement to protect the Internet

Session Report

Session Notes

Augustina and Andrew introduced the topic and speakers.

Panelists' discussion on the topic:

  1. Natalie
    1. Referred to knitting as an analogy for appreciating the effort put into building the internet and making it such an incredible resource. She emphasized that stakeholders can work together to build a bigger, stronger and more resilient internet by following a simple pattern that is bigger than anyone, as it brings value through global connection.
    2. We all have a responsibility to protect the internet from fragmentation.
    3. The internet has seen concerning trends that can lead to a future we do not want (the splinternet).
    4. The internet is built on a foundation of critical properties that together form the internet’s way of working. She compared this to a business model for the internet: it is the simple foundation on which the internet exists, and it is what separates the internet from other types of networks, such as an office network.
    5. Referencing the knitting analogy, the internet is a simple pattern that enables any network to become part of the global sweater that benefits all.
    6. The internet is not just about technology; every network that wants to participate in the internet must adhere to the foundation that enables us to be globally connected.
    7. The splinternet is the opposite of the internet: the idea of the open, globally connected internet that we all use splintering into a collection of islands that do not connect to each other.
    8. We are worried about the splinternet because businesses, governments and organisations are increasingly making decisions that can undermine how the internet works, and knowingly or unknowingly start to unravel the incredible resource that we have put so much effort into creating.
    9. Internet fragmentation is a theme for good reason. This year we have seen an increasing number of government decisions driven by geopolitics that are raising greater concerns that could lead to the splinternet.
    10. The splinternet would complicate our ability to communicate with each other by fragmenting the internet into separate networks that do not work together easily. Having Zoom calls would be difficult, and people might have to pay to work on a shared document.
    11. There are many causes of the splinternet, including:
      1. Internet shutdowns: when a government tries to disconnect the networks within its borders from the internet, there are serious consequences for its citizens. It is like unravelling a sleeve from the sweater and disconnecting from the global resource (the internet).
      2. Political decisions that could lead to the splinternet. For example, in the Ukraine war there have been calls to disconnect other networks from the internet. This goes against the principles that make the internet thrive.
      3. Policy and business decisions that do not protect the internet and its existence. Governments are tackling hard and complicated issues like misinformation, disinformation, and online harm, but in trying to mitigate these harms they propose decisions that do not take into account the impact on the internet and what makes it thrive.
    12. To protect the internet, the Internet Society has created an impact assessment toolkit, which is like an environmental impact assessment but for the internet. The toolkit is based on two white papers:
      1. The critical properties of the internet’s way of networking, which establish the internet; and
      2. The enablers of a globally accessible, secure, and trustworthy internet.
    13. The Internet Society has been using the toolkit to analyze decisions and proposals around the world, to understand how they impact the internet and to educate businesses and governments on how to mitigate these harms. This has allowed the Internet Society to collaborate with the community and has led to conversations with the governments and businesses concerned.
  2. Emmanuel
    1. Heads the #deargov organization in Nigeria.
    2. He has applied the Internet Society toolkit, which makes it easy to understand the incentives and motivations of governments around the world, particularly in Africa, that continue to seek pathways for consolidating interests and control over the internet and data resources.
    3. He raised the concern that bits of the internet are being controlled without regard for the long-term effect and the impact on the open model of the internet.
    4. His organization has been supporting the Nigerian government through policy recommendations. Recently it worked on two policies or regulations that the government sought to propose:
      1. The Social Media Bill, where they recommended that the government address the impact of the regulation on the interconnected model of the internet.
      2. The 2021 Twitter ban, where they looked at the long-run economic implications of the ban.
    5. Governments usually have genuine intentions and interests in tackling internet issues, e.g. cybercrime, child pornography, hate speech, misinformation, and disinformation. However, the approach is the problem: there is a disconnect between the intentions and the drafting of policies and laws, which leaves stakeholders feeling that their interests and concerns are undermined in the drafting of the proposals. Thus, the #deargov organization works with governments and stakeholders to build a multistakeholder perspective that allows all stakeholders to see the entirety of what a regulation proposes in terms of human rights and accessibility of the internet, and that does not favour certain players over others.
  3. Mirja
    1. The Internet Architecture Board (IAB) is one of the three leadership groups of the IETF, the main organization tasked with developing and maintaining many of the internet protocols.
    2. The role of the IAB is to provide architectural oversight: not just of the protocols, but trying to see the big picture of how everything works together, identifying gaps, and having conversations with the relevant organisations to fill those gaps.
    3. It is also a contact point for standards organisations.
    4. The IAB is a group of experts who sometimes have differing opinions, but the group monitors and discusses the global development of internet governance, its impact on the internet, and the quest for digital sovereignty.
    5. The IETF’s mission is to make the internet better. But if you look at its mission statement, it defines ‘better’ in terms of where the internet comes from. The mission statement reads: ‘We want to make the internet useful for communities that share our commitment to openness and fairness.’ That is what the internet is based on and what the IETF is committed to.
    6. The success of the internet comes from designing it around a set of principles and creating building blocks that can be combined in different ways. This has enabled the innovation on the internet and the many services that have bloomed over time, and it is one of the base principles for maintaining the protocols and technology that underpin the internet.
    7. Deployment of new technology (protocols) gets blocked in the name of digital sovereignty. This is concerning, especially when the protocols are supposed to provide better security and user privacy. It is also concerning that the blocking happens at the level of internet infrastructure, fundamentally affecting internet connectivity and interoperability. Further, it limits innovation.
    8. The internet is designed as a global network of networks, and trying to enforce measures that set national boundaries on it goes against some of its basic design principles and puts the future of the internet at risk. Thus, we should all work together to keep it as one open and globally connected internet.
  4. Noelle
    1. Noted that there are many paths to internet fragmentation and focused on one: the path driven by governments that want to exercise sovereignty over how the internet works within their borders, referred to as digital sovereignty, internet sovereignty, or tech sovereignty.
    2. This has been a subject of the Internet Society’s work this year. Digital sovereignty means a lot of different things to different people in different countries, so it is unwise to simply equate digital sovereignty with internet fragmentation; some well-meaning people use the term and express support for digital sovereignty.
    3. There is, however, one approach to digital sovereignty that could fragment the internet. One reason a government or State wants to assert sovereignty in the digital space is concern about national security: securing the digital space within its borders is seen as a way of making the country more secure. This becomes a threat to the internet when the State implements it by giving itself the power to control how the internet works locally, taking a greater hand in managing internet infrastructure or directing how networks operate; for example, when a government agency wants to control the flow of traffic within the country, or to and from the country, it comes up with its own routing policies.
    4. Another example is when a government tells everyone to synchronize the clocks in their ICT systems to only one source (the government’s servers). Typically, systems synchronize their clocks with multiple time sources to minimize the risk of getting the wrong time; that is part of what makes the internet robust and resilient.
    5. From these examples, we can see an attempt to centralize processes and mechanisms that are decentralized and distributed on the internet.
    6. A final example is one country requiring operators to use DNS resolvers that are controlled by the government. This allows the government to change how name resolution works in the country or to create an alternative to the global DNS, which would prompt fragmentation in the global network of networks. Imagine if all governments were doing this.
    7. She invited everyone to read the Internet Society’s upcoming report, ‘Navigating Digital Sovereignty and Its Impact on the Internet’, coming out on Thursday, 1 December 2022.

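The point about multiple time sources can be illustrated with a short sketch (illustrative only, not from the session; the function name and offset values are invented). A client that accepts a single mandated time server takes whatever that server reports, while a client that consults several independent sources can discard an outlier:

```python
# Illustrative sketch: why syncing to several time sources is more robust
# than a single mandated server. (Names and values are invented.)
from statistics import median

def robust_offset(offsets_seconds):
    """Choose a clock offset from the answers of several time servers.

    Taking the median lets a majority of honest sources outvote a wrong
    (or tampered-with) one; a single mandated server offers no such check.
    """
    if not offsets_seconds:
        raise ValueError("no time sources available")
    return median(offsets_seconds)

# Three independent servers agree to within milliseconds; one is off by a minute.
honest = [0.012, 0.015, 0.011]
print(robust_offset(honest + [60.0]))  # close to the honest servers' answers
print(robust_offset([60.0]))           # with one source, the bad answer wins
```

Real NTP clients apply far more elaborate selection and clustering algorithms, but the underlying idea is the same: resilience comes from consulting multiple independent sources.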
IGF 2022 Town Hall #100 Who is being left behind by Internet governance?

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

The multistakeholder model shaped for Internet governance appears insufficient to guarantee that different realities are taken into account. Power asymmetries and gaps between the Global South and North are among the characteristics that show the limits of the agenda developed by the current hegemonic Internet governance stakeholders.


It is necessary to think about technology from the territories and different realities. Additionally, legal frameworks developed within the Global North should not be replicated uncritically in other regions; the risk of reproducing colonial relations should be taken into consideration. Providing Internet connectivity, for instance, is neither sufficient nor mandatory to guarantee digital rights.

Calls to Action

States and companies should take meaningful connectivity into consideration, in addition to respecting autonomous decisions on how to exercise digital rights, so that digital rights are not reduced to a relationship between companies and consumers. Groups that do not traditionally use technology and the Internet (or do not use them at all) should be allowed to participate in this debate as well.


The current Internet governance stakeholders - academia, the private sector, governments, and companies - should foster participation in the corresponding debates by social movements and human rights organizations that are underrepresented in that sphere.

Session Report

The panel took place on December 2nd, from 09:00 to 10:00 (UTC). Its main objective was to provide a constructive approach to Internet Governance. In this context, the session focused on how digital policies affect organizations and members of social movements that deal with vulnerable populations, even though those groups do not necessarily participate in the formulation of those policies or have their voices and demands heard during it.

The first panelist, Vladimir Cortés, Digital Rights Program Officer at Article 19 Mexico and Central America, pointed out that Internet Governance is complex. There are different elements to be considered, namely: (i) the multistakeholder nature of Internet Governance, involving the private sector, academia, and the technical community; (ii) the decentralized way in which Internet Governance is organized and how it reaches different entities, beyond the different official forums; (iii) the fluidity of the theme, which constantly changes. Furthermore, there is the issue of overlapping mandates on regulation and governance in this area, which also presents ambiguity and a lack of definition in terms of hierarchy. With all this in mind, he pointed to the urgent need to involve in the Internet Governance debate not only governments or stakeholders that represent privileged groups, but also historically marginalized groups and other stakeholders, such as non-professionals, ethnic minorities, women, youth, non-English speakers, and people with disabilities, among others. He concluded that there must be an effort to take this debate to a different sphere, through an effectively democratic and inclusive process.

The second panelist, Nandini Chami (IT for Change Senior Research Associate), spoke about the difficulty of defining the concept of governance, since the increased use of the Internet brings new challenges that go beyond technical cooperation. The growing use of data and resources changes the structures of the economy and society. She noted that the UN Secretary-General has already pointed out the need to deal not only with connectivity and technology, but also with the promotion of human rights, AI regulation, and data commons in this new scenario. The issue to be addressed is that this new economy and society agenda ends up being shaped by corporations and their interests, which is aggravated by the absence of binding rules, especially for cross-border data flows. This can enhance data colonialism and data extractivism, aggravating human rights violations. Although there are forums that seek a multiplicity of points of view, large corporations “capture” multistakeholderism, stratifying participation and making their agenda prevail, without a truly democratic process. There are ways, however, to change this scenario, by allowing solutions focused on people, and not only on the interests of big tech companies. Additionally, with regard to AI regulation, it is important to consider the needs of the Global South rather than simply replicating the governance that exists in the Global North, since different regions deal with human rights and cultural identity issues in different ways - especially considering second- and third-generation human rights and the issue of self-determination.
Finally, regarding data commons and their use by big tech companies, the solution encompasses ending the exclusion of the affected voices. In this context, there must be a shift in perception, recognizing these data as common knowledge that belongs to communities and people, and then including more people in a more democratic way.

The final panelist, Catalina Moreno, from Karisma Foundation, enriched the conversation with examples from Colombia. She reiterated that, although there is a belief that the Internet should be a free and decentralized environment, this does not happen in practice, since it is not available to everyone, nor does everyone participate in decision-making processes. In the Colombian case, she mentioned four cases that demonstrate exclusion from participation in Internet Governance: (a) the existence of a government surveillance system through which the police can intercept internet and telephone signals, in addition to allowing agencies to access internet traffic and capture communications; this tool has been used to monitor journalists and human rights defenders; (b) during the protests of the last few years, the internet connection was shut down, so that it was not possible for protesters to report human rights violations. In both cases, the internet was used to monitor and silence those reporting human rights violations, without the affected populations being heard or considered; (c) the digitization of various data and services during the Covid-19 pandemic, which did not cover the entire population: many people (such as indigenous communities) did not have access to a connection or command of Spanish and were not informed in time to participate in public civic activities, so decisions were made excluding such vulnerable groups; (d) finally, the case of civil society organizations that use the internet to disseminate their work, but do not necessarily know how these environments function and how their content may be disseminated.
The possible solutions presented were: (i) organizations must be aware of the challenges of digital environments; (ii) civil society must be able to help engage the affected populations and take these demands into discussions with the authorities in the creation of governance-related policies; (iii) research-based advocacy should be conducted considering local needs; (iv) civil society should promote and expand regional and international networks to amplify the voices of human rights defenders in digital spaces; (v) authorities should develop and implement models of effective participation that consider the views of different stakeholder groups, including academia, civil society, and the technical community; (vi) the Judiciary Branch should strengthen its knowledge of Internet Governance from a human rights perspective when approaching such questions.

After this stage, the floor was open for questions from the audience, both remotely and in person.

The audience engaged with the discussion in a meaningful manner. Questions about connecting everyone in the world, considering different areas and communities, were raised and addressed by recognizing their relevance while, at the same time, considering the need to respect the rights of those who do not want to be connected.

Furthermore, a reflection was made on the structure of governance, considering that some regions and some populations are perceived as consumers of technology, and not producers, keeping them away from the discussion on the real governance of the internet. In this sense, it was considered that it is necessary to democratize the construction of these technologies. The idea of democratizing infrastructure was also endorsed by the panelists.

In addition, a comment was made on how spaces for debate exclude people who do not conform to straight, white and cis standards. It was mentioned that a positive path would be to create multisectoral and intersectional spaces, but it should be observed whether such spaces are really representative and not just a way to “co-opt” an inclusive discourse without actually guaranteeing effective inclusion.

Finally, it is important to highlight the concern with gender issues in this panel, which was reflected in the panelists themselves (of the three guests, two were women from the Global South). In addition, two women were involved in the organization and elaboration of the panel (responsible for moderation and reporting). This presence fostered reflection on gender issues in governance, which encompasses the need not only to include women in relevant spaces, but also to think about truly feminist agendas in those areas, so that the corresponding demands are really addressed.

IGF 2022 WS #326 Platform Responsibilities for Journalist Digital Safety

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

Gendered online violence against journalists is a structural problem; combatting it needs to be an integral element of the internet governance discussion, and a more effective institutional response, including by social platforms, is urgently needed.

,

The UNESCO/ICFJ research "The Chilling" provides key insights into the impact of online violence as well as clear recommendations to different stakeholders on how to address it.

Calls to Action

All actors of the internet governance structure, including social media platforms, must take gendered online violence against journalists seriously as an attack on Freedom of Expression and put in place effective counter-measures.

Session Report

The session was co-organized by UNESCO and APC and brought together speakers with different areas of expertise on the topic of gendered online violence. UNESCO and APC have both implemented an extensive range of projects on the safety of women journalists and have cooperated on this issue, most recently by organizing a consultation process looking at how gender perspectives can be more strongly integrated into the implementation of the UN Plan of Action on the Safety of Journalists and the Issue of Impunity.

Julie Posetti from the ICFJ presented key statistics from a report jointly published with UNESCO. “The Chilling” highlights the severity and the impact of gendered online violence on women journalists and on freedom of expression more broadly. She specifically stressed the online to offline trajectory and pointed out that 20% of surveyed women journalists said that they had been attacked offline in connection with online violence. According to Posetti, “online violence aids and abets impunity for offline violence”.

Nompilo Simanje confirmed similar findings for the Southern African region, where perpetrators also target women journalists largely without consequence. Due to this situation of impunity, Simanje spoke of a “normalization of online violence”. She particularly emphasized two manifestations of online violence: doxing (the sharing of a victim’s personal information) and digital surveillance.

Guilherme Canela from UNESCO stressed that gendered online violence is a structural problem, calling for an institutional response. He argued that combatting gendered online violence should be considered an integral part of the internet governance discussion and called for an internet which is free, independent and pluralistic, but also safe for all of its users. He introduced recommendations published by UNESCO and ICFJ in 2022 as part of “The Chilling”, which provide actionable advice to different stakeholders on how to effectively address online violence against women journalists.

Building on this introduction of the recommendations, Julie Posetti provided further insights regarding the specific sets of recommendations directed at internet platforms and at political actors and States. In both cases, she emphasized the need to put in place mechanisms and structures that specifically stop actors perpetrating violence against women journalists.

In the following discussion, Nompilo Simanje emphasized the need for tech platforms to increase capacities that allow for an understanding of local languages and contexts. Julie Posetti raised the increasing issue of extraterritorial attacks against journalists and how violence perpetrated against them online radiates into offline spaces, even internationally.

Finally, UNESCO’s Guilherme Canela introduced a risk assessment framework currently being developed by UNESCO for digital platforms. This risk assessment framework can guide platforms on how to better minimize risks and harm for users, including by taking into account risks of gendered attacks and the proliferation of gendered disinformation. The risk assessment framework will be presented in February 2023 during the “Internet for Trust” conference on platform regulation organized by UNESCO.

The session concluded with a series of questions from the audience.

IGF 2022 Town Hall #91 The war in Ukraine and the disinformation war

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

The war in Ukraine has been a relevant stress test for the European fight against disinformation. The organisational framework put in place and recently augmented by the Digital Services Act, built on the coordination of all interested parties (Code of Conduct for the platforms, fact-checking, access to data, intervention of the regulators and validation of the results), has proven to be robust, even if progress is still needed.

,

It is useful to stress the importance of having a distributed system, not a vertically integrated one. Interaction and collaboration between the different organisations involved is fundamental, as is the transparency and neutrality of the actions and decisions of those organisations. The Commission, which has the power to inflict heavy fines, will only intervene when one of the actors does not respect agreed rules and procedures.

Calls to Action

Measures to counter disinformation need to be improved all around the world, in a multistakeholder form.

,

Independent regulatory bodies could play an essential role and are a good compromise between respect for Human Rights and the restrictions required by the fight against disinformation.

Session Report

Giacomo Mazzone introduced the Town Hall meeting, explaining that the title, “The war in Ukraine and the disinformation war”, was proposed by EDMO (the European Digital Media Observatory) and Eurovisioni. The objective of the session was to understand how the Internet can be used as a weapon, not on the battlefield, but in a battle to influence public opinion about the war in Ukraine, in the rest of Europe, and in the world.

Disinformation around Ukraine is also a testbed for the recent measures that the European Union has put into place to fight disinformation, such as the Code of practice and the European Digital Media Observatory (which were presented at the IGF in past meetings) and the newborn network of national observatories of EDMO. The present war is a laboratory of what could happen in a future cyber war.

Krisztina Stump, Head of the Unit in charge of the Commission’s policy to fight disinformation, presented the unique European approach. It is unique because it is based on a strong toolbox whose tools are fully rooted in freedom of speech, combining regulation and industry-led solutions (reflected in the Code) in the form of co-regulation, with the Digital Services Act backing up the Code. It is also rooted in a multistakeholder approach, demonstrated by EDMO, its national/regional hubs, and the diverse stakeholder community it is assembling.

Within the Code, did we find a single magic bullet to fight disinformation? This is not the case, as disinformation is a complex problem requiring complex solutions. The Code of Practice is therefore a toolbox with a variety of instruments that, taken together, can be efficient in fighting disinformation.

The key areas of the revised 2022 Code of Practice are demonetisation, transparent political advertising, reducing manipulative behaviour (including detecting fake accounts), user empowerment measures (including media literacy), fact-checking coverage throughout the EU with fair financial contributions, and data access for research.

The Code comes with strong transparency measures to allow users to consult how signatories of the Code implemented it. There is a Transparency Center, a Permanent Task-force chaired by the Commission which continues to work on the implementation of the Code, and a robust monitoring framework to make sure the commitments are properly implemented.

There are 35 signatories of the Code, which include major online platforms (Google, Meta, Tiktok, Twitter, Microsoft, etc.), but also associations and smaller and specialised platforms, the advertising industry, fact-checkers, research organisations and players offering technological solutions to fight disinformation. This is putting into practice the multi stake-holder approach.

In case signatories who are considered Very Large Online Platforms do not live up to their responsibility to mitigate the risks stemming from disinformation, the Digital Services Act offers regulatory intervention (this hard regulation entered into force in all EU countries on November 16th, 2022).

The war in Ukraine and the war propaganda surrounding it is a very specific situation. The Kremlin’s propaganda machine is part of hybrid warfare, and it is in that light that the EU adopted sanctions against certain Russian broadcasting channels. At the same time, the implementation of the Code of Practice by the signatories offers a variety of measures to fight disinformation around the war. The Commission is working with the signatories to make sure they live up to their commitments, notably to demonetize Ukraine-related disinformation, to increase fact-checking, and to apply all the other measures, such as giving users reliable information, labeling State-affiliated accounts, and taking measures against coordinated manipulative behaviour.

EDMO Secretary General Paula Gori presented the Ukraine war observatory, which since February has been regularly analyzing and reporting on disinformation campaigns across Europe, while Claire Wardle, EDMO expert for Media Literacy, explained that disinformation also needs to be tackled in the long term through regular digital and media literacy efforts and campaigns.

Two fact-checking organizations participating in the debunking activities of EDMO (Tommaso Canetta/Pagella Politica and Adam Maternik/Demagog.org) presented some cases of disinformation and toxic information propagated during the war, mainly by Russian sources but also, in a smaller percentage, by Ukrainian sources.

Francesco Sciacchitano from the Italian regulatory body added that the disinformation war has been a very tricky issue for independent regulatory authorities across the EU to address, because this effort lies on the thin edge that divides freedom of expression from hate speech, and propaganda from reliable reporting.

The panel session was followed by a short but intense question-and-answer session with the audience in the room and online, in which Russian, Ukrainian and Iranian participants intervened.

LINKS TO THE PRESENTATIONS SHOWN IN THE SESSION:

https://edmo.eu/wp-content/uploads/2022/09/Stump-IGF-EDMO-2022.pptx

https://edmo.eu/wp-content/uploads/2022/09/Canetta-IGF-EDMO-2022.pdf

https://edmo.eu/wp-content/uploads/2022/09/Maternik-IGF-EDMO-2022.pptx

 

IGF 2022 WS #403 Cross-Border Data Sharing for Public Safety

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

Efficient cross-border data sharing is essential for public safety, and so is striking a balance with the protection of our fundamental rights to privacy. Multi-stakeholder discussion and participation can improve understanding of this topic and of where current mechanisms fall short. Despite the complications, the cost of doing nothing on this issue will lead to further lawlessness.

Session Report

This roundtable discussion outlined the importance of the role of cross-border data sharing for public safety and highlighted the difficulties surrounding it. The moderator of the roundtable, Emily Taylor, CEO of Oxford Information Labs, introduced the significance of this topic, focusing on the fundamental premise of the Internet enabling the free flow of data and information across borders. With the mass uptake of the digital network, new patterns have emerged, including cross-border data sharing.

Marjorie Buchser, Executive Director of the Digital Society Initiative at Chatham House who is leading a project on cross-border sharing, set the scene at the beginning of the discussion as to why this is important for public safety. Traditionally, crime was analogue in nature and was investigated locally. In the digital age, an increasing portion of our activities include an online aspect. Most crime today has a digital dimension and therefore has a digital trace. Buchser highlighted the importance of the multi-stakeholder approach and in broadening the debate to improve inclusivity, understanding and trust. It is essential to have a multi-stakeholder discussion because there is a balance between the benefits we get from our public safety authorities having access to the data they need for criminal investigations,  and the importance of the protection of privacy and fundamental rights. This balance should not only be resolved by states and online platforms, it also requires civil engagement to develop trust and accountability.

The second speaker, Aisling Kelly, Assistant General Counsel for Law Enforcement and National Security at Microsoft, agreed on the importance of the multi-stakeholder approach. Kelly described the current mechanisms in place to address this issue and the shortfalls of this outdated system. The MLAT (Mutual Legal Assistance Treaty) was designed originally to protect sovereignty, stopping one country’s law enforcement agents from going into another country’s territory and interfering with sensitive matters. Kelly asserted that although sovereignty remains an important factor in this issue, the system is unfit for purpose in the modern world. Sovereignty, however, continues to arise in the debate, and Kelly illustrated this with an example from her own country, Ireland, where the Chinese government attempted to open a Chinese police station, which was abruptly prevented as it infringed on sovereignty.

Kelly, who has witnessed both sides of this debate in action, as she was previously a Public Prosecutor, also addressed the key tensions between the public authorities and the actors involved in these criminal investigations. These actors are not used to working together and can include police forces and online platforms or cloud providers which can lead to a gap in trust. In addition, there is often a barrier of understanding in terms of the type of language and terminology that we use to describe the issues. For example, the title of this roundtable describes cross-border data sharing, which law enforcement agents would call digital evidence sharing or “e-evidence”. The absence of a common language to describe the problem is also a barrier to effective cross-border data sharing for public safety.

Bertrand de La Chapelle, Executive Director and Co-Founder of the Internet and Jurisdiction Policy Network, highlighted the evolution of how we see data and evidence for the purpose of public safety investigations. Electronic evidence is needed not only for cybercrime, but for every type of criminal investigation, such as a theft or a murder with any type of digital trace that can be used as evidence. A few decades ago, these investigations required evidence in the form of documents or papers kept in a safe, or an inquiry that required a warrant to search a particular house. Today, most of these elements are available online or in digital form, and in most cases they are stored by large companies such as cloud providers or service providers such as email. Because of the distribution of international actors in the digital space, those actors are often located outside of the country where the investigation is taking place.

The discussion around the current mechanisms and how they fall short of solving the issues of cross-border data for public safety was given more context by the fourth speaker, Dr. Theodore Christakis, Professor of International and European Law. In Europe, more than 85% of criminal investigations rely on digital evidence, with over 55% of this data located beyond the borders of the country where the crime took place. Human behaviour and practices have changed drastically in the last few decades, and while there are signs of progress, existing frameworks do not yet adequately cover all aspects of international transfers. In addition, there remains tension around the implications for the integrity and protection of citizens’ privacy. Examples of attempts to solve these issues are the e-Evidence Regulation and the Second Additional Protocol to the Budapest Convention. Christakis described the progress of the e-Evidence Regulation and the lengthy process, as well as the debates around language that have delayed real progress. Discussions around the e-Evidence Regulation in the European Union started six years ago, and although there are hopes it will be adopted, it is still under discussion, highlighting the complexity and polarising nature of the debate.

Despite this complex landscape in cross-border data sharing, there is hope amongst the speakers for newer mechanisms to address not only the issue of sovereignty, but also the balance between the importance of accessing data for public safety purposes and our fundamental rights to privacy. The speakers concluded by outlining the complexities of cross-border data sharing for public safety and agreed on many of the problems and shortfalls of the current system. Despite the difficulties, the speakers determined that the cost of doing nothing would only lead to further lawlessness. These issues can affect real people’s efforts to get justice for horrendous crimes: amongst the technical and legal debates around language and definitions or processes and mechanisms, real people can be affected. It is still early in the technological explosion of the digital age, and much more can be done to effect real change that makes cross-border data sharing for public safety easier. One of the areas agreed upon by the panel is the significance of multi-stakeholder discussions such as the IGF roundtable, which can improve understanding and therefore outcomes. Further multi-stakeholder debates, alongside improved transparency, inclusivity and oversight, can help build trust.

IGF 2022 WS #219 Global AI Governance for Sustainable Development

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

Approaches to global AI governance should be based on transparent and inclusive multistakeholder processes to render them resilient against changing political interests, and they should acknowledge the different realities in the Global North and Global South with regard to digital inclusion.

,

There is a need for AI governance structures that actively promote sustainable development. AI governance should be focused on ethical foundations and safeguard human rights.

Calls to Action

Governance for AI must focus on the entire technology lifecycle from development to application to assure ethical AI and safeguarding human rights.

,

The voice of the Global South must be heard in AI governance approaches to significantly decrease digital inequalities between the Global North and the Global South.

Session Report

After opening remarks by the Brazilian Ministry of Science, Technology and Innovation (MCTI) and the German Federal Ministry for Digital and Transport (BMDV), the moderator Onike Shorunkeh Sawyerr (GIZ) introduced the audience questions to be answered during the opening statements of the panellists. The idea was to engage the online and on-site audience early to allow for integrating their responses into the discussion. The questions were:

Q1: What do you associate with the term “AI governance”?

Q2: In which area(s) do you see the greatest potential for AI to contribute to sustainable development?

Q3: Overall, do you expect AI to have rather positive or rather negative impacts on sustainable development?

Q4: What risks do you associate with AI?

 

Urvashi Aneja (Founding Director of Digital Futures Lab, India) described how her institution investigates AI benefits for different areas, including sustainable development. She also highlighted that people still think of AI as a product, a framing she considers too narrow. The Digital Futures Lab looks at the whole life cycle of AI interventions. For example, she argued that the labour conditions for building AI and the energy consumption of AI are also important aspects of the discussion. She concluded by stressing the need to improve understanding of the impact of AI on sustainable development.

As the second panellist, Ledénika Mackensie Méndez González (Executive Director for Digital Inclusion at the Secretariat for Communications and Transport of the Mexico City Metropolitan Area) said that AI is a way of innovating the public sector. It requires increasing availability of data as well as transparency. In her statement, she urged governments to assure that AI respects human rights. To accomplish that, ethical challenges must be solved.

Kim Dressendörfer (Technical Solution Leader Data & AI at IBM, Germany) started her statement by expressing that, as the technology evolves so quickly, developers should open the “black box” for everyone, teaching people how to use it properly. Dressendörfer stated that AI is an opportunity to create something new and better. AI governance has multiple layers and also involves the individual developers, who must consider the ethical implications of their products. In her presentation, Dressendörfer gave examples of the use of AI: monitoring animal health in agricultural applications, assisting astronauts on the ISS, and quantifying carbon sequestration in urban forests.

Secretary José Gontijo (Ministry of Science, Technology and Innovation (MCTI), Brazil) stated that Brazil has already made some progress in the field of AI governance by publishing a national AI strategy and starting a discussion on AI regulation within congress. He cited the thematic chambers for the AI strategy, which bring together government, private sector, academia, and civil society to discuss transparency and the applicability of AI. Gontijo also pointed to the ongoing debate between lawyers and technical groups about the regulation of AI and how the legislation should be applied. He highlighted that AI has great potential to boost sustainability. In Brazil, for example, AI may be used in water management, in improving efficiency in agribusiness, in disaster prediction, or in public security. Gontijo emphasised the need to reduce the gap between the Global South and the Global North regarding the development and usage of AI. He expressed that science diplomacy has an important role in reducing the existing inequalities, by keeping technology in the Global South apace with the Global North and making tech affordable and available for everyone.

Following Gontijo’s opening statement, panellists responded to the first round of questions. Asked about the building blocks of meaningful regulation of emerging technologies, Méndez González highlighted the importance of the exchange of experiences between states to enhance cooperation. Resource allocation and distribution to apply these emerging technologies in the countries is crucial to make these regulatory building blocks a reality. Méndez González expressed that a public policy for sustainable AI should be human-centric and intersectional. She highlighted that the inclusion of minorities and marginalised social groups in the AI ethics debate is crucial to reduce inequalities.

Next, Gontijo emphasised the importance of multistakeholder approaches in establishing governance structures nationally and internationally. He acknowledged that it is challenging to find consensus with diverse stakeholders at the table, but once agreement is reached, it provides a strong, broadly legitimated basis. He gave examples of policies related to the internet and new technologies in Brazil, such as the Brazilian Strategy for Artificial Intelligence (EBIA), e-Digital, the IoT plan (still in development) and the Brazilian internet bill of rights. These strategies, some of them implemented in Brazil since 2010, adopted the multistakeholder approach, which enabled them to survive subsequent political changes.

The moderator asked Dressendörfer about the role of the private sector in AI governance for sustainable development. Dressendörfer is convinced that every company has the duty to work towards sustainable AI governance. From that point of view, it is important to bring the ethical discussion into developing teams, so that everyone is aware of the potential positive as well as negative impact of their work. Companies need to make sure people can use AI and that the technology boosts sustainability. She highlighted the challenges that come with the “one-size-fits-all” regulatory approach, as AI is applied differently in many different sectors. Rather, Dressendörfer advocates for transparently describing the algorithms behind AI applications. This allows for meaningful discussions on ethics, human centricity, and governance for sustainable development.

Aneja stressed that there are multiple challenges (economic, social, political, and environmental) related to AI. Politicians sometimes do not have knowledge about the systems, so they rely on the private sector. This can lead to a biased approach. Especially in developing countries, it is harder to regulate the private sector’s influence on political frameworks. She emphasised that risks should already be reduced during the development of AI applications. Building people’s capacities is also crucial to making AI operate as humanely as possible. Aneja also warned that labour issues do not get enough attention. She ended by pointing to the challenge of building technology that is as green as possible while also dealing with ethical dilemmas. While this is not an easy task, reconciling these aspects is of utmost importance.

In the next part, the moderator presented the results of the audience survey. Regarding the first question, the audience associates ideas such as cybersecurity threats, fear, security and regulation with AI standards. Regarding the second question, the audience expects that AI will have rather positive effects on sustainable development. They see this potential mostly connected to global productivity and economic growth, followed by climate action and environmental protection (third question). The main fears of the audience related to AI were digital war, surveillance and misuse that leads to human rights violations (fourth question).

Dressendörfer reacted and argued that people often associate AI with dystopian movies – but she was glad to see that people have good expectations of AI’s potential, and do not expect AI to completely replace humans in labour markets. Rather, AI will take over repetitive tasks and free humans to focus on complex assignments. Aneja found it interesting that people associated the use of AI with sustainable growth when the economic system itself still needs to be aligned with sustainability. Spreading technologies to other parts of the world is important considering that the majority of the global population still does not even have access to the internet. The hypothesis that AI has a positive impact on sustainable development still has to be proven. Although many promises for the future are being discussed, harms are currently more evident, even in developed countries – for example, in the labour conditions of platform workers. According to Aneja, we need more scepticism when talking about AI, as there is still a huge gap between its potential and reality.

After that, Davis Adieno (Global Partnership for Sustainable Development Data, Kenya) was introduced to the panel. He stressed that AI is already a reality, but it is in the hands of the private sector and connects the technological avant-garde rather than the masses. For civil society, AI and technology seem rather detached from the real world. According to Adieno, our global society has other, more urgent problems, such as poverty and lack of resources. AI has the critical ingredients to be an enabler of sustainable development, but for AI to be a solution for the day-to-day needs of society, we need to consider its potential harms alongside its benefits.

After the speech made by Adieno, the floor was opened to the questions from the audience. The majority of questions revolved around the objectives, potential and risks connected to AI governance. Aneja as well as Dressendörfer agreed that the quality and the management of data must be improved as a precursor for more meaningful technological developments – as more data does not mean better data.

 

IGF 2022 WS #405 Splintering from the core up? Fragmentation and Standards

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

Standards-making bodies are the institutions in which engineers get together to solve technical problems. Policy makers are increasingly engaging at the level of standards organisations because adopted standards have policy implications. Different standards bodies, such as the ITU and the IETF, take different approaches to standards making, which can increase fragmentation when those approaches compete.

,

By improving outreach, engagement and integration, we can meet the different needs of each stakeholder – for example, by offering engagement sessions to the different groups to foster integration and understanding between the policy and technical communities.

Session Report

 

The importance of Internet standards in the drive to prevent fragmentation can be overlooked amongst the many areas of discussion on how the network of networks operates. This IGF roundtable discussion focused on the critical aspects of maintaining a free, open, and interoperable Internet and the role standards development can play in either facilitating fragmentation or preventing it. The roundtable was divided into two sections, with the first focusing on the challenges that can arise within standards governance and the multi-stakeholder model, as well as addressing proposals for standards development that may seek to transform the Internet’s building blocks and fragment the Internet, such as New IP. The second section highlighted how large divides between engineers and policy makers in the standards development process, or within Internet governance more broadly, can manifest as fragmentation.

The roundtable addressed barriers to entry in the standards development process, such as technical and complex terminology and the importance of understanding the different fora for standards development. The conversation began with an example of a proposal that may fragment the internet: New IP. Carolina Caeiro, Senior Policy and Governance specialist at Oxford Information Labs, discussed the New IP proposal and the process of tracking these sets of standards. The New IP proposal was premised on the claim that the Internet Protocol is unfit to meet the needs of future networks and emerging technologies and that a new protocol is therefore needed. In practice, however, New IP sought to introduce a series of changes at the architectural level of the Internet, in naming and addressing as well as in the network layer. This would transform the Internet’s way of networking, threatening its interoperability and therefore fragmenting the system, while also including ways of tracking Internet activity. In addition, the use cases for which the current protocol is supposedly insufficient are already addressed by existing standards.

To understand how the standards development process works and how it can contribute to fragmentation, the discussion continued with Tommy Jensen, Senior Technical Manager at Microsoft, and Carl Gahnberg, Director of Policy Development and Research at the Internet Society. There are different Standards Development Organizations (SDOs) with different drivers at the basis of their work. The Internet Engineering Task Force (IETF) is a predominantly engineering-based standards development body for Internet standards. This SDO works on the basis of consensus and has a high barrier to entry, with a specific, highly technical way of working. Other SDOs, like the International Telecommunication Union (ITU), can be more policy focused, with a geopolitical aspect highlighting different countries’ visions of the internet and future technology. These SDOs all play a part in the standards development process.

The roundtable unpicked some of the groundwork of why standards are needed to maintain an interoperable internet, and how standards are created to solve specific technical problems. Carl Gahnberg highlighted that while technology can have political consequences, we should not allow politics to guide technological decisions. This is why the different fora and their drivers are important to the standards-making process. Tommy Jensen agreed on the importance of first understanding the technological problem that needs to be solved, and then asking whether there is a need to evolve from a technological perspective. Returning to Caeiro’s explanation of the New IP proposal, Jensen argued that the internet does not need a new version of IP to evolve, as we already have IPv6, which is more than adequate.
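The scale argument behind “IPv6 is more than adequate” can be made concrete with Python’s standard `ipaddress` module. This is an illustration added for context, not something shown in the session:

```python
import ipaddress

# Total address counts of the IPv4 and IPv6 address spaces,
# computed from the all-encompassing prefixes 0.0.0.0/0 and ::/0.
v4_total = ipaddress.ip_network("0.0.0.0/0").num_addresses  # 2**32
v6_total = ipaddress.ip_network("::/0").num_addresses       # 2**128

print(f"IPv4 addresses: {v4_total:,}")
print(f"IPv6 has {v6_total // v4_total:,} times as many addresses as IPv4")
```

IPv6’s 128-bit address space is 2^96 times larger than IPv4’s, which is one reason panellists saw no architectural need for a replacement protocol on addressing grounds.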

Pablo Hinojosa, Strategic Engagement Director at APNIC, joined the conversation and challenged the notion that engineering problems should be solved with engineering or technological solutions alone, stating that politics is a factor. He declared that it is not possible to be apolitical in these decision-making processes and highlighted the importance of engineers and policy-makers speaking the same language. It matters that we share an idea of where we want the internet to be in the future; this is what determines where the gaps are, and therefore where the problems that need to be standardised lie. He agreed on the importance of having a diverse discussion and upholding the multi-stakeholder approach.

The discussion addressed the fundamental role that standards play in shaping the Internet of today and the future. Stacie Hoffman, Digital Standards Strategy Lead for the UK Government, described ongoing forms of fragmentation that might be leading us away from the open, interoperable Internet. Fragmentation can be both facilitated and prevented through standards development. Although standards are not the only vector of fragmentation, understanding their role in creating fragmentation, which can be detrimental to internet resilience, is critical. The key elements in preventing fragmentation include: a single domain name system (DNS); a core, unfractured Internet Protocol; and active, multi-stakeholder internet governance bodies such as ICANN and the IETF. Too many different and competing standards development bodies in the process can also lead to fragmentation in the technology itself.

The session concluded with recommendations on how to improve the standards development process and prevent fragmentation of the internet that could be detrimental to the network of networks. Early identification of issues, and discussion of those issues in the right bodies, is key for the industry-led, multi-stakeholder model of technical standards setting, as it can avoid problems down the road at the deployment stage. In the global standards-setting process we need global participation if we want these standards to work for everyone. This highlights the need to integrate the policy and technical debates, bringing the expertise much closer together. In practice, this can look like policy-maker and engineering engagement and outreach events, so that these relationships can build understanding of the different approaches and drivers of standards setting. By taking steps to be more inclusive and by reinforcing the mechanisms we currently have, we can ensure effective governance of the internet and of the institutions that guard against detrimental fragmentation. Diversity and inclusivity within the SDOs have historically been challenging, and the moderator, Emily Taylor, CEO of Oxford Information Labs, addressed the question of diversity and gender within these organisations as a final discussion point.

IGF 2022 DC-Sustainability Unbreaking the news: Media sustainability in the digital age

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

The digital sphere offers many new opportunities for digital media; however, issues such as monetisation of content, access to data, and platforms’ content and account moderation policies are threatening the sustainability of the information ecosystem. These issues are exacerbated in regions where English is not the main language of communication, as well as in those where exchange rates weaken the value of the local currency.

,

Governments and tech platforms need to take into account the challenges faced by journalists and media in the “Global South” when devising digital regulation. Greater transparency on data collection, content/account moderation, and revenue and monetisation of content are essential to understand the scope of the problem, devise creative solutions and empower a viable and sustainable journalism sector in a constantly evolving digital ecosystem.

Session Report

 

The nexus between long-term news sustainability and internet governance is undeniable and more important than ever. How are Internet policies affecting the ability of news organizations and journalists to sustain public interest journalism in the post-pandemic world? What can we expect from proposals to tax technology platforms to fund news media? How does the online advertising business model affect both Internet governance and digital journalism?

The session “Unbreaking the news: Media sustainability in the digital age” approached these questions and identified some of the challenges for the sustainability of journalism and media online, both looking at the trends and practices of digital media natives, regulatory frameworks and their potential impacts outside the borders they were conceived and the influence (and effects) of platforms’ internal policies on the sustainability of the media around the world.

 

Sustainability of independent digital native media organizations

SembraMedia’s report Inflection Point International was launched in 2021 with the goal of giving a better understanding of the challenges and opportunities faced by digital media. Mijal Iastrebner, co-founder and Executive Director of SembraMedia, affirmed that “diversification of revenue is key to sustainability, especially for digital business models”, since they are built in an ecosystem where the rules and users’ behavior change very fast. It is also essential for the media to have a strong strategy and a strong mission that can be carried from one platform to the other. This mission needs to be connected to a social purpose, as Iastrebner warned: “Reach can be momentary, impact is everlasting. Working around your community needs and challenges is and will always be the most effective development plan for media.”

Startups, especially in the Majority World or so-called “Global South” are usually the ones focusing on public interest journalism, reporting on minority or marginalized communities. Their social purpose is, in fact, especially relevant considering the impact these digital natives have as key actors in countering mis- and disinformation and improving their communities’ access to trusted and quality information. Essential to the sustainability of these journalists and digital media organizations is the regulatory framework or policies that need to be designed for them to grow and thrive in the digital information ecosystem.

 

Rebalancing the digital ecosystem: Regulatory Frameworks and their impact beyond borders

News organizations are dependent on platforms to reach their audiences and monetise their content which, especially for digital natives, is essential to ensure their existence and economic sustainability. 

In her study “Making Big Tech Pay for the News They Use” Dr Courtney Radsch, journalist and scholar at UCLA Institute for Technology, Law & Policy, and co-coordinator of the DC-Sustainability, explores three policy interventions to rebalance the relationship between digital platforms and media: taxation, competition policy and intellectual property interventions. Examples presented were the Australian News Media Bargaining Code which gave news media the right to bargain and license their content to tech platforms or EU’s Directive on Copyright in the Digital Single Market, which provides ancillary rights and frameworks for news outlets to negotiate with platforms at an individual or collective level.

The ethical dimensions of the money, and how news organizations around the world would use it, also emerged as a topic among the participants. While some were wary of taking money derived from platforms and their collection and commodification of personal data, Anya Schiffrin, director of the Technology, Media, and Communications specialization at Columbia University’s School of International and Public Affairs, argued that all money comes with risks. While doing research for the report “Saving Journalism”, Schiffrin noticed that journalists in some regions were wary of accepting revenues that came from their governments. Iastrebner explained that in 2021 the leading source of revenue for digital native media continued to be grant funding, followed closely by ad revenue. While grant money abruptly leaving the sector is likely to have a negative impact on the media sector, Iastrebner affirmed that the problem is not over-dependence on grants, but over-dependence on just one revenue source.

Advertising, government and even private foundation money can all pose ethical questions and compromise the independence of outlets if safeguards are not put in place. But when it comes to tech companies, the reality is that many of them try to minimize their tax burden, if not avoid it entirely.

On the other hand, small, alternative non-English speaking news outlets in the Global South are particularly struggling in the platform era because of constraints in a playing field created by dominant platforms in the Global North. Since governments in many countries around the world might not have the capacity to influence or rebalance this playing field, political powers with that capacity, like the US or the EU, should think outside their boundaries when developing meaningful policy to govern tech platforms. And, in any case, as Schiffrin mentioned, policies can help quality information and journalism no matter the situation of the country, as long as the regional context is taken into account. 

 

The missing pieces

In order to empower journalists, media organizations and other public interest content creators worldwide, participants were nearly unanimous: transparency is essential. The question of what kind of transparency is needed, and its absence in the digital advertising, publishing and content moderation domains, was central to the panel discussion – especially transparency regarding algorithms, advertising, revenue and content moderation. Relatedly, the question arose of how “transparency mandates” could address this gap, particularly since the Digital Services Act in the EU addressed some aspects but focused primarily on content moderation. The Australian approach, as illustrated by Dr Radsch, included an “algorithmic transparency” requirement for tech platforms to share advance notice with news organizations, which is a critical development since major policy shifts have an outsized impact on news media sustainability. There is also a lack of transparency from media organizations and tech companies around licensing deals.

Transparency is crucial to know how to rebalance the digital advertising system. Google and Facebook control 90% of the ads market and own its complicated infrastructure. This raises the question of “antitrust” laws and how to give news organizations the possibility to earn from the ads going to the platforms. In Europe the focus has been on copyright policy and what critics have called the so-called “link tax”, which would give news organizations some of the income from platform ads. Tech platforms’ advertising policies also create challenges for journalists and other content creators. An example, raised by one participant, was sex education content being rejected because it was flagged as pornographic. Others were simply asking for basic data: data that media outlets could use to try to understand and connect with their audiences, develop feedback loops and monetize that relationship, which is also essential for media business sustainability.

Content and account moderation issues were also raised, especially regarding their inconsistency, particularly for content in low-resourced digital languages, which impacts the ability of media organizations to serve their audiences and, by extension, the public good. Issues regarding the transparency of advertising, algorithms, and content and account moderation practices highlight the outsized influence that tech companies and their internal policies have on the sustainability of news media, as noted in the discussion.

 

Conclusions

The digital space offers many new opportunities for alternative and digital media. However, to fully achieve the Internet’s potential as a channel to disseminate public interest information, facilitate public service and community journalism, and enable community building, the impacts of technology policies and practices on journalists and media organizations need to be centered in more internet governance and data governance discussions. The lack of consistent policies by both tech companies and governments, and the lack of channels to effectively address issues such as content monetisation, access to data or platforms’ content and account moderation policies, are threatening the sustainability of information ecosystems worldwide, with an outsized impact on non-English or smaller “markets”. Creators of public interest content often lack the capacity to bargain for the appropriate remuneration of their products or to contest unjustified content moderation decisions that lead to account suspensions and takedowns of legitimate public interest information.

Participants highlighted the importance of building networks and communities to exchange information, knowledge and data, provide support and do joint advocacy. As one of the participants stressed, “network building is essential for the sustainability of media organizations: a global network is a place to share experiences and skills.” Networks have power: going back to the example of sexual education content, a big campaign succeeded in pressuring the tech company to update its advertising policy. Moreover, these types of communities are especially relevant for thinking globally and going beyond boundaries: when experiences from people in different places are connected, more data is collected and stronger arguments are put together. And the more stakeholders are involved in these communities, the more lines of communication are opened to identify opportunities to collaborate, understand the issues and try to create remedies. The DC-Sustainability was created with this purpose:

“When we started [this Dynamic Coalition] it was difficult to convince people why a Dynamic Coalition on journalism and news media sustainability belonged at the Internet Governance Forum and trying to explain why Internet Governance fundamentally shapes the sustainability of news. [...] We've always realized that the way we govern these platforms and the Internet more broadly has a fundamental impact on news and journalism, which is a public good. And which is fundamental to democratic governance and to accountable governance.” Courtney Radsch, co-coordinator of the DC-Sustainability.

As Radsch stressed: humanity needs reporting, it needs the accountability that independent media bring. The digital age has shown that information barely has boundaries, and while policy making needs to take into account regional particularities and needs, a fragmented approach would weaken the goal to ensure a sustainable information ecosystem worldwide.  

 

DC-Sustainability

https://www.intgovforum.org/en/content/dynamic-coalition-on-the-sustainability-of-journalism-and-news-media-dc-sustainability

Join our mailing list: https://groups.io/g/dc-sustainability

 

Resources

Radsch, C. (July, 2022) Making Big Tech Pay for the News They Use. Washington DC: Center for International Media Assistance. Available at: https://www.cima.ned.org/publication/making-big-tech-pay-for-the-news-they-use/

SembraMedia (November, 2021) Inflection Point International: A study of the impact, innovation, threats, and sustainability of digital media entrepreneurs in Latin America, Southeast Asia, and Africa. Available at: https://data2021.sembramedia.org/reportes/executive-summary/

Schiffrin, A.; Clifford, H.; Adjin-Tettey, T. D. (January, 2022) Saving Journalism 2: Global Strategies and a Look at Investigative Journalism. Washington, DC: Konrad Adenauer Stiftung. Available at: https://www.kas.de/documents/283221/283270/Saving+Journalism+2+-+Global+Strategies+and+a+Look+at+Investigative+Journalism.pdf/a8ec2655-5636-8d69-00e5-e698e76c3845?version=1.1&t=1643317826159

Schiffrin, A. (August, 2022) Australia’s news media bargaining code pries $140 million from Google and Facebook. Poynter. Available at: https://www.poynter.org/business-work/2022/australias-news-media-bargaining-code-pries-140-million-from-google-and-facebook

IGF 2022 Town Hall #43 EuroDIG Messages - Internet in troubled times

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

• The European vision of a rights-based, open, accessible, resilient Internet has to be upheld even in times of conflict and crisis in the region and beyond. A strong multistakeholder approach to digital cooperation and Internet governance is the only way to avoid and counter fragmentation, exclusionary processes and harmful effects of existing and new technologies. Open multistakeholder processes need to include youth and marginalized voices.

,

• Sustainability should be at the core of digital cooperation; this includes environmental sustainability. This perspective is becoming more pertinent as questions of energy insecurity and resource effectiveness in regard to the Internet are becoming a lived reality for many users, as well as for governments and the private sector. In the current consultation on the Global Digital Compact, environmental sustainability shall be suggested as a key area.

Session Report

Long Report Town Hall: EuroDIG Messages - Internet in Troubled Times

 

Participants onsite: around 30

Online: around 15

Gender split: approximately 60% men, 40% women

 

The session was opened by explaining how the EuroDIG Messages 2022 were drafted and agreed upon. For the first time, the new mode of Focus Areas was used.

 

The Focus Area surrounding the topic of digital sovereignty was presented first. The main take-aways were that new regulation has brought clarity, but always has to be carefully evaluated against potential harms, such as Internet fragmentation – a topic heavily discussed at the IGF 2022 – and harms to human rights and democracy.

In the discussion it was pointed out that a main challenge for Europe at the moment is the war on Ukraine. Peace and sovereignty are not in opposition, but the term is sometimes used to describe a closed and territorial approach.

It was mentioned that a European vision of sovereignty has to foster openness and interconnectedness.

In the Focus Area on effective regulation, the importance of the multi-stakeholder approach was championed. Examples where regulation can play a key role in the near future are the green transition and rapid cybersecurity standards/criminal justice.

In the discussion it was commented that the environmental aspects of the digital transformation were a topic the EuroDIG community has worked on very actively, one also picked up by the IGF community and a subsequent intersessional policy network. It was recommended that EuroDIG continue to be a strong voice. One suggestion for highlighting the topic was to make a remark in the European stakeholder consultation on the Global Digital Compact.

It was noted that due to the energy crisis in Europe, the topic might naturally gain traction again. It was also suggested that regional and global workstreams, especially intersessional work, should be streamlined and integrated, not parallelized.

The Focus Area on effectiveness of governance bodies was presented. Some main points were the call to take a fresh look at the multi-stakeholder approach. Youth should be included more readily. Discussions and regulatory processes around artificial intelligence, digital identities and innovation such as delay-tolerant networks were noted as examples for updated approaches to policy-making.

In the discussion, it was pointed out that the human-centric approach to digital policy that Europe portrays is not the same in all world regions and it is a continuous effort to align values, while not hegemonizing different approaches.

The Focus Area of Internet in troubled times came about in the wake of the war on Ukraine. The main messages were presented. Again, Internet fragmentation has to be estimated as a risk, interconnectedness and openness should be championed by the UN Tech Envoy and by all stakeholders in Global Digital Compact. Broader engagement and open Internet governance processes are to be centered. Another aspect is the integrity of information, which has to be protected and disinformation has to be countered.

On the topic of disinformation, it was noted in the discussion, that in a pluralistic society, a diversity of opinions has to be protected, while fostering the accessibility and reach of neutral, fact-based information.

The youth messages were presented and comprised perspectives by YOUthDIG participants on AI, social media, cryptocurrency, and sustainability. Education and literacy were highlighted as an important precondition for all of these aspects. The messages also called for more research and funding to promote a safe, sustainable, innovative digital sphere.

The youth representative thanked the EuroDIG community that the current dark times due to the war are acknowledged. Access to the Internet and digital infrastructures in Ukraine is hampered by energy shortages and attacks on infrastructure, depriving the people of many important, sometimes life-saving, services and technologies.

In the discussion the importance of youth voices was complimented. EuroDIG commits to continuously involving youth in its processes.

In the next thematic section, EuroDIG 2023 in Tampere was presented, specifically the overarching theme “Internet in troubled times – risk, resilience, and hope”. The community was invited to participate in the conference on June 19-21, as well as to contribute to the program. A Finnish member of the European Parliament extended the invitation, pointing out the rich history of the city of Tampere in science and technology.

In the last thematic segment, EuroDIG’s process regarding the Tech Envoy’s survey for the Global Digital Compact was outlined. Messages and outcomes of past EuroDIGs were mapped, with the result that almost all topics of the GDC will receive input. The commenting platform for the European stakeholder consultation is still open and all are invited to contribute. Strong multi-stakeholder engagement in the process is important given the high-level nature of the compact.

IGF 2022 Open Forum #58 Promoting Internet standards to increase safety and security

Updated:
Enabling Safety, Security and Accountability
Session Report

Report Internet.nl workshop
 
30 November 2022, Caucus room 11
 
This Open Forum focused on the need for modern Internet standards to be adopted in a faster and more scalable way in order to make the Internet and its users more secure and safer. It took the form of a tutorial centred on a testing tool that helps one check whether a website, email and Internet connection are up to date, i.e. comply with modern Internet standards such as IPv6, DNSSEC, HTTPS, STARTTLS, DANE, DMARC, DKIM, SPF, and RPKI.
 
The Dutch Ministry of Economic Affairs and Climate Policy explained the origin of the Internet.nl tool it created in 2015. It is a multistakeholder initiative intended to create awareness of Internet standards deployment and safety. Any organization can check whether security measures, i.e. deployed Internet standards, are in place for its own domain name. You can check the level of security of your domain name here: www.internet.nl. Within seconds the level of security is shown to you, including advice on next steps.
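The flavour of the per-standard checks such a tool performs can be illustrated with a much-simplified, hypothetical sketch: parsing a domain's DMARC TXT record and giving pass/advice feedback. Real checkers like Internet.nl query DNS over the network and test many more standards; here the record is passed in as a string for illustration only:

```python
def parse_dmarc(txt: str) -> dict:
    """Parse a DMARC TXT record (e.g. 'v=DMARC1; p=reject; rua=...')
    into a tag/value dictionary."""
    tags = {}
    for part in txt.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

def dmarc_advice(txt: str) -> str:
    """Return a rough verdict in the pass/advice style of
    standards-checking tools (illustrative logic only)."""
    tags = parse_dmarc(txt)
    if tags.get("v") != "DMARC1":
        return "fail: not a DMARC record"
    policy = tags.get("p", "none")
    if policy in ("reject", "quarantine"):
        return f"pass: enforcing policy '{policy}'"
    return "advice: policy 'none' only monitors; consider 'quarantine' or 'reject'"

if __name__ == "__main__":
    print(dmarc_advice("v=DMARC1; p=reject; rua=mailto:dmarc@example.org"))
    print(dmarc_advice("v=DMARC1; p=none"))
```

The real tool aggregates dozens of such checks (DNSSEC validation, TLS configuration, SPF/DKIM alignment, RPKI route origin validation) into a single percentage score with remediation advice.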
 
The software behind Internet.nl is open source and can be used by other organisations willing to run a local version in their respective countries; the information is available on GitHub. Three other countries have adopted the tool: Australia, Brazil and Denmark. The first two presented their experiences in adapting the tool to their local environments.
 
What stood out from the presentations is that local customs and perceptions of standards determine how the tool can be used and presented. These differences did not, however, stand in the way of building and launching a local version of the tool.
 
In the first presentation, Gerben Klein Baltink of Platform Internetstandaarden (Dutch Internet Standards Platform) stressed the importance of Internet.nl being a public-private initiative. All participants cooperate without commercial intent, joined by the intention to create a more secure Internet that is open, transparent and safe. He demonstrated the tool to the audience and pointed to its hall of fame: all organisations scoring 100% can apply for “membership”. (The local IGF connection scored 10%.)
 
Bart Hogeveen of the Australian Strategic Policy Institute (ASPI) presented .auCheck. While it is technically a full copy of Internet.nl, the organization behind it is not: it proved harder to create a public-private initiative, and the current result was four years in the making. Research had shown that Australia is not yet in a position where the need for the deployment of Internet standards is broadly understood and accepted; a lot of education and awareness raising remains to be done. The tool was launched only recently, so it is hard to show any effects at this point in time. The tested outcomes, however, show the need for more awareness: deployment percentages are on average (too) low.
 
Gilberto Zorello of NIC.br presented TOP, Teste os Padrões (Test the Standards), a programme launched in December 2021 as a collaboration between the NIC.br environment and external experts. Tests show that average scores are below 25% for those who have tested against all standards. TOP is promoted at technical events in government and academia and works closely with ISP associations. Although it is still rather early to truly measure effects, TOP already sees organisations coming back with better scores.
 
Maarten Botterman presented, on behalf of the Global Forum on Cyber Expertise (GFCE), its Triple-I initiative (Internet Infrastructure Initiative): “This GFCE initiative is meant to facilitate awareness raising and capacity building events in different regions of the world in order to enhance justified trust in the use of the Internet and/or email in those regions. Local and regional actors are stimulated and supported in setting up and running local/regional events between regional stakeholders, bringing in local expertise.” If you need help, reach out to the GFCE; it has all the toolkits and information you need. (See https://thegfce.org/ for more information.)
 
Moderator Daniel Nanghaka added that this initiative started in 2017 by way of a campaign, after which some CERTs started to work together. In 2023 the trusted Africa Internet Initiative will start. It is expected that through cooperation with the GFCE all regions will be reached.
 
Gerben Klein Baltink pointed out that in the Netherlands the results are measurable: there has been a clear uptake over the past eight years, and naming and faming has an effect. Around the world, however, uptake is still far too limited. The world has to step up to make itself more secure and safer. He made a call to action: “Modern Internet standards are essential for an open, secure and resilient Internet that enables social progress and economic growth. These standards are readily available, but their use needs to rise significantly to be fully effective. The UN is called upon to help accelerate the global uptake of key standards, by including their promotion in the Global Digital Compact, and supporting advocacy and capacity building, as well as initiatives to test and monitor deployment, especially where many people aren’t connected yet.”
 
From the room, Mark Carvell pointed to the work of the IGF Dynamic Coalition on Internet Standards, Security and Safety (IS3C), which is developing recommendations and toolkits aimed at faster, large-scale deployment of security-related Internet standards and ICT best practices.

 

IGF 2022 WS #354 Affective Computing: The Governance challenges

Updated:
Addressing Advanced Technologies, including AI
Session Report

IGF 2022 WS #354 Affective Computing: The Governance Challenges

Tuesday, 29th November 2022 (12:05 UTC) - Tuesday, 29th November, 2022 (13:05 UTC)

Speakers: Dr. Diogo Cortiz, Dr. Lisa Feldman Barrett, Dr. Javier Hernandez, Dr. Jessica Szczuka, Mrs. Marina Meira
Moderator: Dr. Henrique Xavier

Rapporteur: Mrs. Pollyanna Rigon Valente

The moderator opened the session by introducing the theme of the discussion: how computers can be used to interpret and simulate human emotions, and the potential, issues and other challenges this brings.

Dr. Diogo Cortiz, researcher at the Web Technology Study Center (Ceweb.br), a center of the Brazilian Network Information Center (NIC.br), and professor at the Pontifical Catholic University of São Paulo (PUC-SP), opened with some concepts of Affective Computing (AC) and how it fits into the IGF agenda. Affective Computing is not a specific technology but an area of knowledge: different types of applications can be developed to recognize, detect, simulate, and organize data about human emotions. Dr. Cortiz stated that AC is close to AI when the discussion turns to governance and regulation, because neither is a specific technology; both are broad areas of knowledge that can involve different types of applications. He also made an important note: affective computing does not always use AI (it could rely on self-report technology, for example), but the most important current cases are based on AI models for emotion recognition. Dr. Cortiz ended his initial talk by presenting two sensitive challenges that need to be addressed:

  • Using affective computing with AI, how is it possible to be sure that an AI application is right? When inferring about subjectivity, AC may be wrong yet make us believe it is right.
  • Global models: in most cases we use models trained on data from users in the Global North, but those models will have impact and be used in other regions and cultures around the world. How can we ensure they will work? What are the risks?

Dr. Lisa Feldman Barrett, professor at Northeastern University, spoke about one specific aspect of affective computing: automated emotion recognition. Using images from her research, she showed how wrong AI can be in recognizing human emotions. With further examples of facial expressions and emotion, Dr. Barrett argued that it is important to remember that facial movements are only expressions and are not necessarily related to an internal emotional state. That is the challenge for affective computing and for AI models that use facial expressions to detect emotions. If we really want to use this technology to our benefit, affective computing must measure many signals, not just one, two or three. Dr. Barrett ended her initial talk arguing that for emotional AI to be successful, the entire ensemble must be measured across different situations, different people and different cultures.

Dr. Javier Hernandez, researcher at Microsoft, highlighted the need for discussion across multiple disciplines, because many of us are both excited and worried about the potential applications of this technology. Adding context to what Dr. Cortiz had shared, he noted that research on affective computing started around 1995 as the study and development of systems and devices that can recognize, interpret, and simulate human affects. Speaking about his role at Microsoft, he explained that his team works across several categories, including comfortable sensing: capturing information from users with wearable devices, doing extensive work on AI to better understand what emotional states really mean and how they can be sensed in different settings, and using that information to create affective interactions and experiences that help users achieve certain goals. One of the core mission statements, he said, is improving emotion regulation and helping users become better at managing their own emotions. Around 2015, affective computing took hold as an emerging technology; while this opened good research opportunities, companies also began to see it as a commercial opportunity.

  • Challenges: the theory of human emotions is evolving; human emotions are difficult to describe and label; there is a lack of representative and generalizable data; oversimplified language is used to communicate system capabilities; and there is a blurred boundary between what should be private and public.
  • How to minimize these challenges: communication, consent, calibration, and contingency.

Dr. Jessica Szczuka presented an aspect of affective computing that many in the audience had probably not thought much about, intimacy and sexuality, inviting us to explore how important emotion can be. There are three different ways we can come to intimacy and sexuality with technologies: through, via and with. The last may sound futuristic or like science fiction (an actual intimate or sexualized interaction with the technology itself), but we are not that far away. One challenge she presented is how affective computing relates to short-term versus long-term interactions. Presenting part of a research model, she highlighted the element most relevant to that question: sexual arousal, which shifts attention and cognitive resources away from reflection and toward fixation on sexual fulfillment, so that in that specific moment a user lacks the capacity to reflect that the machine may not be understanding their emotions correctly. Dr. Szczuka also presented research showing that recurring interactions that evolve in their dynamics are key to what makes an artificial entity, such as a chatbot, comparable to our daily contacts; this is very hard to implement, and we really need to make sure that companies using this technology are aware of the potential consequences. Examples gave more context on those consequences: affective computing can be used to nudge users into adopting a specific technology, since we have a need to rely on others and use our emotions for this; and when people interact in an emotionally intense state, which affective computing may misread, this will also generate a great deal of very sensitive data.
To minimize these challenges, we should stay technology-positive, providing platforms for satisfying needs for intimacy and sexuality, while being responsible by anticipating and addressing possible consequences and vulnerabilities.

Mrs. Marina Meira spoke about the regulation of AI in general, within which emotional AI is situated. The first question is why regulate AI and technology at all: regulation can support the development of technologies while protecting people's rights, individually and collectively. Regulating technology is a big challenge, especially for AI, because there are few regulations around the world, so regulators are in general learning while the technologies evolve. When these technologies first took off, principles and ethical guidelines began to be developed around the world, but they had no binding effect, which created many challenges in getting them followed. Those guidelines mostly concerned transparency, explainability, safety, security, fairness, non-discrimination, responsibility, privacy, and human control over technology, yet they were not followed by companies: firms would, for example, establish ethics councils or appoint specialists in AI ethics without changing their practices. This scenario showed the great challenge of moving to hard regulation, meaning laws that can be enforced, with sanctions if they are not followed, and that translate into very specific measures. Today a similar scenario is visible: several laws are being discussed, and most of these regulations follow a risk-based approach, meaning that the more risk to human rights a technology presents to those affected by it, the more obligations fall on those developing it.
There are risks in this risk-based approach if the risks are not properly assessed beforehand: a very important instrument in regulation is the impact assessment, which must be conducted under a strong and scientifically solid methodology to understand the actual risks a technology can present and to devise ways to mitigate them. She also highlighted that these risks should be assessed with broad participation from society. Despite all these challenges, Mrs. Meira concluded her presentation by affirming that regulation is possible and positive: we can achieve a better society with regulation as well as with technology, but we first need to consider the most vulnerable groups and how emotional computing affects them.

 

IGF 2022 Open Forum #77 Implementing the AU Data Policy Framework

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

The overall objective of the Africa Data Policy Framework is to raise awareness of data and its growing importance as a strategic asset for Africa's economy and society, and to lay the foundations for the development of coherent, harmonized and integrated data governance systems that facilitate data access and cross-border data flows.

,

To build a shared data ecosystem across the continent, close cooperation of regional and national stakeholders is necessary in order to align existing strategies and policies with the AU Framework and allow a free flow of data across and within countries. Global nexuses and cooperation also need to be considered to ensure that African perspectives are represented at the international level as well.

Calls to Action

In order to harness the opportunities of the digital economy, it is necessary to harmonize data governance systems across Africa and thereby enable a single data market, which will allow for increased data value creation in both the private and public sectors.

,

To enable data sharing among countries and sectors, there is a need to develop data sharing and data categorisation frameworks that take into account the different types of data and their associated levels of security and privacy.

Session Report

UN IGF Session Report

 

Session/event: Open Forum 77: Implementing the Data Policy Framework

Date: 1 December

Time: 10:45 – 11:45

Moderator: Souhila Amazouz

Reported by: Pierrinne Leukes (GIZ)

 

Name of panelists:

Dr Alison Gillwald                   Executive Director of Research ICT Africa

Mrs Aretha Mare                      Project Manager in charge of Data Governance at Smart Africa

Mr. Guichard TSANGOU         Director of Postal, Telecommunication and Digital Economy ECCAS 

Mrs. Stella Alibateese               National Personal Data Protection Director: Uganda

Mr Torbjorn Fredriksson           Head, E-commerce and Digital Economy Branch at UNCTAD

 

The purpose of this panel, as introduced by Mrs Souhila Amazouz (moderator) as the representative of the African Union Commission (AUC), is to raise awareness of the African Union Data Policy Framework (DPF) and discuss the readiness of Africa as a continent when it comes to data usage, data governance, data ownership and how that will support the development of digital economy in Africa. The DPF is the continent’s strategic framework for data governance and aims to set the priorities, vision and principles with regards to data in order to harness its transformative potential. It also aims to empower African countries and citizens whilst safeguarding their rights, to achieve equitable and equal opportunities for all African citizens in the digital space. The objective of the DPF therefore is to provide guidance to African countries in developing comprehensive, coherent and harmonized data systems across the continent which will enable the efficient use of data and enable data to flow across countries in support of digital trade and also data driven businesses. Now in its second phase, the DPF is supported by an Implementation Plan which has been validated by member states, inclusive of a Self-Assessment Capacity Tool to help countries gauge their various levels of readiness as well as identify the support they need for successful domestication.

Mrs Stella Alibateese, as Director of the National Personal Data Protection Authority in Uganda, underscored that domestication of the Framework requires of member states to make sure that they provide for the DPF recommendations within their own policies. Once policy development is completed and the necessary legislative processes have been finalized, it will become easier for the implementing ministry to cascade it to the other ministries considering that this policy framework requires a lot of collaboration across different sectors. What follows then is a review of standards to enable interoperability and an assessment of the relevant infrastructure required. Ensuring that those needs are included in the national development plans is another imperative step to ensure that the necessary resources are made available to support implementation.

Approaching the discussion from a more regional perspective, Mr Tsangou, as Director at the Economic Community of Central African States (ECCAS), emphasized that the Regional Economic Communities (RECs) are the building blocks of the African Union. In this vein, they play a prominent role in actualizing the goals of the DPF, and his recommendations included operationalizing regional Internet Exchange Points, building regional data centre capacities and ensuring that Model Laws are aligned with the continental framework while taking into account regional specificities and needs. Key to achieving these objectives is addressing the obstacles currently prohibiting cross-border data flows between member states. According to Mrs Aretha Mare, Data Governance Project Manager, Smart Africa has conducted extensive research revealing that barriers to data flows can be broadly encapsulated by three main challenges: lack of trust, lack of infrastructure and lack of technical capacity. Foundational institutions such as Data Protection Authorities are further hamstrung by the lack of financial resources needed to build and ensure the required enforcement capabilities.

The principle of harmonization, a central tenet of the DPF and its Implementation Plan, will be a key driver in addressing the abovementioned challenges. Dr Alison Gillwald explained that harmonization is essential for enabling and harnessing the benefits of the data economy. It creates the economies of scale and scope needed to ensure equitable participation in the global data economy and in so doing helps guard against uneven development. The DPF makes a principled commitment to the realisation of a Digital Single Market, the integrated trade environment that we are going to see take shape with the African Continental Free Trade Area, while also creating a rights-preserving environment for users from the continent. By adopting an approach informed by progressive realization, low-hanging fruit such as setting standards for integrated national data systems in order to unlock the public value of data can ensure that Africans share in the benefit of the data they are producing. For too long Africa has been home to data subjects who are excluded from these markets, and it is really the commitment to harmonization that will allow stakeholders from Africa to create this enabling and trusted environment.

This scale is also needed for Africa to assert its place in the global data economy. Mr Torbjorn Fredriksson, who leads UNCTAD’s E-commerce and Digital Economy Branch, highlighted that data can help address many of the world's and Africa's major development challenges, such as green transitions, food insecurity, pandemic preparedness, and more transparent and accountable governance. It holds the potential to transform research and development and to improve the quality of decision-making at all levels. However, should data be mishandled, the growing reliance on data may result in ever greater inequalities. Additional risks are the continued fragmentation of the global landscape of data governance, which will in turn exacerbate rising tensions among major governance actors such as China, the US and the EU, as well as increased fragmentation of the Internet triggered by growing use of data localization requirements: an attempt to protect data inside the country that in fact reduces the opportunities for internalizing the benefits of that data. These factors inform the call for a balanced global approach to data governance to help secure inclusive development gains. Reaching agreements on definitions and taxonomies for establishing terms of access to different types of data, dealing with data as a public good, exploring new forms of data governance and agreeing on principles as well as standards all require the active participation of African member states to ensure equitable benefits.

Questions from the floor related to data sharing agreements and mechanisms, increased civil society participation and the inclusion of local languages to give expression to governance in various settings – are all indicative of the desires of various actors to take up their roles in the shaping of global data flows. This is further evidence of heightened awareness of the strategic value of data and an enthusiasm to ensure that the digital dividends of data are shared in by all.

IGF 2022 Open Forum #89 Enabling a just data-driven African digital single market

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

The achievement of a Digital Single Market in Africa by 2030, as envisioned by the AU Digital Transformation Strategy, is a bold and long-term vision which requires the commitment of all stakeholders to bring it to fruition. This entails collaboration among multiple actors with different levels of knowledge, interests, readiness and capacity.

,

Harmonization of legal and regulatory frameworks across the continent remains imperative in order to materialize the benefits of the digital economy in Africa. Creating a conducive digital ecosystem, namely digital connectivity, digital platforms, and interoperable online payment and digital ID systems, is a precondition to foster intra-African digital trade.

Calls to Action

As part of the second phase of AfCFTA negotiations on digital trade protocol, Member states are called upon to come up with agreements that consider cross-border data regulations, innovation, privacy as well as cyberspace security issues.

Session Report

UN IGF Session Report

Session/event: Open Forum 89: Enabling a Just and Data-Driven Single Market

Date: 30 November

Time: 09:30 – 11:00

Moderator: Souhila Amazouz

Reported by: Pierrinne Leukes (GIZ)

Panellists:

Dr. Ify Ogo:                            Regional Coordination Specialist for the African Continental Free Trade Area (AfCFTA) at United Nations Development Programme (UNDP)

Mr. Jean-Paul Adam               Director, Technology, Climate Change, and Natural Resource

Mr. John Omo                         Secretary General of African Telecommunications Union

Mr. Kenneth Muhangi              Lecturer in Intellectual Property; Partner, KTA Advocates; World Economic Forum 4IR Committee Member; Chair, Technology, Media and Telecoms Committee, East Africa Law Society

Eng. Daniel Murenzi              Principal Information Technology Officer at the East African Community, Tanzania

Mr Samatar Omar Elmi           Chief ICT Specialist, Africa Development Bank Group

Dr Talkmore Chidede             Digital Trade Expert at the AfCFTA secretariat

 

 

This session was primarily devoted to creating a platform for the panellists to share their views on how the African continent can enhance and facilitate cross-border digital trade. The African Union Commission (AUC), represented by Souhila Amazouz (moderator), initiated the discussion by highlighting the great strides the AUC has taken in recent years, evidenced by the development of the Digital Transformation Strategy (2020–2030), whose main objective is to achieve a digital single market in Africa. To strengthen capacities in managing data and to facilitate the movement of people and goods across the continent, the AUC has also developed the Data Policy Framework and the Digital ID Framework, following extensive consultation and collaboration with international and national organizations, as part of a harmonization strategy to create an enabling environment for a Digital Single Market in Africa.

 

Dr Talkmore Chidede, Digital Trade Expert at the African Continental Free Trade Area (AfCFTA) secretariat, reported on the progress of the Digital Trade Protocol since the Committee on Digital Trade (comprising all State Parties) was established in May 2021 to coordinate and facilitate the negotiations of the Protocol on Digital Trade under the AfCFTA. Since its inception, the committee has conducted explanatory and preparatory work: it consulted non-state actors by brainstorming with digital trade experts from the continent at high-level sessions to hear the expectations and key issues the protocol should address, and it hosted regional stakeholder consultations to hear the views of businesses and civil society organisations. Formal negotiations are set to take place between 5 and 9 December 2022, where a report on these hearings will be submitted for consideration and validation by the negotiators, together with a situational analysis mapping the state of digital trade, policy and regulatory frameworks across the continent. Once consensus has been reached on the rules of engagement and guiding principles, the Protocol will establish a continent-wide legal and regulatory framework governing intra-African digital trade.

 

Mr Jean-Paul Adam of UNECA recognized and congratulated the AU for its exemplary partnership and leadership, because digital transformation is the tool that will allow accelerated implementation of the Sustainable Development Goals. He cautioned that, collectively, we must address the gaps between the promise, the present and the potential of the AfCFTA. Among the regulatory challenges to be addressed are ensuring infrastructure and connectivity, cybersecurity, and artificial intelligence for the enablement of trade. Initiatives such as the Africa Trade Exchange (operated by Afreximbank), a platform which facilitates African companies' access to trade in goods, and the African Regional Centre on Artificial Intelligence, launched in the Republic of Congo earlier this year (with a priority focus on trade facilitation), are already in place to incentivize continued investment in harmonizing the regulatory environment across the continent.

 

In his reflections on the role that political leadership can play in enabling data to flow within and across countries, the Secretary General of the African Telecommunications Union, Mr John Omo, asserted the importance of imbuing political leaders with a sense of urgency and the necessary knowledge about the importance of data for the management of national, communal and regional economic systems. He emphasized the need to address the asymmetry between technical stakeholders and the political class, which has access to grassroots networks and can initiate skills transfers. Since no single organization or individual, whether in politics or the private sector, has a monopoly on knowledge in this domain, it is imperative that everyone in the ecosystem is brought together for the purposes of data management; this will also bring to light any institutional overlaps and jurisdictional conflicts in partnership engagements in Africa.

 

Mr Daniel Murenzi of the East African Community shed light on the domestication of the DTS within the region, underscoring the importance of coordinated implementation and highlighting the gains made thus far under the umbrella of the “EAC Single Digital Market Vision and Digital Agenda”. Buttressed by four pillars, namely Online, Data, Connectivity and an Enabling Environment, this Regional Economic Community is promoting digital trade by ensuring all foundational components work across borders, removing trade and customs barriers, ensuring data protection and privacy laws allow cross-border data transfers, sharing cybersecurity resources and removing cross-border barriers to infrastructure and connectivity (in both wholesale and retail). Echoing the importance of addressing supranational issues in collaboration with the various economic communities as well as the African Union Commission in order to ensure overall consistency, Mr Samatar Omar Elmi of the African Development Bank introduced the Upstream Project for Digital Market Development in Africa. This $9.73 million project, which supports the implementation of both the AfCFTA and the DTS and kicks off in Addis Ababa in January 2023, contributes to digital enablers such as universal access to broadband infrastructure, a sovereign African cloud, an African digital market, and e-commerce and digital trade promotion programs for micro, small and medium enterprises and start-ups. In the main, it aims to facilitate the creation of a conducive ecosystem for digital trust, skills and networks of African experts.

 

Within this context, Dr Ify Ogo, drawing on her extensive experience at UNDP supporting member states in the first phase of AfCFTA coordination, accentuated the reality that states will negotiate from their own interests, which must be effectively reconciled in order to give expression to the Protocol. She called on these actors to reflect on the constructs of these rules, the reality they are intended to create and whether they fully serve the interests of the African continent. Similarly, when asked about the importance of provisions in the Intellectual Property and Competition chapters of the AfCFTA, Mr Kenneth Muhangi (a World Economic Forum 4IR Committee Member) stressed the importance of member-state buy-in for minimum standards, since this consensus provides the foundation needed for harmonization. To further underscore the importance of reciprocity, he shared the view that intellectual property will be the driver of digital trade because it gives companies the confidence to trade freely, in the knowledge that their goods and brands will be respected within each country on a mutual basis.

 

This session highlighted the incredible opportunity which currently exists to galvanize the synergies in the work of the multiple agencies working on this topic across the continent. By strengthening cooperation, raising awareness about shared priorities to ensure complementarity between the different initiatives and building individual and collective capacities, the African Continental Free Trade Area (AfCFTA) can facilitate an integrated approach which promotes the shared prosperity from global digital dividends by enabling a just and data-driven Digital Single Market in Africa.

 

IGF 2022 DC-DT Fact-checking the DNS: towards evidence-based policy-making

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

The session focused on access to DNS-related data for informed decision-making at a time when the DNS is receiving a lot of attention from policy-makers. Tensions discussed included disrupted measurements in the face of encrypted DNS, emerging proposals like the DNS4EU initiative, and the distribution of roles and responsibilities of actors in the DNS value chain. The various speakers commented on the rich nature of the DNS, being both commerc

Calls to Action

Our approach to tackling these policy questions has to be forward-looking, thinking 20 years ahead about how we want the space to evolve. Participants highlighted the importance of maintaining multistakeholder conversations on the subject, taking into account views from civil society, governments and the technical community.

Session Report


 

The DNS is receiving increased attention from policy makers. This session sought to explore to what extent the DNS ecosystem relies on data to make informed decisions, what tensions have been created by increased DNS encryption in terms of accessing data and how to develop evidence-based solutions to tackle DNS Abuse. The session collected views from various stakeholders from the ecosystem.

 

We started with Geoff Huston from APNIC (technical community). He spoke about how the DNS was not designed thinking it would evolve into a global network, and therefore it was built with virtually no security features; it was ‘trusty’. This became a vulnerability when the Internet consolidated as a global communications network: any adversary could intrude upon the DNS, observe what was happening and tamper with the answers. Following the Snowden revelations, a series of protections were built around the DNS (DNS messages are encrypted, sources of information are authenticated, DNS content is now verifiable, etc.). However, as a result, the DNS has become obscure, “gone dark”, generating problems of its own in preventing abuse and keeping tabs on drivers for centralization. In his words, when we speak of evidence-based/data-based policy making around the DNS, “there is no DNS data to talk about, it just does not exist.”

 

Mallory Knodel from the CDT (civil society) challenged the notion that the DNS has gone dark. Her view is that just because we had not secured the data before does not mean there was a good reason for DNS queries to be global data. The data was visible before, and we have now found ways to make it private. She agrees, however, that this has generated issues and broken things, and to her it is important, from a public interest and human rights perspective, that we acknowledge those issues. These include the initial centralization of services to make DNS lookups private, challenges for abuse mitigation, and censorship becoming more blunt in regimes that previously relied on DNS data for blocking and filtering.

 

Emily Taylor from OXIL inquired about the availability of data for researchers to study the impact of encrypted DNS, highlighting how hard it is to obtain such data for studying the resolution space. Mallory Knodel pointed out that measurement initiatives tracking censorship are confident they will be able to overcome that challenge. Geoff said that the reason query data is not shared is that it has serious privacy implications; most operators do not release it for good reasons. When you strip query data of personal, sensitive data you are left with something quite limited. As a result, our window into what is happening at the level of the DNS is small and getting smaller, and no regulation will change that. The more functions are picked up by applications (QUIC, DoH), the smaller the role of networks will become. This push is the result of the interests of large operators and of what they perceive users want in terms of privacy.
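The point about how little remains once query data is stripped of personal detail can be illustrated with a minimal sketch. The record fields and the /24 (IPv4) / /48 (IPv6) prefix-truncation scheme below are illustrative assumptions, not any operator's documented practice:

```python
import ipaddress

def anonymize_record(client_ip: str, qname: str, qtype: str) -> dict:
    """Strip personally identifying detail from a DNS query log record.

    Truncating the client address to a network prefix is one common
    anonymization scheme; once applied, per-user analysis is no longer
    possible, which is why the remaining data is of limited use.
    """
    addr = ipaddress.ip_address(client_ip)
    prefix = 24 if addr.version == 4 else 48
    network = ipaddress.ip_network(f"{client_ip}/{prefix}", strict=False)
    return {"client_net": str(network), "qname": qname, "qtype": qtype}

record = anonymize_record("203.0.113.77", "example.org.", "A")
# The full client address is gone; only the /24 prefix remains.
print(record["client_net"])  # 203.0.113.0/24
```

Even this coarse record still reveals which names were looked up from which network, which is why most operators choose not to release query data at all.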

 

The conversation then moved on to pick up on existing industry practices to deal with abuse on the DNS with the participation of EURid, CENTR, .ng and Verisign and contributions from academia from Latin America.

 

Peter Van Roste from CENTR spoke about DNS4EU. The initiative seeks to create a European-wide public recursive resolver. The reasoning has to do with concerns by European institutions that some dominant players, especially those interested in the valuable data generated by public recursive resolvers, have captured a significant market share, and that public recursive resolvers are typically not European. DNS4EU was probably informed by market or commercial concerns related to the value of resolver data. CENTR welcomes the initiative, as long as use of the resolver is not made mandatory, and noted that nearly a dozen European ccTLDs are running local instances of public resolvers, contributing to the diversity and resilience of European networks.

 

Jordi Iparraguirre from EURid spoke about actions taken by EURid to prevent harm to users of the .eu space. These actions are evidence-based policies, but they are also informed both by the existing legal framework (the contract with the European Commission, local law in Belgium, the GDPR) and by EURid’s commitment to the .eu brand and to customer protection. Concrete existing actions include keyword detection on domain names (for example, searching for specific strings related to the COVID pandemic) and analysis of domain names at the time of registration; improved Know-Your-Customer procedures to check Whois data; and information sharing with law enforcement on domain names suspected of harmful activity.
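The keyword-detection step can be sketched roughly as follows. The watchlist, function name and matching logic are illustrative assumptions; EURid's actual screening is certainly more sophisticated:

```python
# Illustrative sketch of keyword screening at registration time.
# The watchlist and matching rules are assumptions, not EURid's real ones.
SUSPICIOUS_KEYWORDS = {"covid", "vaccine", "corona"}  # pandemic-related strings

def flag_for_review(domain: str) -> bool:
    """Return True if the registered label contains any watched keyword."""
    label = domain.lower().split(".")[0]   # "covid-test" from "covid-test.eu"
    label = label.replace("-", "")         # normalize separators before matching
    return any(keyword in label for keyword in SUSPICIOUS_KEYWORDS)

print(flag_for_review("covid-test.eu"))  # True: contains "covid"
print(flag_for_review("example.eu"))     # False
```

A flagged name would then go to human review rather than automatic suspension, consistent with the evidence-based approach described above.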

 

When considering reliance on data for abuse mitigation, Keith Drazek from Verisign highlighted the importance of recognizing that there are different actors with different roles, responsibilities and operational capabilities. He highlighted ongoing activity at ICANN to identify improvements in DNS abuse mitigation focused on threats that are not content-related. The gTLD registries and registrars have recently sent a letter to ICANN saying they are prepared to take on additional responsibilities to deal with DNS-related security threats. But there is also a need to focus on content-related abuse and to consider additional tracks for dealing with abuse in a multistakeholder way, which may belong outside of ICANN. He also highlighted the need to work with other actors so they understand what it means to take action at the DNS level when trying to mitigate broad abuse. Mark Datysgeld, chair of the DNS abuse group in the GNSO, supported Keith’s points about work at ICANN and mentioned that the group has also submitted a letter to ICANN asking to renegotiate contracts to change responsibilities.

 

Beyond experiences from the global North, there were contributions from African and Latin American perspectives. Carolina Aguerre explained how the technical community in the LAC region is aware of the level of centralization that exists (the region relies on large, international providers). She also pointed out that concerns around privacy on the DNS are not being matched with initiatives to deploy privacy protections and protocols at the architecture level. APNIC has done a good job of mapping the adoption of protocols for the protection of privacy on the DNS; in Latin America there are some initiatives, as in Brazil and Chile, but very little data. The community is currently focusing on raising awareness among users and policy makers around this particular issue. It will likely generate tensions in the region, as DNS blocking is common practice.

 

Biyi Oladipo from .ng spoke not just about Nigeria but about how ccTLDs are managed in Africa. He expressed concerns about recent developments where governments take over the running of ccTLDs, and the potential implications of such developments for how freely and easily users access domain names. The regulatory environment is far more complex in Africa, with each country having its own data protection laws; he sees data protection as a potential opportunity for evidence-based policymaking to take place. Lastly, in practice few domain names are taken down due to abuse on the continent; this is an additional area for collaboration with law enforcement and one where an evidence-based system would be important. Some developments are taking place: a coalition is forming to collaborate with law enforcement on abuse and takedowns.

 

Lastly, Nigel Hickson from DCMS added a government perspective. He highlighted the importance of government officials being involved in these discussions to address valid government concerns, as they impact government policy and regulatory development. He also called on the group to reflect on ongoing UN processes and our vision for the DNS, particularly in the face of the WSIS+20 review and the UNGA.

 

IGF 2022 WS #58 Realizing Trustworthy AI through Stakeholder Collaboration

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

- Key takeaway 1: As more and more countries are planning to introduce some type of regulation over AI, all relevant stakeholders should seize this window of opportunity for collaboration to define concepts, identify commonalities and gather evidence in order to improve the effective implementation and enforcement of future regulations before their launch.

,

- Key takeaway 2: Ensuring that all actors, from both technical and non-technical communities, exchange and work together transparently is critical to developing general principles flexible enough to be applied in various specific contexts and fostering trust for the AI systems of today and tomorrow.

Calls to Action

Stakeholder collaboration remains critical as the global community continues to grapple with how to tap the benefits of AI deployment while addressing the challenges caused by the rapid evolution of machine learning. Ongoing human control remains critical in the deployment of AI advancements to ensure that "the algorithms do not get out of our control". Critical to this is breaking down silos between engineers and policy experts.

Session Report

Speakers:

  • Norberto Andrade, Global Policy Lead for Digital and AI Ethics, Meta
  • Jose Guridi, Head of the Future and Social Adoption of Technology (FAST) unit, Ministry of Economy, Government of Chile
  • Clara Neppel, Senior Director of European Business Operations, IEEE
  • Karine Perset, Senior Economist/Policy Analyst, Artificial Intelligence, OECD
  • Mark Datysgeld, Internet Governance and Policies consultant, São Paulo State University, Brazil

 

  1. Stakeholder cooperation is at the core of the development of frameworks for trustworthy AI

As a general-purpose technology, Artificial Intelligence (AI) presents tremendous opportunities as well as risks, while having the potential to transform every aspect of our lives. Alongside the development of new systems and uses of AI, stakeholders from the public and private sectors, as well as civil society and the technical community have been working together towards the development of value-based and human-centred principles for AI.

The Internet Governance Forum is the perfect place to discuss the different existing initiatives to create policy frameworks for trustworthy and responsible AI, including the work conducted by UNESCO with the “Recommendation on the Ethics of Artificial Intelligence”, or the Council of Europe with its “Possible elements of a legal framework on artificial intelligence based on the Council of Europe’s standards on human rights, democracy and the rule of law”.

The OECD AI Principles developed in 2019, as the first internationally agreed principles, have set the path towards an interesting and necessary process involving different stakeholders to develop a policy ecosystem benefiting people and the planet at large. Similarly, global standards developed by the Institute of Electrical and Electronics Engineers (IEEE) aim at providing a high-level framework for trustworthy AI, while giving the possibility for the different stakeholders to operationalize them according to their needs.

  2. Standards versus practice: applying frameworks to real use-cases

In that regard, both public and private sector organizations face unique challenges in relation to AI according to their specific situations and requirements. It is therefore critical for policy makers to aim to bridge the gap between policy and technical requirements, as Artificial Intelligence systems are undergoing constant improvements, changes, and developments. The case of Generative AI is especially representative as, in less than a year, it superseded the discussion on deep fakes, which shows how fast the technology is evolving and the need to involve engineering and coding communities from the very start of policy discussions.

When seeking to move from principles to practice, difficult decisions must be taken by organisations due to value-based trade-offs, as well as technical and operational challenges dependent on the context. These challenges are often dealt with in silos within the technical community and not documented. Breaking down these divisions is essential for companies to implement the principles in a holistic manner and better understand the conflicting nature of some of the principles.

For example, ensuring fair and human-centred AI systems may conflict with the requirement of privacy. In some cases, AI developers need to access sensitive data and detect whether specific biases occur to know if models are impacting people with specific attributes, but this raises questions about the privacy of people’s data. A similar tension can be seen between requirements of transparency and responsible disclosure regarding AI systems, and the explainability of predictions, recommendations or decisions coming from AI systems, as specific technical knowledge might be required to fully understand the process.
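The fairness/privacy tension can be made concrete with a minimal sketch of one common fairness check, demographic parity: computing it requires the auditor to see the sensitive attribute itself. The data, metric choice and function names below are illustrative assumptions, not any organisation's actual audit procedure:

```python
# Minimal sketch: measuring demographic parity requires access to the
# sensitive attribute -- the crux of the fairness/privacy trade-off.
# All data here is toy data; the metric choice is an assumption.

def demographic_parity_gap(predictions, sensitive):
    """Difference in positive-prediction rates between groups."""
    groups = {}
    for pred, attr in zip(predictions, sensitive):
        groups.setdefault(attr, []).append(pred)
    rates = [sum(preds) / len(preds) for preds in groups.values()]
    return max(rates) - min(rates)

preds     = [1, 0, 1, 1, 0, 0, 1, 0]
attribute = ["a", "a", "a", "a", "b", "b", "b", "b"]  # sensitive data the auditor must see
gap = demographic_parity_gap(preds, attribute)
print(gap)  # 0.75 - 0.25 = 0.5
```

Without the `attribute` column the gap simply cannot be computed, which is why bias audits sit uneasily with data-minimisation principles.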

  3. Towards implementation: operationalizing and monitoring policy frameworks in practice

To ensure the implementation of frameworks for trustworthy AI, international organizations and associations are developing frameworks to effectively manage AI risks by defining, assessing, treating, and monitoring AI systems, but also by working on the definition of a common language and understanding, designing different agile instruments to fit the different stages of the AI life cycle, and fostering training and skill-development opportunities.

As different use-cases and applications of AI carry different risks, defining the most important values and principles depending on the specificities of a situation is critical to properly ensure the application of AI systems in a trustworthy and human-centric manner. Further, assessing the risks in a global, interoperable and multistakeholder way would allow the identification of commonalities to improve the effectiveness of implementation and facilitate enforcement of value-based principles. Alongside this risk-assessment approach, the OECD is proposing to collect good practices and tools to share knowledge between different actors and help a wider implementation of its principles. Finally, monitoring existing and emerging risks related to AI systems through different means (legislation, standards, and experimentation, for example) would allow the creation of a governance system informed by past and present AI incidents while providing foresight on future AI developments.

Regulatory experimentation is of the utmost importance to ensure multistakeholder participation and to resolve a number of technical challenges, including the opaque and dynamic nature of AI systems, the difficulty of measuring impacts, and the uncertainty around the effects of regulation on technology. In the case of Chile specifically, the successful use of regulatory sandboxes benefited from a participatory process involving the expert community, including engineering and coding communities, and policy makers in a transparent manner, which proved to break down the prejudices and barriers both groups had prior to working together. Other initiatives exist to connect policy makers, academics, and technology companies, such as Open Loop, a project looking at understanding how guidance resonates in the real world before being implemented.

 

Working towards the realization of standards for trustworthy and human-centred AI proves timely and ahead of the curve, as regulators are starting to design and implement regulations. Strong evidence-based analysis remains essential to feeding the policy dialogue, which is to be conducted at a truly global level by involving developing countries. While the different stakeholder communities present unique insights, objectives, and sometimes conflicting views, their common objective remains to develop a holistic framework ensuring trustworthy and human-centred AI.

IGF 2022 WS #183 Digital Wellbeing of Youth: Selfgenerated sexualised content

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Since legislation usually refers to consensuality in order to differentiate images of abuse and sexual violence from usual behaviour in adolescence, a common definition of what "consensual" means is necessary, taking into account cultural differences.

,

General Comment 25 on the rights of children in the digital environment provides a framework to address the issue of sexualised content, which needs to be translated into national legislation and transnational measures.

Calls to Action

In order to address the issues properly, consider the wording in regard to self-generated sexualised content, the definition of "consensual", and the wording in regard to sexual abuse, sexual exploitation and sexualised violence.

,

Make the voices of young people heard in all matters that affect them and give the views of the child due weight in accordance with the age and maturity of the child. Take into account that sexual orientation and the formation of one's own sexual identity is a developmental task in adolescence.

Session Report

The first step in the workshop was to define the term self-generated sexualized content. To that end, Sonia Livingstone (Professor of Social Psychology, Department of Media and Communications, London School of Economics and Political Science) differentiated three definitions of self-generated sexual content and the implications that these have for youth and the law. Self-generated sexual content can be produced in:

  1. An exploitative situation with a remote adult. This includes e.g. extortion or pressuring the young person into sending sexual material of themselves. General Comment 25 emphasizes the importance of safeguarding, protecting, and rehabilitating the victim and criminalizing the abuser. It also highlights the platforms' responsibility as well as regulation for both prevention and redress.
  2. An exploitative situation in which the perpetrator is also a child. In this case, restorative and non-criminal measures for the perpetrator are encouraged when possible.
  3. A fully consenting situation between children. Here a non-punitive approach based on evolving capacity should be taken. 

In all these cases, the state and business bear responsibility for any sharing of such images, for which prompt and effective takedown is vital, so that children who have been subjected to abuse are supported and helped by knowing the images are no longer there, avoiding re-traumatization.

Children say that the digital environment is critical to their capacities to develop and explore their identities, both as individuals and as members of communities. They do understand, nonetheless, that the digital environment is strongly connected to the offline environment. Thus, in addressing the risks of sexual abuse and exploitation online, children recommend not only measures that can be done within the online space, but also actions that transcend digital boundaries. Many local languages are not popular online, which is why Hazel Bitana (Child Rights Coalition Asia) also emphasized the need to make reporting sexual abuse easily understood for children and possible in their local language.

In order to address the question of which answers legislation provides, Gioia Scappuci (Executive Secretary to the Lanzarote Committee, Council of Europe) summarized the new monitoring report adopted by the Council of Europe’s Lanzarote Committee in March 2022, which aims to address the challenges raised by the significant increase in child self-generated sexual images and videos and their exploitation. The report covers 43 European state parties to the Lanzarote Convention and highlights ways to improve their legal frameworks, prevent this particular form of sexual exploitation of children, investigate and prosecute it, and enhance victims’ identification and protection. The report shows that only 11 out of 43 countries specifically address self-generated material in their legislation, and they do not distinguish between consensual and non-consensual material. The Committee strongly recommends that children should not be prosecuted for possessing or sharing self-generated sexual images and/or videos of themselves when the possession/sharing of such material is voluntary and is intended only for their own private use. The report calls for measures to assist child victims of sexual exploitation and abuse, short and long term, in their physical and psycho-social recovery. It also calls for abandoning the terminology “child pornography” in favour of “child sexual abuse material”.

Martin Bregenzer (klicksafe Germany) explained that since last year the distribution, acquisition and possession of sexual pictures of minors has been a crime by law in Germany, and the penalties have been increased accordingly. On the one hand, this is a major achievement in the fight against child sexual abuse. At the same time, the legislation creates significant hurdles for consensual sexting by young people, so teenagers are in many cases committing a crime when sexting. Since the new law came into effect, policy makers have noticed that this could backfire, so there will probably be a revision of the law in the future. He also pointed out that consensual sexting between young people can be seen as a regular and healthy part of sexuality.

Tebogo Kopane (YouthxPolicyMakers Ambassador) emphasized the role of young people as active agents/participants, but said that there is a culture of silence in much of Africa, which leads to very few open conversations regarding sexual abuse of children being held with caregivers, teachers, parents, etc. This shows that a common approach for children’s protection needs to be flexible enough to be adapted to different cultural and political contexts. She mentioned that a space for open discussion, questions, and education has to be created. Many sensitive questions are asked online instead of being put to parents, so ensuring high-quality content as well as children’s digital literacy is necessary.

The project Love Matters was mentioned by the audience. It runs regional sex-education websites, where young and pleasure-positive language is used, which attracts more young people: https://www.rnw.org/?s=love+matters

Considering further national policies and transnational strategies, Chloe Setter (Head of Policy, WeProtect Global Alliance) showed that more children have internet access nowadays, they go online younger, they use new chat platforms, and offenders are learning more, all of which makes the risk of abuse much higher. However, sexual abuse online is not inevitable; it is a preventable problem.

The speakers also discussed the question of how Internet Governance can support a common approach across different political systems and cultural backgrounds. The need for a common cross-cultural definition of consent that takes children from different backgrounds and situations into account was highlighted and at the same time identified as a challenge. There is no simple solution to these complex problems. To strike the right balance between privacy and data protection on the one hand and child protection on the other, the cooperation of all stakeholders is crucial to create safe, child-appropriate and empowering spaces.

At the end, the speakers and audience members discussed how to involve young people directly and many ideas were mentioned such as creating a children’s domain (.kids) as a safe space for children. Ensuring that children from different backgrounds and situations get a space in the decision-making process was highlighted, as not all children have supportive parents that can help them with everything, so different perspectives need to be considered.

Number of participants: overall 69 participants. 26 on-site (12 female, 14 male) 43 online (20 female, 8 male, 15 not defined)

IGF 2022 WS #454 Accountability in Building Digital ID Infrastructures

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Authorities need to right historical wrongs and promote civic education before introducing and rolling out new digital identification regimes.

,

There is a need to build into these frameworks legislation that ensures these systems are inclusive and respect privacy and data protection principles.

Calls to Action

Governments should stop rolling out digital identification programs without taking the reality of their demographics into adequate consideration.

Session Report
  • On the colonial legacy of exclusionary digital ID systems: Colonialism was designed to dominate and extract and the first targets were border communities. The manual ID was designed as a visa system to control people’s movements; the IDs contained information such as people’s names, tribes etc. These dynamics of control and domination are also present in contemporary identification systems. They require people to have their biometrics and data collected in order to access essential services, because of this, people do not have a choice but to comply.  Marginalized communities are the ones that suffer the most in this new iteration of colonialism.
  • On the conflation of digital identity and legal identity: Digital identity is just a tool; legal identity can be accomplished in various ways. We need to ask if this is the right way to accomplish it, and we need to reframe the narrative.
  • On how policy infrastructure can move from systems that enable surveillance to systems that enable social protection: The advocacy efforts have to be towards all arms of the government, we also need to understand what the executive wants to achieve with its agendas, the goalposts are always shifting. In litigation on digital ID, we need to push the court to define what the irreducible minimums are and clarify what cannot be abrogated from. There also needs to be a definition of what national security is because this has been most abused in limiting human rights, it has to be something that threatens the integrity or human rights of a country.
  • On moving away from enabling surveillance capabilities: There needs to be more done on creating systems that have privacy by design to prevent a panoptical view on everyday interactions. If digital identification is a precondition for access to essential services, many people e.g migrants and elderly people will be excluded.
  • On what the obligations are to perform human rights due diligence: We need to define the actions and effects of the actors but we don’t find this in human rights due diligence, which makes attribution for actions difficult. There’s a lot of obfuscation which is hidden from the public view in how digital id is deployed, we don’t know the tender processes, they are shrouded in national security protection it makes it difficult to get evidence on who is doing what, and that makes obligations difficult to obtain. Most of the time the client is a state, this is important in HRDD because this makes it different from contracting to a private entity because the state has control over private data and a monopoly on violence, so there is a real distinction in obligations because of this.
  • On private sector accountability: Corporations need to be accountable, and this goes beyond making sure that the systems are being rolled out; they need to ensure that all people are able to access government services. In Uganda, the digital ID system has been tied to mandatory access to essential services. This means that it becomes the single source of access to social protection. With most of the country being below the poverty line, accessing these services becomes a matter of life and death, and people such as PWDs, cross-border communities and rural communities are being left behind. These systems are exclusionary by design towards those seen as people who should not be included. Some tribal communities are denied registration because they are not considered Ugandan enough. The corporations rely on the provision of biometrics or identification documents to authenticate people in order for them to access services. This speaks to the duties corporations have: while they are providing social protection services, they have an obligation to ensure that they are inclusive. They have been touted as being more inclusive and as leading to more accountability from the government, but they are very removed from the real situation on the ground; there need to be conversations about alternatives.
IGF 2022 Open Forum #40 An internet that empowers all women: can we make it happen?

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

The session highlighted the importance of addressing the mobile gender gap – mobile internet empowers women and supports achieving the SDGs. This requires a focus on the key barriers women face, with speakers highlighting the importance of ensuring affordability of internet-enabled devices, providing women with the required skills and confidence, ensuring accessibility (for persons with disabilities) and addressing safety and security concerns.

,

Speakers highlighted the importance of partnerships and including women from the start in projects or initiatives, in particular those who are marginalised, and setting clear goals and targets.

Calls to Action

1. Commit to specific gender digital inclusion targets. This requires an understanding of women’s needs and barriers to mobile internet and use and taking targeted, collaborative action to address key barriers such as affordability, digital skills, accessibility and safety and security concerns.

Session Report

This session titled ‘An internet that empowers all women: can we make it happen?’ brought together representatives from the private sector, international development community and civil society to discuss the unique challenges that women face in accessing the internet and what can be done to bridge the digital gender divide.

GSMA opened the session and set the scene for the discussion by providing the latest data on the mobile gender gap in low- and middle-income countries (LMICs) which stands at 16%. This means that women are 16% less likely than men to use mobile internet in LMICs. GSMA noted that progress towards closing the mobile gender gap has stalled and highlighted the need for urgent action by governments and a range of other stakeholders to work together to address women’s needs and barriers to accessing and using mobile and mobile internet.

Speakers unanimously reiterated the importance of addressing the digital gender divide and noted that the internet offers an opportunity to transform women’s lives, including women with disabilities. They highlighted that improving women’s digital inclusion also creates opportunities for economic growth, and can improve women’s well-being and society at large. The cost of exclusion was also raised, and a speaker cited research which estimates that women’s unequal access to and use of the internet has cost low- and lower-middle-income countries $1 trillion over the past decade.

The barriers to women’s digital inclusion were discussed at length. The session speakers identified a number of the key barriers to women’s internet use in Ethiopia and across LMICs which were related to:

  • Affordability of internet-enabled devices;
  • A lack of digital skills and awareness of the benefits of mobile internet;
  • A lack of access to devices and a lack of accessibility of devices and online platforms;
  • Safety and security concerns including online harassment; and
  • Lack of relevant content and services including content in local languages.

Panellists shared practical examples of how these barriers can be addressed and how their organizations are working to address the barriers. For example:

  • Digital Opportunity Trust (DOT) Ethiopia’s programs equip women with technology, business, entrepreneurial and digital skills so that they can create opportunities for themselves and participate fully in the economic and social development of their communities.
  • Vision Empower designs and teaches curriculum that is accessible for children with visual impairments on digital literacy and STEM related subjects.
  • Safaricom has implemented device financing programs to lower the upfront cost of purchasing a smartphone and to date has sold around 1 million devices using a pay-per-use model.
  • Africa 118 supports small and medium sized enterprises in Africa with cost-effective digital services to reach their target audiences and improve their online presence.

Additional noteworthy calls to action that speakers shared were around:

  • The need for stakeholders to hold themselves accountable by setting targets to reach more women through their initiatives and to consistently monitor and evaluate against their key performance indicators;
  • The need to mainstream gender in ICT policies;
  • The need for policymakers to focus on and improve the implementation of legal frameworks and policies that aim to protect women's online safety;
  • The need to include diverse women, particularly marginalized women and women with disabilities, in the design of projects, initiatives and policies aimed at improving women’s digital inclusion from the start;
  • The need for the courage of private sector organizations to pursue initiatives to drive women’s digital and financial inclusion, particularly initiatives aimed at improving affordability of internet-enabled devices; and
  • The need for digital skills training content that is tailor-made to women’s needs and preferences and use cases.

UNCDF concluded the session by summarizing the discussion and highlighting the importance of partnerships to catalyze collective action to tackle the barriers women face to becoming financially and digitally included. UNCDF shared the work they are doing in this space and invited stakeholders to join the Women's Digital Financial Inclusion Advocacy Hub (WDFI) network in Ethiopia.

 

The session put gender front-and-centre. This was reflected by the diversity of the panellists (4 women, 3 men) as well as the topic of discussion, which concerned the barriers to women’s digital inclusion. The session was attended in-person by approximately 70% women and online by 85% women. 

IGF 2022 WS #395 Creating a safer Internet while protecting human rights

Updated:
Connecting All People and Safeguarding Human Rights
Session Report

There is a clear need for greater regulation of online safety, but human rights, specifically the importance of maintaining freedom of expression, must be a major consideration throughout. The panel noted the dangers of internet shutdowns, especially for marginalised groups, and the importance of collaboration with the tech sector, and of the use of new technologies such as AI, when considering solutions for improving online safety. The panel also noted the importance of privacy and of avoiding government overreach, highlighting technologies such as end-to-end encryption, which should not be compromised.

IGF 2022 WS #393 Protect the Digital Rights and Data Security for the Elderly

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

The convergence of "digitalization" and "aging" has become a defining feature of the new era, and making digital technology suitable for an aging population has become a global social governance topic. Governments, international organizations, Internet enterprises, non-governmental organizations, individual citizens and other actors urgently need to provide more reasonable, sound and effective protection measures for vulnerable groups such as the elderly.


At present, different actors have formulated and taken various measures, but "three big gaps" remain among vulnerable groups such as the elderly: in network access, in knowledge and skills, and in risk awareness. China's experience and practices in data security and personal information protection for the elderly can serve as a reference for the international community.

Calls to Action

In response to the United Nations call to "leave no one behind", we will further explore, study and formulate laws, plans, policies, standards and norms, and continuously improve the policy system for bridging the digital divide for the elderly and the rules and standards for making the Internet age-friendly. The IGF should exert greater influence, guide more stakeholders and responsible parties to participate, and deliver greater value.


Countries and entities should maintain close exchanges and cooperation, learn from each other's effective measures and methods, effectively protect the digital rights and data security of the elderly, and create a friendlier digital environment. Domestic and international practice and cooperation should be strengthened, connecting all sectors for joint consultation and cooperation on relevant policy formulation.

Session Report

1. Views by Stakeholders

(1) In recent years, relevant Chinese government departments have successively issued work plans and carried out special actions. The Cyberspace Administration of China, together with four other departments, jointly issued the 2022 Work Points for Raising Digital Literacy and Skills of the Public, taking multiple measures to raise the digital literacy and skills of all people. The next step should be to strengthen work on the elderly and other vulnerable groups, further pursue international exchanges and cooperation, and analyze excellent practical experience.

(2) To improve the digital environment for the elderly and protect their digital rights, we should safeguard their legitimate rights and interests in personal information and data, work to improve the quality of age-friendly products and services, and strengthen the formulation of international rules and international practice and cooperation.

(3) Governments and international organizations (standardization and technical organizations) should strengthen coordination and collaboration and establish dialogue mechanisms.

(4) China attaches great importance to ensuring the data rights and data security of Internet users, and has formulated the Implementation Plan on Effectively Solving the Difficulties of the Elderly in Using Intelligent Technology. Relevant departments have issued more than 20 policy documents on the elderly's travel, medical treatment, payment and other aspects, striving to solve the problem of the "digital divide" facing the elderly.

(5) Learn from China's experience and establish a China-Africa cyberspace think tank; carry out network security training for the elderly; and call on Internet companies to further adapt their products and services for aging users.

(6) Work concerning the elderly should be carried out with humanistic care, considering their needs in all respects, giving full play to the role of multiple actors to improve the overall digital literacy of the elderly, innovating data circulation, and using the underlying architecture to strengthen data security.

(7) In protecting the digital rights and interests of the elderly, China has continued to take legal measures to provide basic protection, and has deepened concrete results through training and through the "feeding back" of digital skills from the younger generation to the older generation.

(8) As an Internet enterprise, Kwai has established an anti-fraud governance system combining advance education, in-process blocking and enforcement operations, contributing to protecting the rights and interests of vulnerable groups such as the elderly from infringement.

2. Feedback by Remote Participants

About 70 guests from China, the European Union, Africa, the United States, Russia and other countries and regions participated in the workshop remotely via Zoom and Tencent conferencing. About 26 of the online participants were women. During the workshop, remote participants actively asked questions and exchanged views on the theme.

IGF 2022 Town Hall #54 Help! The Kill switch is taking away my limited agency

Updated:
Connecting All People and Safeguarding Human Rights
Session Report

The session had speakers from regions impacted by internet shutdowns, with reflections from activists, researchers and lawyers on shutdowns in their parts of the world. A major theme was how shutdowns impact people: internet shutdowns exacerbate humanitarian crises and cut off support to journalists and human rights defenders. On network measurement, it was felt that it helps increase awareness of internet shutdowns around the world. A recurring pattern is that access to social media platforms is blocked in times of conflict and unrest. The impact is felt hardest in marginalized communities, where it often goes unnoticed. In the case of Myanmar, it was reported that the shutdown had lasted for 18 months in 54 townships, where 12% of the population does not have access to data. Loss of the internet means loss of access to each other; for women this is especially hard, as they are married off at an early age because of lack of information. In the case of Balochistan, it was reported that the general public, especially students and young professionals, is becoming increasingly frustrated with internet shutdowns. There is not just outright shutdown but also throttling, where speed is downgraded, and there is a lack of regional and national-level cooperation. In the case of India as well, there is no government documentation of shutdowns, which leaves the onus on civil society to document shutdowns and to advocate and litigate against them.

The key takeaways of the session were:

1. The need for more documentation to understand the nuanced impact of shutdowns.

2. The need for coordinated efforts to actually understand shutdowns.

3. More transparency on shutdowns from both government and private institutions.

IGF 2022 WS #318 Gen-Z in Cyberspace: Are We Safe Online?

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

The safety of children and young people is not solely the parents' responsibility; it requires the involvement of many distinct stakeholders, including government, technical experts and civil society. There was consensus that the Internet is a double-edged sword for children: it is imperative not only to give them the safety they deserve on the internet, but also to create a safe space for them to develop and grow.


There are many frameworks regulating child online safety, including digital competencies. Essentially, more awareness has to be instilled in children and young people, as they ultimately need to know how to protect themselves. To improve child online safety, stronger punitive measures must be enacted and all stakeholders must move from policy to action.

Calls to Action

All stakeholders have a vital role in facilitating the implementation of suitable policies to improve child online safety.

Session Report

 

Report for IGF 2022 WS #318 Gen-Z in Cyberspace: Are We Safe Online?

Rapporteurs: 

Dalili Nuradli, [email protected]

Ariff Azam, [email protected]

 

Before introducing the speakers, the moderator kicked off the session by asking everyone for their perspectives on child online safety. Child online safety can be understood as the frameworks, policies and regulations aimed at making sure that children's online engagement is safe and secure, and that young people are in no way exploited, taken advantage of, or end up losing their lives because of being active in the online ecosystem. The internet, readily accessible via mobile phones and other electronic devices, has given children and young people levels of access to information, culture, communication and entertainment that would have been unimaginable 20 years ago. As a result, we face a myriad of challenges today. 

From the perspective of the domain name space, child online safety is a very complex notion that requires a lot of participation and responsibility from different stakeholders, education and platforms. It is imperative not just to restrict certain types of content, such as extreme violence and child pornography, which should be unquestionably unavailable to children. On the other end, it is also essential that children be able to express themselves, create content that is appropriate for them, learn, and benefit from what the internet can bring to their education and development. From the tech community's standpoint there is a double-edged sword here: we need to allow children space to grow and learn in a safe environment, while also being more responsive to abuse and questionable materials. This takes a lot of coordination with law enforcers, trusted partners, trusted notifiers, and child rights organisations such as the Internet Watch Foundation, which are experts in taking down child sexual abuse material. When it comes to child safety, however, there ought not to be a differentiation between online safety and offline safety; both carry the same weight of concern, and children's safety must be treated the same way in both settings. Protecting children's safety online is therefore not a responsibility placed solely on parents: other stakeholders, including government, civil society, and society as a whole, should also partake in championing children's online safety. 

On updating the current frameworks governing child online safety so that policies are practical to follow in different regions, it must be foregrounded that many of these challenges spiral to the point where child online safety cannot be combatted because of inadequate awareness among parents and the older generation. This ignorance results in the stunted growth of legislation concerning child online safety, as parents and older generations perceive it as a 'silly matter' not worth pondering. Therefore, rules or frameworks that govern instilling awareness in parents and others should be enacted. 

Before discussing the effectiveness of regulations, it is indispensable to consider their implementation, e.g. data protection and privacy acts. Although such legislation is in force, there is still a dearth of implementation. For instance, the United Kingdom has the General Data Protection Regulation, deemed to be among the strongest data protection regimes. Notwithstanding that, TikTok was fined 92 million in 2021 and 13 million in 2022 over privacy concerns affecting children. The yardstick of effectiveness is therefore not the existence of laws regulating child online safety, but the implementation of those laws, which should be stressed in order to ensure a safe digital space for children and young people. 

One initiative we can pursue is to provide parents with awareness programmes from the outset. The older generation is often not 'tech-savvy', which makes it difficult for them to decipher their children's behaviour in cyberspace. Hence, parents' participation can also take the form of joining awareness programmes or webinars that shed light on what current technology can do and its effects on their children.

Moreover, one mechanism that stakeholders can employ is an age verification system. To illustrate, United States websites relating to alcohol, gambling and 18+-rated movies compel users to verify their age. However, this raises the problem of how service providers are to verify somebody's age when accessing a database to do so may itself constitute a data breach. A further problem is the absence of universal standards for tackling this issue across different jurisdictions. For example, social media users are required to confirm their age, yet this is never really verified, since users do not have to tender identity documents; most of the time, verification relies on an honour system. The first step towards solving this issue is to create secure standards under which service providers are able to access certain databases, while still respecting data privacy laws and the jurisdiction of each country. It is undeniable, however, that such secure standards remain flawed, as they only cater for documented persons: requiring every online platform user to tender identification documents would precipitate a crisis for undocumented people. It is significant to revert to the fundamental attribute of the internet, which is that it should be open to everyone. Age verification also raises the problems of data breaches, data privacy, data regulations and how to implement them. 

Given the scattered standards of child online safety, one mechanism that application developers may utilise is the creation of minimum standards and norms for child online safety. This would ease the problem developers encounter in appraising internet content such as pornography and sexual abuse material. For instance, applications could have different modes enabling parents to indicate whether a device will be used by a child, making it easier for them to monitor and determine who accesses it. Children nowadays have devices as early as age 3 or 4, and the content a 5-year-old is exposed to should not be the same as that for a 15-year-old. Segregating what content should be accessible at different ages, and categorising children demographically by age, would help mitigate issues vis-a-vis child online safety.

Moving on, one of the vital questions raised in this session was what possibilities exist to eventually achieve child online safety on the internet. It is feasible to provide a safe online environment for children, yet there is a long way to go before children can safely use the internet without external threats. This problem relates to the huge gap between digital immigrants and digital natives, digital literacy issues among parents, and lack of control over the technology itself. Thus, an alternative is to create an internet for kids: .kids, inaugurated by DotAsia, is a conspicuous example of a viable approach to keeping children safe online. It gives children a space free of extreme content such as pornography, sexual abuse material, gambling or violence, without constraining them from expressing themselves and exploring the internet. It is noteworthy, however, that preserving children's safety on such a platform demands the utmost reliability and integrity from the educators, teachers, parents and other people around the kids. 

Another solution is to go down to the children's level of awareness, for example by creating comic strips that show how to deal with the internet. Children will be invested in reading them, as the images, graphics and dialogue convey the safety measures required of them on the internet, and children will be more inclined to actually read such reference materials. It is also critical to address the problem of parents' unfamiliarity with technology, as parents are the first point of contact for children's development. Regulations concerning online child safety must be developed in a way that parents can receive and absorb, with measures tailored to parents' level of understanding and cognizant of how parents' lives differ across experiences and regions. 

On a further note, in answering the question of how we should develop a mechanism to ensure stakeholders fulfil their responsibility for child online safety, one important mechanism is to make global efforts and establish an international convention on the issue. This is important, as children are a vulnerable class of society. To be effective, strategies need to incorporate measures and messages appropriate to different ages and levels of understanding. This also requires the participation of children and young people, to learn their opinions and feedback. We also need to empower adults and educators and give them support where they lack understanding of these issues. Not only that, the golden rule of "a little less talk, a little more action" should be applied, and cross-regional collaboration should be consistent and should increase globally in order to combat child online safety issues. Moreover, there is a need for synergy and collaboration between government and civil society in the promotion, protection and fulfilment of children's rights. Hence, all stakeholders must strengthen the capacities and processes relating to the realisation of children's rights and bring everyone to the table for this discussion. 

All speakers and participants agreed that social media companies, i.e. BigTech, are accountable when it comes to child online safety. Given the centrality of the private sector to the internet, BigTech has major responsibilities for child online protection. Social media companies have an obligation both to respect human rights and to prevent or mitigate human rights harms directly linked to their operations, services, products and by-products, such as advertisements shown to children online. Child abuse and exploitation are manifestly adverse human rights impacts, so social media companies should be held accountable. There must be regulation of the data stored and of the advertising of certain products to children. For example, applications such as WhatsApp, Telegram and Facebook store data of children and teenagers, and these companies have surely retained that data; there needs to be improvement in how texts, images and videos concerning children and teenagers are stored and removed. Furthermore, to ensure the safety of children online, it is necessary to introduce responsibility for global platforms at the legislative level. Responsibility should be comprehensive: all stakeholders should be responsible, including online platforms and social media, since children and young people spend much of their time on these platforms. This issue should be raised in other Internet Governance forums, with the aim of involving BigTech in this crucial discussion, and breaches by BigTech that concern child online safety should also be addressed in upcoming forums. Remedies for such breaches should revolve around fines and, to a certain extent, shutting down businesses, i.e. the factors that hurt them the most, in order to secure compliance.

In conclusion, the three takeaways from this session are the dire need for awareness, the push for implementation, and the need for action from all ends, regardless of sector, to address online child safety. It is thus vital to strengthen all stakeholders' responsibilities, to start with the laws already in place concerning online child safety, and to ensure their implementation with the cooperation of all stakeholders. Moreover, it is imperative to create more awareness by educating parents and children. Most importantly, we must emphasise punitive measures to stop online predators and offenders. Finally, all must move from policy to action, as children's online safety is a universal issue that must be addressed. 

 

IGF 2022 WS #406 Meaningful platform transparency in the Global South

Updated:
Addressing Advanced Technologies, including AI
Session Report

 

At our session at IGF, we sought to identify key factors for regulators in the Global South to consider as they contemplate transparency regulations for social media platforms. We discussed the kinds of regulatory interventions being contemplated, the kinds of harms sought to be addressed, and safeguards and other considerations that regulations must account for. The following are the key themes and conversations that emerged over the course of the discussion.

Participants and panelists spoke of the importance of using a two-pronged approach in considering transparency mechanisms for platform governance, which provides transparency obligations for both platforms as well as governments. The importance of considering how different States in the Global North and the Global South have challenges which are contextual to them was also highlighted. Transparency regulations must be framed such that they do not become tools for enhanced control over speech, and applying transparency requirements to States is essential in this regard.

The panelists spoke about the importance of recognising that transparency is an iterative process which will have to adapt to the changing technological and regulatory environments, and will evolve based on insights gained from information provided by platforms. As first steps, it would be important to develop enabling frameworks to see what kind of information would be useful for platforms to provide, and to incorporate measures such as data audits and researcher access to platform data.

Fernanda Martins spoke about experiences on platform behaviour during elections, and on the importance of working together to reduce political violence and misinformation in Brazil. She highlighted how the harms of disinformation or political violence are not limited to election periods, but are rather spread across broader timelines, meaning that platform efforts to tackle these behaviours could not be restricted to election times. Fernanda also spoke about the unpredictable nature of social media platforms – changes in governance or ownership structure have significant implications on online speech, and harms such as political disinformation and violence. Platforms can change behavior in significant ways if they are bought or sold, and such decisions can have massive effects on political speech and have other real-world consequences.

Shashank Mohan spoke through some of the goals and challenges of operationalising transparency. Ideally, transparency would lead to a fair and unbiased experience for users on social media platforms, and a system that respects user rights. Any measures to operationalise transparency would have to include contextual factors such as the scale of relevant populations and the level of heterogeneity. Information provided without accounting for such considerations could be incomplete or have limited utility - for example, broadly worded requirements for transparency in the context of content takedowns may mean that platforms provide broad metrics on their moderation efforts and not account for nuances in local contexts which may be necessary to address harms. This would not serve the purpose of transparency regulations, and therefore regulatory interventions would need to balance the level of granularity required by transparency mandates. Shashank also highlighted the importance of the Santa Clara principles in developing standards in this context.

Emma Llanso outlined the history of transparency mandates and provided an overview of various approaches to transparency that are currently being adopted. She spoke of the different kinds of regulatory interventions and their goals – the Digital Services Act, for example, sets out different obligations for platforms, and requires that they provide information on the automated tools they use, and also on the considerations behind the design of their algorithms. Such information would provide insight into the content that gets promoted on various platforms, and how these assessments are made.

Emma pointed out that another core focus area for transparency regulation is on digital advertising, particularly on how targeted advertising works online. Another avenue of reporting targets users, and requires platforms to provide notices for content moderation, and policies and processes for content takedowns and appeals. Such measures, and others targeted at making websites more accessible, are aimed at helping users understand platform behaviour and empowering them to seek redressal. Emma also pointed out that another large bucket of regulation focuses on researcher access to data, and the use of audits to understand the internal processes driving algorithmic decisions. Measures that require platforms to share access to information with independent researchers are crucial in understanding the relationship between platforms and harms, and to identify areas for further intervention. In this context, regulations would need to find ways to provide necessary information to independent researchers while also maintaining privacy of users of platforms. Emma pointed out that it is currently difficult to assess what the consequences of such interventions would be, and that transparency regulations would need to be iterative and responsive to information that is provided.

Chris Sheehy stressed the importance of using a multi-stakeholder approach in transparency regulations. In part, existing efforts have been a response to previous multi-stakeholder collaborations. Chris highlighted the importance of the role of multi-stakeholder forums in checking transparency commitments of various platforms and also in auditing the frameworks of information and communication technology companies. In this context, he spoke of the Action Coalition on Meaningful Transparency (ACT), which is part of a global multi-stakeholder initiative led by civil society that aims to identify and build collaboration across various streams of digital transparency work, as well as to increase the diversity of perspectives considered in those efforts.

In response to a question about the role of government in the context of transparency requirements, panelists spoke of how more granular reporting (such as which category of law was violated, or clarity on when takedown requests have been made by governments) would provide more useful information. The importance of requiring governments to be transparent about takedown orders, and of including States in such obligations, was stressed as a way to make sure that transparency obligations are effective and centre users and their rights. Panelists pointed out that existing transparency requirements in this regard could be strengthened across Global North and South countries. The challenges of instituting such mechanisms in countries with a history of State censorship were also discussed, along with ways to balance speech and other considerations.

IGF 2022 WS #269 Data privacy gap: the Global South youth perspective

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

The covid-19 pandemic increased the leakage of personal information. Economic problems and violence, as examples from Brazil such as the Amazon region show, also affect data protection.

,

It’s urgent to have training programmes that are contextual and sensitive on data privacy and data protection.

Session Report

 

Data protection must be mandatory in school education, for the youth to understand the concept of protecting their privacy. Most African countries don't have data protection laws, or, where they exist, they are not enforced. They have schools on internet governance, but this is not sufficient, so they need to bring these programmes to the school level.

The issue of non-English languages was raised; people also don't understand how big tech companies are handling their data. The concept of peer-to-peer education was also touched on: recent research shows that when children and adolescents have problems on the internet, they learn from each other.

Games can help children better understand the implications of privacy, and they are a good approach to engaging children. When people download apps, they are not paying attention to privacy issues, and they cannot easily understand if they are safe or not.

In the global south specifically, the reality is that the European region has more protection, including compared with countries like Brazil. There is a commitment to work with Global South developers so that they provide applications and services through which people can communicate properly and safely.

 

IGF 2022 DC-SIG Role of Schools of IG in sustaining efforts to support SDGs

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

1. Recommendation that schools make multistakeholder contributions towards the GDC, so that the consensus voices of the young and active learners can be part of the considerations.

,

2. Capacity building efforts, including Schools of Internet Governance, must keep promoting gender balance, especially promoting STEM and ICT careers and education among the young without bringing in outdated notions about appropriate gender roles and gender limitations.

Calls to Action

1. Schools of Internet Governance, and other educational institutions in the digital age, cannot be created or succeed where there are Internet shutdowns. End Internet shutdowns for education’s sake.

,

2. Schools of Internet Governance should spend as much time teaching the critical thinking necessary for the digital age as they do teaching about existing technology and governance models.

Session Report

 

DC-SIG 2022 IGF session

 

This year's hybrid session combined the regular yearly DC-SIG review of schools, the yearly activities and planning for the next year, with an issue-based focus: schools and their contributions toward the achievement of the SDGs. The meeting agenda was divided into discrete sections.

  1. Self-introductions: at the start of the session there were 30 participants in the room and 15 joining remotely. While a specific count was not taken, the room had a good gender balance.
     
  2. Report from New Schools
    1. Zimbabwe IGF and Zimbabwe ISOC have been organising their SIG for 3 years now. First as a consultative moment, second as a remote session, and lastly as a physical school.
    2. ISOC Comoros organised its SIG in the year 2020, sponsored by PRIDA. They want to have a consultative moment for everyone to know their roles in the ecosystem.
    3. Côte d'Ivoire IGF organised their SIG in 2019 and 2020. They got funding from PRIDA but, because of Covid, did not organise any SIG this year. They need support for speakers and to organise the Western School on Internet Governance and Western IGF this year. It should be noted that, as we did not have translation services, interpretation was provided by a bilingual participant. The IGF would be well served if translation services were provided for all sessions.
       
  3. Details on new things existing Schools have done:
    1. AFRISIG: it is a leadership development school for people already playing a role in the ecosystem. What was done differently at this year's SIG was that they brought very experienced people from Cyber Security Authorities across Africa to hold discussions and produce a working paper on Cyber Security. This was presented in New York and the African Union also used it. The important thing about this effort is that the schools are learning places for negotiation processes.
    2. APSIG: they created a separate programme for people with disabilities.
    3. SouthSIG: the new thing is they did 2 months of online learning and then one-week hybrid learning. Then the last stage was a collaboration with a university and those who got to this stage got a fellowship and a diploma from the university in internet governance. Everything was done for free.
    4. EuroSSIG: this year’s practicum was the most relevant since it contributed to the Global Digital Compact discussion and it is on the website of the UN.  They made recommendations for other schools to make contributions towards the compact and make contributions towards other global discussions.
    5. GhanaSIG: the new initiative is using their fellows as expert speakers to make proposals for events at local and global events. 
    6. InSIG: they reserved seats for people with disability.
    7. RussiaSIG (Illona): they are now using their SIG to research internet fragmentation issues.
    8. Chad (tdSIG): they intend to close the gender gap in this year’s SIG.
    9. VSIG: they intend to bring in GDPR for citizens that match various countries’ needs.
    10. North America (Andre):  they intend to make the NASIG process multilingual in future events.
    11. Bangladesh-SIG: hands-on learning on the time-demand learning process.  They also intend to do the SIG in local languages.  They requested to be added to the DC-SIG since they have done 6 schools already.
       
  4. Discussions on SIGs and SDG:
    1. What has been done and what could be done:

While this section of the session was intended to allow discussion of SIG contributions in several areas, including SDG 5 on Gender, SDG 7 on Energy, and SDG 13 on Climate Change, the meeting only managed to have a single extended discussion on SIGs and SDG 5. Further sessions, yet to be scheduled, are planned for 2023 to cover the other SDGs.

Regarding SIGs and SDG 5, the following points were discussed.

  1. Increase gender inclusion: this was done by the Russia SIG. They keep getting more applications from women.
  2. AFRISIG: they have questions on what applicants think on gender equality and they include sensitive issues such as the LGBTQ in their application process.  They encouraged all schools to include the practicum session in their SIGs.
  3. EuroSIG: the vast majority of their applicants are female, with comparatively few male applicants.
  4. BrazilSIG was held, including a legal school dedicated to lawyers and judges, which has been a very good experience.
  5. RSIG: SIGs must keep promoting gender balance, especially promoting STEM and ICT careers and education among young students.
  6. ZimbabweSIG: they get more female applicants; however, female engagement is lower in actual participation. Anriette responded that AFRISIG does not have that experience; hence, ZimbabweSIG must look at what works for them, and also include people in the moderation process in order to make it inclusive. Design the school so that the fellows know that they have an expertise that will be acknowledged.
  7. Chad Youth IGF coordinator (Khouzefi): how can schools engage in discussions when they are being subjected to internet shutdowns?
  8. Joshua (GhanaSIG fellow), be deliberate and keep pushing female fellows to participate in the fellowship process.
  9. Liana (online): more than 90% are females in the Armenia SIG.
  10. Sarata (online): we are making conscious efforts to include females in the community. 
  11. RRSIG (online): according to Illona they select everyone with clear interest in their SIG.

Because the session was running out of time, there are plans for continuing the discussion at some point during 2023.

  1. AOB:  
    1. The chair encouraged all SIGs to come together and have a global faculty, and global fellows.  
    2. There are no schools in Japan and they would want to have one. 
    3. There will be a networking session for all fellows of all SIGS in next year’s IGF.
    4. Given the number of issues that those attending wished to discuss, consideration will be given by the DC SIG to a Day 0 event at IGF2023 and/or an intersessional event during 2023.
       
  2. At the end of the session, we had 45 participants in the room and 24 people remotely.
IGF 2022 WS #482 Internet Shutdowns: Diverse risks, challenges, and needs

Updated:
Connecting All People and Safeguarding Human Rights
Calls to Action

Need more support to activists in countries for longer-term coalition building, training and advocacy to prepare for and prevent shutdowns.

Session Report

 

Participatory research for internet shutdown advocacy

 

The OPTIMA project works with advocacy communities all around the world to predict, prevent, prepare and respond to internet shutdowns. What we’re going to present today is research we have done over the past 6-9 months regarding the assessment of needs from our partners in different countries, which feeds into an internet shutdown toolkit that you can find online.

As it can be read in the report, every context is different, politically, socially, and in terms of resources and capabilities. There’s also lots of different kinds of stakeholders involved in shutdown advocacy strategies (governments, ISPs, activists, technologists, journalists, industry). Normally when we do this kind of work we’re in crisis mode; shutdowns occur in times of crisis such as elections, political uprising, etc. If we can plan for something, such as in the case of elections, we will attempt to mobilize resources in advance, but this is very often not possible.

We wanted to understand different stakeholder’s perceived experience regarding shutdowns, and also what they think are their skills and the perceived gaps around doing this kind of work, in order to understand how to broaden coalitions, provide support mechanisms, and develop better campaigns.

We first did a survey (snowball sampling) and then followed up with a workshop in specific settings, talking to people about the results we had. Finally we held co-design workshops, which allow the people involved in these processes to take a greater part in designing the outcome and to tell us what they really needed.

 

Bangladesh (Miraj Chowdhury)

  • From 2012 to April 2022, we have seen at least 17 shutdowns. Just in this past month, we have seen at least 4 throttling events targeting political rallies. Targeting mobile networks and blocking Facebook are common practices, and internet censorship has been growing since 2018.
  • There is an election in 2024 and they are already anticipating some sort of censorship event for that time.
  • All of these events were mostly linked to political tensions and communal riots.
  • This outcome comes from talking to hundreds of different people from different organizations.
  • 88% of people said they have experienced internet shutdowns in the past three years.
  • Most said that the largest impact of shutdowns is in business and economy.
  • Shutdowns are often justified to contain disinformation, and what we found is that when there is a shutdown people are unable to get proper information. 
  • Whenever there is a shutdown there is no accountability because neither the telecom service provider nor the government issues any kind of statement justifying why the internet is being shut down.
  • Civil society doesn’t have the technical capacity nor the larger understanding of digital rights to respond to shutdowns properly. 
  • Most people do know how to use VPNs, but they're not responding to shutdowns in a way that creates advocacy at a national level.
  • We need to create digital rights capacity and broaden the communities engaged in these issues, as well as support technical skill-building. Even in remote areas, if there is a shutdown it might never be reported, and we need to document these cases and bring them into the discussion.
  • For a country like Bangladesh, internet shutdown advocacy will need to start from scratch.

 

Senegal

  • Long been seen as one of the most stable democracies in Africa, but backsliding under Macky Sall. High rate of internet penetration; high rate of mobile devices.
  • There was an incident on March 4, 2021, when, following a day of protests across the whole country, social media platforms were unavailable. The shutdown was not well documented because there were not enough people on the ground measuring the disruption, and not enough technical skills to document it.
  • We found that civil society is not well prepared on internet shutdown topics, which is why we don’t have good data regarding the 2021 event. When the internet is not working, people will just assume it is a technical issue and leave it at that. There aren’t the skills on the ground to prove if it’s technical or if it is a shutdown.
  • 64% of respondents believe that a shutdown is very unlikely within the next year.
  • 61% reported civil society capacity as low or nonexistent.
  • There is low general awareness of circumvention tools, a dire need of skill-building regarding network measurement, low levels of awareness amongst lawyers and judges for censorship and digital issues, and a need to develop strategies to engage the government and the private sector.

 

India (Chinmayi SK)

  • India continues to top the list of countries for most internet shutdowns carried out in one year (106 shutdowns in 2021 alone).
  • There are many different reasons why shutdowns happen in India (law enforcement, exams, etc).
  • Media freedom is decreasing, internet censorship is increasing.
  • There are certain laws that enable internet shutdowns to be used as tools in certain scenarios.
  • They wanted to build and plan based on how people are affected and what their needs are.
  • They needed to add interviews on top of the surveys because the surveys were targeting English-speaking people and they needed to add access for people who didn’t have that skill. 
  • They were able to involve people from 14 states in India, including students, researchers, journalists and activists. These were people who had experienced shutdowns, challenged shutdowns in some cases, these were smaller groups so the discussions could be free-flowing.
  • 76% of people had experienced at least one internet shutdown in the past three years.
  • Internet shutdowns disproportionately impact certain states and communities. Even within the same state, some people had very different experiences than others.
  • 60% are familiar with shutdowns, but do not understand how they occur technically or legally, how they were implemented or any of that.
  • Certain pockets of the country have certain capacities: in some places, people do have the capacity and the understanding to fight shutdowns, reporting a capacity of 33% in network measurement, which we can consider high. They also have the shutdown tracker, so we could consider the capacity for documenting to be good.
  • They have been able to engage in litigation and fight cases questioning the necessity and proportionality of shutdowns, and in some cases courts have delivered good judgements.
  • There is a lot of hesitancy regarding the usage of any sort of circumvention tool. “Are VPNs illegal? Is it risky?”
  • It is important to document the events even if we’re not able to fight back.

 

Tanzania (Rebecca)

 

  • Restrictions to posting on social media, restrictions to NGOs.
  • Even after the change in president, people are being very cautious.
  • The 2020 shutdown was the first of its kind in Tanzania and it means that now, the communities are thinking more about that as something that can happen again. 
  • It seems that media and activists have been figuratively taken out of “prison”, but the laws that put them there in the first place haven’t changed. 
  • Awareness about shutdowns is high, but knowledge is low. 71% of the respondents reported having experienced an internet shutdown, but 46% of them said that they can't tell, or aren't sure if they can tell, the difference between a shutdown and an internet connectivity problem or a technical issue.

 


 

(Miraj) In order to advocate against shutdowns, we need to document the impacts, because that is the only way in which we can develop the arguments and the evidence to fight the shutdowns. I think this is where we are lacking the most in Bangladesh. On the other hand, we need to engage businesses; sometimes businesses have a stronger voice than civil society, depending on the social and political context of the country. What kind of advocacy is needed to empower and engage businesses to also advocate against shutdowns from their perspective and their interests?

 

(Chinmayi) In the case of India, there is enough documentation to start conversations with, regarding the effects, the consequences, the problems caused by shutdowns. It’s now for us to have the government look at this documentation and really think about necessity and proportionality.

 

IGF 2022 DC3 Community networks as human rights enablers

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Community networks offer a significant example of the value of alternative and complementary approaches to expand connectivity, promoting digital commons and helping to fulfil the UN Sustainable Development Goals.

Calls to Action

Community Networks can be seen as a powerful ally in the fight against digital exclusion and a considerable engine for people-centred connectivity and the full enjoyment of all human rights, especially the so-called right to network infrastructure, consisting in the freedom to develop and organise digital infrastructures.

Session Report

The IGF 2022 session of the Dynamic Coalition on Community Connectivity focused on the rights aspect of community networks. Session participants provided their perspectives and best practices. Some panelists authored and co-authored the official DC3 outcome, a report titled "Community networks as Enablers for Human Rights".

The session started with the launch of the report, presented by Senka Hadzic. The report is a compilation of four different papers/chapters, one of which is a collective paper and a direct result of a collective effort by DC3 members.

The first speaker, Ronaldo Neves de Moura from the Brazilian regulator ANATEL, spoke about ANATEL's activities related to the promotion of community networks, both at the national level and when engaging with international bodies such as the ITU. He also highlighted the relevant synergy between ANATEL's activities and the work of DC3.

Raquel Renno from Article 19 pointed out that developing countries and marginalised communities are more affected by accelerated digitalisation.

Nicholas Echaniz's talk focused on the importance of participation in the digital space. The current landscape ignores meaningful connectivity and tends to create a "second class digital citizenship".

Sarbani Belur shared her experiences of addressing communities' needs (on location, coverage, online vs offline) when seeding community networks in India.

Glenn McKnight and Niels ten Oever focused on infrastructures: Glenn highlighted that electricity is an essential supporting infrastructure for connectivity, while Niels spoke about the way infrastructures are operated and controlled. He raised the question of private 5G networks being able to get local spectrum licenses, something that community networks spent years advocating for.

Karla Prudencio Ruiz from Rhizomatica in Mexico questioned the assumption that connectivity is always preferable to disconnection, which leads to a situation in which the means by which connectivity is achieved seem not to matter. People are often not aware of the risks and harms of going online.

Jane Coffin wrapped up the session pointing out the need to look at different regulatory models with respect to spectrum, licensing and financing CNs.

In his closing remarks, Luca Belli highlighted the need to be critical, and understand the limits of conventional models.

 

 

IGF 2022 DCNN Internet Openness Formula: Interoperability + Device + Net

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

Internet Openness is essential and instrumental to foster the enjoyment of Internet users' human rights, promoting competition and equality of opportunity, safeguarding the generative peer-to-peer nature of the Internet.

,

Internet Openness is a multifaceted concept, and the debate on Net Neutrality and non-discriminatory traffic management is only part of the broader openness debate. Net Neutrality is necessary but not sufficient to guarantee internet openness. Besides net neutrality, to guarantee Internet Openness it is essential to promote and preserve infrastructural interoperability and data interoperability, as well as device neutrality.

Calls to Action

Stakeholders should stop analysing internet openness threats based on an Internet layer approach and should instead work together to understand how internet openness is threatened and can be preserved via a systemic approach.

Session Report

The session was opened by the DCNN coordinator, Prof Luca Belli (FGV Law School), and the session co-moderator, Ms Smriti Parsheera (CyberBRICS Project). They stressed that over the past decade the DCNN coalition has been advocating for an open, secure, and non-discriminatory Internet, affordable and accessible to all people; promoting Network Neutrality, as this fundamental principle plays an instrumental role in preserving Internet Openness; fostering the enjoyment of Internet users' human rights; promoting competition and equality of opportunity; and safeguarding the generative peer-to-peer nature of the Internet.

Since its creation, this Coalition has explored the various dimensions of Net Neutrality and Internet Openness, acknowledging that Internet Openness is a multifaceted concept and that the debate on Net Neutrality and non-discriminatory traffic management is only part of the broader openness debate.

Over the past years, Internet Openness analyses have increasingly focused on interoperability and device neutrality, acknowledging that net neutrality is only one necessary yet not sufficient ingredient of a successful internet openness formula, which includes Interoperability + Device Neutrality + Net Neutrality.

Yet, net neutrality debates remain popular in policy circles, especially at the EU level, with the recent discussions regarding the introduction of a "fair share" proposal based on the "sender party network pays" model and its compatibility with net neutrality principles, but also in South Korea and Latin America, as discussed by participants. A large number of Internet Openness-related issues were explored by the speakers, who included:

  • Lina María Duque del Vecchio, Commissioner at the Colombian Communications Regulator, Comisión de Regulación de Comunicaciones, Latin America, Government, Latin America
  • Maarit Palovirta, Senior Director Regulatory Affairs of ETNO, Private Sector, Western Europe
  • Thomas Lohninger, Epicenter Works, Civil Society, Western Europe
  • Angela Daly, University of Dundee, Academia, Western Europe
  • Sabelo Mhlambi, Founder of Bhala, Private Sector, Africa
  • Kyung Sin (KS) Park, Director of Open Net Korea, Civil Society, Asia-Pacific

Stakeholders manifested diverging views on the so-called "fair share" proposal, stressing that it might undermine Internet Openness, as emphasised in the Open Letter addressed by DCNN members to EU Commissioners in October 2022: https://internetopenness.info/29-internet-experts-and-academics-send-a-letter-to-the-commission-urging-to-abandon-the-sending-party-network-pays-proposal/

Stakeholders also broadly agreed on the usefulness of the elements defined in the DCNN 2022 Outcome, the Open Statement on Internet Openness https://www.intgovforum.org/en/filedepot_download/92/23885

Namely, the Statement stresses the importance of:

1)     Network Neutrality is the principle according to which Internet traffic shall be treated without discrimination, restriction, or interference regardless of its sender, recipient, type or content, so that Internet users’ freedom is not restricted by favouring or disfavouring the transmission of specific Internet traffic. Exceptions to such principles shall be necessary and proportionate to achieve a legitimate aim.

2)     Interoperability is the ability to transfer and render useful data and other information across systems, applications, or components (horizontal interoperability) and for third parties to build upon a certain technology (vertical interoperability). The combination of transmission and analysis involves several layers of interconnection, requiring the achievement of various levels of interoperability. At a minimum, one should distinguish between the lower (network) and the upper (application) layers, pointing to a division between infrastructural interoperability and data interoperability.

3)     Device neutrality is the property ensuring users’ right to non-discrimination in the services and apps they use, regardless of platform control by hardware companies. That means users can choose the applications they prefer to use, regardless of the brand of device they are using. In other words, device neutrality is instrumental to achieving the ability to run any application, so that users can access and share all applications, content, and services, as long as they are deemed legal in a given jurisdiction, which is essential to achieving an open Internet.

Lastly, participants stressed that stakeholders should stop analysing internet openness by focusing merely on the Internet access layer and should instead work together to understand how internet openness is threatened and can be preserved via a systemic approach.

IGF 2022 Open Forum #38 Data as new gold: how to avoid ‘gold rush’ and create value for all

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

1. Enforcement of data policy regulation should be adequately addressed and ensured. Enforcement is crucial, but a common understanding and a harmonised approach across the globe are important too. 2. Civil society organisations play a very important role in making sure human rights are defended. Multistakeholder engagement throughout the process is crucial. A human-centric approach and human rights should be embedded in legislation.

Calls to Action

Participants of the session called for models that benefit society, starting from the principle that the most vulnerable should be protected. If they are protected, the whole society is too.

Session Report

The Open Forum brought together six leading experts from the EU and Africa representing the private sector, academia/think tanks, civil society and the public sector/government.

The speakers of the session - Marek Havrda, PhD (Deputy Minister for European Affairs, CZ Presidency), Bridget Andere (Access Now, Africa Policy Analyst), Alberto Di Felice (Director for Infrastructure, Privacy and Security, DIGITALEUROPE), Maria Rosaria Coduti (DG CNECT, Policy Officer for Data Policy and Innovation), Chloe Teevan (Head of Digital Economy and Governance, ECDPM), and Johannes Wander (Policy Advisor on digital development & innovation at GIZ to AUC) - shared their views on the forum topic.

 

Human-centric approach: the role of governments and how to strike the right balance between making more data available for reuse and guaranteeing privacy and data protection

 

Dr. Marek HAVRDA reported that the CZ Presidency presented a Joint Policy Statement, 'Human-centric approach at the core of standardisation and connectivity', at the ITU Plenipotentiary Conference (Sept. 2022) on behalf of the EU27, signed by 57 countries. He stated that in the modern world we are dealing with trade-offs: privacy is one criterion; understanding and being aware of the use of technology, and respect for individual autonomy, are other important criteria; and finally there is the overall impact on human well-being at large. There are much larger risks when combining different types of data, especially for privacy, and we need better checks and balances. On the digital divide: a human-centric approach should also help to bridge the digital divide. We need to know what data is out there and how it is used in the communities.

 

Mrs. Maria-Rosaria CODUTI expressed the opinion that data protection and data access, sharing and use should not be treated as contradicting elements; they are two sides of the same coin. The EC builds data protection into legislation when drafting, complementing already existing legislation. You can't just protect data and impede the use of data. We create a trustful environment, a trustful ecosystem of data subjects, data users, etc.; the basis of our data governance model is a human-centric approach. We have put this into legislation, into the Data Act for example. This empowers users, e.g. of IoT objects: by interacting with and using IoT objects, individuals produce valuable data, and they need to have a say on that data. We need to invest in technologies that include privacy by default.

 

Digital transition and data economy: how to ensure no one is left behind?

Alberto di FELICE noted that companies in Europe, big or small, do not have monopolies in the market. These are all central points for the EC, whose data strategy has been centred around sovereignty and how data can protect the economy. In the EU we are in the middle of several proposals around data, such as the AI Act, which respects safety and fundamental rights; data sharing across sectors and players in the economy (Data Act); and vertical proposals such as the EU health data space. It is a complex environment because there are lots of proposals and also lots of regulations already in place (e.g. GDPR). 'Gold is great but it's also heavy': we need to be mindful of the amount of regulation, and of whether more of it can also facilitate data sharing. We're also in the middle of a global crisis (pandemic, war in Europe). One aspect underlying data discussions is connectivity. We're building partnerships worldwide, e.g. with the US: the TTC is strengthening joint initiatives built on the Global Gateway.

 

How to build a data governance model that benefits both the economy and the society? Regulation versus enforcement: margins for improvement.

 

Bridget ANDERE expressed the opinion that the human being is central: if we have to choose between society and the economy, we should always pick society. Importing laws and infrastructure from other places without looking at the consequences in one's own country is detrimental, so we need to build models that benefit society and ask what impact they will have on the end user. Human rights due diligence is very important: look at the people whose data will be collected and who is most at risk, and make sure they are protected from the start rather than fixing things in retrospect. We need to engage in public participation in these processes. Regulation versus enforcement is not just a problem in Africa. There are many regulatory frameworks that are excellent on paper, with policies and regulations that are supposed to accompany them, yet these laws often exist in a vacuum: rights protection is not absolute, opt-in/opt-out mechanisms are undermined, and national security is simply invoked as a justification. Limitations on laws are formulated very broadly and unclearly, which creates many gaps. On operability, we have principles, but we find ourselves with really good laws and bad implementation. We need mechanisms that allow people to complain about infringements of their rights.

 

Chloe TEEVAN argued that the GDPR, one of the most established European regulations, was based on multistakeholder consultation (including civil society), but it is also not without its faults. It has become a model around the world but is not necessarily adhered to in other contexts and countries. Ireland hosts many tech companies, which has an impact not only on EU data protection but also globally, as other countries have to go through the Irish Data Protection Commissioner. Where there have been improvements, it is because of civil society applying pressure. Data commissioners across the EU have been discussing how to improve enforcement and have also invited civil society to the table. Multistakeholderism is an important element, and if enforcement improves it is because civil society constantly holds government accountable. Active CSO participation is essential, which also means adapting to the context you are in and bringing those voices in. In certain contexts there are not even data commissioners in place to enforce such regulation, or they lack the resources and independence they need. There is also the question of whether big tech, even in the EU, takes government seriously, as it keeps pushing the limits; this is even more difficult for smaller African countries. These are just a few of the issues with enforcement, also when talking about the GDPR as a gold standard.

 

Key challenges at the regional level (focus on Africa): how to act efficiently to close digital gaps. The role of data in bridging the digital divide and ensuring strong data protection and inclusive economic growth, with the EU-AU Data Flagship and the Digital Global Gateway as case studies.

 

Johannes WANDER presented a perspective from his work with the African Union. Many countries in Africa are interested in more data localisation, which tends toward a locked-in approach whose benefits for the economy and society cannot be leveraged. The AU developed a Data Policy Framework and endorsed it this year, with EU support. The AUC has a leading role in formulating such policies at the continental level, and the EU should support this endeavour, also as part of the Global Gateway. Harmonisation is of course an issue: the AU has twice as many members as the EU, and we all know how many years it took the EU to reach consensus. Enforcement is also an issue. This is something to be worked on over the next three years; the plan is to align existing stakeholders to bring the framework alive. Solutions need to be found at the national and community levels. How do we ensure just data governance? How can the African continent manage and use data for itself without localising across more than 50 countries? Data may well be the new gold or oil, but the main question is how to make use of it. Some countries have legislation in place, and the second step should be enforcement to ensure economic growth.

 

 

Question and answer round.

 



Tony Blair Institute: how can we simplify protection evaluation to actually make cross-border data transfers possible under the GDPR?

Maria-Rosaria CODUTI answered by addressing the international provisions of the Data Governance Act between the EU and third countries. These provisions ensure that sensitive, publicly held non-personal data will not be subject to unlawful access. There is a regime based on intervening acts for non-personal data, which is expected to create a bottleneck; the provisions of the Data Governance Act have been extended to cloud service providers and their customers. These rules are similar to those following Schrems II. The aim is not to create data localisation but to encourage data sharing.

 

Chloe TEEVAN mentioned the South African example: it took a certain amount of time to develop the Protection of Personal Information Act (POPIA), and by that time the EU had moved to the GDPR and South Africa was not granted adequacy. So this is a really big question.

 



Question (government representative): knowing that data is gold, how do we manage our data if there is no standard international framework?

 

Bridget ANDERE: there is no single standard framework that works for everyone. So how do we ensure adequate data protection? Design for the person who is most at risk, and the resulting data protection framework will work more widely.

Johannes WANDER added that, in the case of the AU data framework, its principles can be interpreted at the national level so that countries can see what works for them.

 



Question (representative of the Ministry of Technology, Ethiopia): which values can be incorporated at the national level in African countries?

 

Johannes WANDER replied that in the EU legal interpretation focuses very much on the individual, whereas in some African countries the communal aspect tends to take higher priority than the individual. Health data, for example, is very private and should not be held to a communal standard. But societies vary greatly across the continent.

 

Dr. Marek HAVRDA added that there is a need for new methods to monitor enforcement, especially for privacy rules that differ between countries.

Key takeaways from the forum:
- Enforcement of data policy regulation should be adequately addressed and ensured. Enforcement is crucial, but common understanding and a harmonised approach across the globe are important too.
- Civil society organisations play a very important role in making sure human rights are defended.
- Multistakeholder engagement throughout the process is crucial.
- A human-centric approach and human rights should be embedded in legislation.
- There is a clear need for models that benefit society, starting from the principle that the most vulnerable should be protected; if they are, the whole of society is protected too.


IGF 2022 WS #497 Designing an AI ethical framework in the Global South

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

The AI ethical framework in the Global South relies on both hard and soft law. Countries like Brazil, Chile and China are closer to developing hard law, while in other regions, such as Africa and India, the soft-law approach predominates. In any case, AI is intensely connected with development and innovation, and regulations need to consider ethical guidelines, human rights, diversity and multistakeholderism.

Calls to Action

Government: be more transparent and inclusive, bringing the most vulnerable groups into the debate. Civil society: keep strengthening underrepresented voices and raising issues related to the impact of AI use and development on human rights.

Session Report

The moderators Cynthia and Alexandra started the panel by introducing the regulatory context of artificial intelligence. Then, they introduced the respective panelists, the dynamics and objectives of the workshop, which, in short, was intended to explore how the regulatory landscape of AI has been developed in the Global South.

 

The panel's initial question was asked by moderator Alexandra: “What steps have your State taken in creating a regulatory framework for AI? Are there any legislation, bills, policies and/or national strategies seeking to establish rules or recommendations for the development and use of artificial intelligence? If so, what are their main features?”

The first panelist to respond was Smriti Parsheera, from India, representing civil society. She is a Fellow with the CyberBRICS project at Fundação Getúlio Vargas.

Smriti responded that the focus of discussion in India has been issues of promotion, innovation and capacity building. She also mentioned liability and regulation, noting, however, that these have not been the main focus. She argued that the main processes are not legislative and binding, but soft law mechanisms. She also mentioned that government committees have been created that look at privacy and security aspects. She mentioned the 2018 “AI For All” document, which sets out principles for responsible artificial intelligence, and stressed that people are already beginning to talk about the need for risk-based regulation aligned with principles such as those enshrined by UNESCO. Despite the existence of discussions and proposals for regulation, she stated that the focus is still on compliance and self-regulation, with a long way to go before binding legislation emerges in India.

 

The next panelist was Wayne Wei Wang, from China, representing the technical community through the University of Hong Kong and Fundação Getúlio Vargas.

 

Wayne argued that China has a governance model for AI and data protection, mentioning that Oxford held a conference called “The Race to Regulate AI” in mid-2022, where three approaches to AI regulation were discussed. He pointed out that regulation is often not AI-specific and can be used together with data protection legislation. He argued that Chinese regulation encourages the large population to participate in the digital transformation, for example in 2015 with the Made in China 2025 plan and the Internet Plus initiative. In 2017, a formal document called the New Generation AI Development Plan defined the commercialization of AI as a market goal. In the last two years, China has established a government AI committee that, although centralized, allows multistakeholder participation. Wayne summarized that China regulates AI through hard-law mechanisms, such as trade and data protection legislation, and also through soft law, mentioning national incentives and strategies. He ended by mentioning that China has introduced specific legislation, “The Preventions of Internet Information System”.

 

Continuing, panelist Thiago Moraes, from Brazil, representing government through the Brazilian Data Protection Authority, replied that Brazil has a national AI strategy (2020) and a bill in progress, also mentioning the importance of OECD guidelines in this process.

 

He mentioned that the national strategy is based on the horizontal axes of legislation, regulation and ethical use; AI governance; and international aspects, in addition to six vertical axes with related themes. In the legislative field, he mentioned Bill 21/20, for which a Commission of Jurists with 18 specialists is preparing a substitute text based on debate through public hearings and a multisectoral approach, in order to reflect the socioeconomic reality of Brazil. In the field of supervision and governance, he argued that multiple authorities coordinated by a central authority is a possible model for Brazil.

 

The panel continued with Bobina Zulfa, from Uganda, representing civil Society through Pollicy.

 

The panelist pointed out that she would give an overview of what is being discussed in Africa as a continent, since unfortunately in Uganda there is still not much regulation on the subject. She mentioned that few countries in Africa have written about the subject and that progress in this field is still slow: only about six countries have national AI strategies, and only one, Mauritius, has AI legislation. Much of the regulation stems from data protection and soft-law discussions, such as the Malabo Convention of 2014 (which only thirteen countries have signed) and Resolution 473 of 2021, which aims to study the benefits and risks of AI and robotics. For now, attention is being paid to the principles being developed in other regions, in the hope that they will reach people on the African continent in a positive way. On the other hand, she mentioned that there is still a lot of opacity in these discussions, and more transparency and participation are needed.

 

The last panelist was Juan Carlos Lara G., from Chile, representing civil society through Derechos Digitales.

Juan Carlos pointed out that technology has been seen in his country as an opportunity for development and for participating in the global dialogue with countries at the forefront of this implementation process. Chile has a national artificial intelligence policy for the years 2021-2030 and an action plan to implement it, issued by the Ministry of Science, Technology, Knowledge and Innovation. The Chilean approach has been to assess the country's capacity and stage in the implementation of AI from an optimistic, economic viewpoint that says very little about the boundaries of the technology, focusing much more on assessing potential than on ethical and responsibility challenges. There is still an ethical gap in the discussion of risks, impacts, accountability and damages. However, recommendations are being made, and it is important to include new voices and participants in the debate and to deepen it in order to understand local needs.

Moderator Cynthia highlighted the use of hard law by some of the countries presented and soft law by others, as well as the need for inclusion, participation and ethical guidelines. Next, she asked Smriti and Thiago: “Are diversity and multistakeholderism taken into account in the regulatory debates, and how? (race, gender, territory, vulnerable groups, academia, civil society, private sector, public sector, specialists)”

Smriti responded that diversity and multisectoriality can be analyzed at various levels: the first is who has a seat at the table when the discussion takes place; the second is who participates in the debate with deliberative capacity; and the third is who produces knowledge in the process. She argued that India has a very diverse social context, which heightens concerns about non-discrimination and bias.

In this context, she stated that the government has involved the private sector and part of academia as multistakeholders in the discussion, and that Centers of Excellence have been implemented in technical institutes around the country, where startups, entrepreneurs and academia are called to dialogue about innovation. Furthermore, the National AI Portal is being developed, the result of government collaboration with the industrial sector, which aims to be an institutional repository of AI in the country. She also mentioned government committees, which include people from academia and industry. However, she concluded that the discussion is still not open to everyone who represents the diversity of academic perspectives, and that the participation of civil society is critical because it is being heard very little. It is therefore necessary to improve the transparency and participation of the process.

Moderator Cynthia emphasized that this gap is a point in common with Brazil, given the difficulty of including some groups in debates dominated by the strong presence of the private sector. She highlighted the relevance of the discussion to ensuring the participation of affected vulnerable groups who are not included and who need space for deliberation.

Thiago recalled the importance of multisectoriality in Brazil, as in the case of the “Comitê Gestor da Internet”, in which all participants must be taken into account, and as in the drafting processes of the “Marco Civil da Internet” and the General Data Protection Law. He highlighted the challenge of ensuring participation in a country as diverse as Brazil, where indigenous peoples are still heard very little despite being an extremely important part of the country. Thiago pointed out that there is an effort in Brazil on racial and gender diversity, but many challenges remain. Bill 21/20 is an interesting experience because it was proposed in 2020, the pandemic year, when the discussion could not be deepened, so the private sector took on a lot of prominence in the debate. Only in 2021 was the Commission of Jurists proposed, through which more voices were brought in.

Next, a question was asked by the on-site audience: in the Chinese case, how has inspection been carried out, considering the legislation on algorithmic transparency and recommendation systems? Representatives of civil society were also asked about experiences of participation and diversity.

Juan Carlos responded to the latter question that it is important to highlight public consultations that rely not only on individual responses and external experts but also on inviting people from civil society, who do not necessarily have technical knowledge, to contribute and create. On the other hand, he mentioned that participation started on digital platforms that lacked accessibility and other languages, including indigenous ones. Furthermore, 70% of the people who answered the consultation were men. So there is still a lack of processes to overcome inequalities.

Wayne, in turn, replied that in China the Cyberspace Administration of China adopts a routine of supervisory activities called the Clean Cyberspace Campaign. Other supervisory authorities, including those conducting joint enforcement activities and the Ministry of Industry and Information Technology, also adopt this type of campaign, examining applications in terms of data protection, security and so on. China has also released a guideline on algorithm registration systems.

One more question from the on-site audience was asked: are there specific examples of how the government has engaged target groups in discussions?

Bobina responded that she is not aware of specific legislation in Africa, but civil society groups and academia, together with bodies such as the African Commission on Human and Peoples' Rights, have tried to broaden the debate at initial levels.

Juan Carlos, for his part, responded that Chile has the example of its national cybersecurity policy for the years 2015-2021, in which some groups focused on these issues were heard. It was still a restricted initiative, but one led by the government. He added that such initiatives are not only up to the government but can also be promoted by civil society. This collaboration can take the form of training, petitions or invitations to participate in proceedings; it is fundamental to think of ways to collaborate and also to cultivate academic knowledge.

Smriti pointed out that technology policy must be transparent with civil society.

Next, a question was asked by the remote audience to panelist Thiago about transparency and how it works in terms of consultation and engagement.

Thiago replied that in Brazil there is an effort toward transparency, with challenges but also some examples of what might work: for instance, the Access to Information Law, which is about ten years old and can help with these transparency challenges. This legislation deals with the collection of, access to and requests for information from the government. He pointed out that it may conflict with data protection in some cases, but there is still optimism about how it functions. The central question is what degree of transparency is achieved, combined with the difficulty of technical capacity and financial resources, which are usually concentrated in the private sector, a difficulty that other countries may face as well.

One more question from the on-site audience was asked: this time Bobina was asked how the debate on facial recognition in public safety has been conducted and what the main concerns related to the topic are.

Bobina replied that facial recognition technology for public safety purposes has already been used on the African continent, in Zimbabwe, Uganda and elsewhere, and that such mechanisms have ended up serving as instruments of mass surveillance, with many researchers emphasizing the harm resulting from this use.

Juan Carlos said that the issue touches on the public interest and that in Latin America, regardless of any regulatory debate, facial recognition in public safety is incompatible with fundamental rights and is usually being challenged in the courts. He also argued that the systems are neither technically robust nor legally authorized, and that this has been happening in cities like Santiago and São Paulo.

The last question came from the on-site audience and asked what would need to be done and discussed in the future for the AI regulatory framework in the Global South.

Wayne replied that one of the biggest challenges, and one that has guided the Chinese discussion, is the paradox between transparency and commercial secrecy, which makes it difficult to assess accuracy while protecting trade secrets. He also mentioned the co-regulation model and the expansion of stakeholder participation. Finally, he raised the point of the “ownership” of data and algorithms, which have been commercialized in China.

Thiago replied that there are many steps to be taken and that it is necessary to think about intelligent regulation that can actually be applied. He mentioned that hard law often fails to keep pace with innovation, making it necessary to think of alternatives. He also pointed to partnerships between sectors that can foster dialogue and that have been promoted over the last five years, such as hackathons, innovation hubs and regulatory sandboxes. Each has specific characteristics and gives the public manager an opportunity to approach the regulated field. But there is still a need to think about designs for transparency, diversity and amplifying voices.

Moderator Cynthia concluded by pointing out that the use and development of artificial intelligence is usually accompanied by the rhetoric of innovation, yet it is necessary to talk about the risks and impacts on fundamental rights. It is a challenge, and the panel ended by raising several questions and reflections. Finally, the use of facial recognition in public safety was mentioned, considered a discriminatory and insecure measure that harms vulnerable groups in Brazil.

The panel ended with a reflection on the balance between the protection of rights and innovation. Moderator Cynthia reminded the audience that they can ask questions through the institutional contact of the Laboratory of Public Policy and Internet, and moderator Alexandra closed the panel.

GENDER INFORMATION - In the virtual audience, about 7 women were present, apart from the moderator and the rapporteur. In the on-site audience, two women were present, apart from the moderator. About 7 men were present in the virtual audience and about 8 men on site.

IGF 2022 WS #309 Access to remedies in safeguarding rights to privacy & data

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

More resources are required to inform public authorities of their responsibilities towards data protection and privacy rights of data subjects.

Capacity building efforts must focus on informing data subjects of their rights.
Session Report

Workshop #309:

Title: Access to remedies in safeguarding rights to privacy and data

List of panellists, chairs and moderators:

Panellists: Cynthia Chepkemoi (Data Protection Counsel (Advocate), Association of Privacy Lawyers in Africa, Kenya); Mosa Thekiso (Executive Head: International Legal & Regulatory Digital Services & Platforms and AI at Vodacom South Africa); Maureen Mwadigme (Senior Human Rights Officer: Kenya National Commission on Human Rights); Stella Alibateese (Director: National Personal Data Protection, Uganda)

Chair: Dr. Jonathan Andrew (Danish Institute for Human Rights)

Moderator: Cathrine Bloch Veiberg (Danish Institute for Human Rights)

Rapporteur: Line Gamrath Rasmussen

 

The session was moderated by Dr. Jonathan Andrew, representing the Danish Institute for Human Rights (DIHR), which is a national human rights institute (NHRI). The DIHR works closely with other national human rights institutions globally, a number of whom travelled to attend the IGF 2022. The DIHR continues to work on the theme of access to remedies, which is part of a broader project and initiative of the Action Coalition on Responsible Technology, an initiative funded by the Danish Foreign Ministry that brings together different stakeholders from civil society, nongovernmental organizations, public authorities, businesses, and other interested stakeholders who are participating in a yearlong program of events to strengthen the use of technologies responsibly on a global level.  The Action Coalition on Responsible Technology incorporates a work stream on policy coherence which is reviewing how regulations and different initiatives in relation to legislation are creating alignment in oversight, including in relation to access to remedies.

Substantive Report and Main Themes Raised:

Data Protection and Privacy Rights in Kenya

- The legal framework in Kenya of the Data Protection Act of 2019, and the Computer Misuse and Cybercrimes Act of 2018 provide the basis for regulating data collection, processing and retention. The Data Protection Act has provided Kenya with regulations that put in place the procedural laws on how the registration of data controllers and processors must be conducted. The Data Protection Act also provides for a complaint handling procedure and outlines how data subjects can file a complaint to the office of the Data Protection Commissioner.  Whilst there exists a process, the mechanisms to seek redress where there is a violation of privacy have taken time to evolve into viable means of remedy.

- Enforcing data protection law in Kenya has proven to be a painstaking process, and larger tech companies continue to be responsible for some of the infractions that occur. It remains the case that many citizens are not aware of the procedures that they need to follow, such that much remains to be done in terms of sensitisation and capacity building to ensure a citizen is aware of, and can actually follow, the legal procedures in place to seek redress.

- The Data Protection Act also establishes an intricate system of rights and obligations that operationalise the right to privacy. The data protection authority has a duty to receive and act on all complaints by individuals, and it can sometimes also investigate, on its own initiative, issues it has identified. The first stage of compliance involves the DPA conducting privacy audits so that organisations can review their level of compliance in terms of data governance, including whether they are actually registered as data controllers or data processors; those that have not registered are not yet in compliance.

- A major consideration in relation to finding remedies is the range of reporting mechanisms available with respect to violations of privacy. Most frequently, the first port of call for any institution receiving a complaint is to attempt to resolve the dispute in-house. In certain cases, a party may have an alternative dispute resolution (ADR) mechanism in place, outlined in its privacy and data protection policy: where there is a data breach, this mechanism can be used to attempt to resolve the violation or breach.

- A second port of call for a violation of privacy rights is the Office of the Data Protection Commissioner (the Kenyan DPA): this is the authority in Kenya tasked with setting the rules and regulations on how personal data is handled, processed and stored, and the authority to which all data controllers and data processors are required to report any data breach or data loss.

 

- The personal data of data subjects in Kenya has on occasion been shared with a third party without consent having been given; this amounts to a violation of the data subject's rights. In practice, once a complaint has been filed with the DPA, the office takes around 14 days to respond. The DPA then asks the party that is the subject of the complaint to respond and provide evidence, reflecting the importance of fair administrative procedure, whereby each party must be given an opportunity to defend itself. At this point it is frequently discovered that the data controller or processor actually had policies (also known as ‘agreements’) under which the data subject consented to the processing. Consent to wider processing is thus often very broad: data subjects simply haven't read the terms and conditions of the agreements, which can be extremely long and convoluted. A final avenue for redress is the courts.

- Public authorities, such as hospitals and schools (which process sensitive children's personal data or patient data), are often advised to have data-sharing agreements covering any transfers of personal data. These agreements can protect the organization from liability and from the risk of court proceedings or of complaints filed with the Office of the Data Protection Commissioner.

 

Access to Remedy in Uganda: Role of the Personal Data Protection Office

- In Uganda the right to privacy is enshrined in Article 27 of the Constitution. In 2019, the Ugandan government enacted a comprehensive law, the Data Protection and Privacy Act, set up to further enhance the protection of personal data, and it introduced specific digital rights in Uganda. For example, the Act has an entire chapter on data subject rights, including the right to access your personal information, the right to erase it, the right to make corrections, the right to stop automated decision-making, and many others. Prior to the Act, Uganda had other laws that provided for privacy protection more generally. The law also establishes the Personal Data Protection Office, whose mandate includes resolving complaints from data subjects: if a person finds that her/his rights have been infringed by a data controller or data processor, the law gives her/him the right to make a complaint to that office.

- The Personal Data Protection Office in Uganda also provides guidance, particularly to data controllers, on the interpretation of the law in respect of compliance issues. The legal framework also gives the DPA powers to investigate, and it can prosecute where it finds non-compliance. Under the same laws, the Ugandan DPA is required to register all data controllers and data processors; currently the entire system is online (including payment and certificate issuance), and the office receives automated updates on complaints filed through it. The office activated the system around May 2022, and over 2,000 complaints have since been raised against various data controllers. Crucially, it is key to ensure that data subjects can access their rights under the Act. The Ugandan Act is very specific: it provides for data subjects' rights within the regulations and for the mechanisms through which they can raise complaints, with specific provisions requiring data controllers to respond within timelines ranging from 7 to 14 days.

- Under the guidance notes that the Uganda DPA issued for raising complaints, data subjects must first engage with the data controller or data processor before they come to the office of the DPA (this step of the process is also enabled through the online system). If a data subject has a complaint to raise, she/he can use the system to generate the letter to submit to the data controller (it is produced automatically by the system). This was put in place to ease the complaint-filing process, because it was known that many people may have difficulty writing letters.

- Ugandan data protection law also requires data controllers and processors to have in-house complaint resolution mechanisms. The Ugandan DPA provides training for data protection officers, who are the focal points of contact in these organizations, including training on how to deal with various complaints. Where necessary, the DPA's own mandate under the law allows it to investigate the complaint.

- In terms of the current legislation, given that the regulations were passed only in 2021, the country has not yet had any prosecutions brought under the new laws; however, the DPA does have a number of investigations currently underway.

 

A Business Perspective on Access to Remedy: Vodacom Group

- In its business activities Vodacom Group manages a number of privacy-related issues across the continent and across various countries. The issues the business deals with on a daily basis are broad, given its drive to take Africa as a region fully into digital and financial inclusion: these topics are top of mind for Vodacom Group, which wishes to avoid a scenario where Africa and African consumers are left behind in the digital economy.

- A lot of emerging and innovative technology requires a lot of data and data processing. With these data-rich technologies, a key balancing act is how to use them whilst also looking after the rights of consumers.

- Vodacom has undertaken a study on how the business actually achieves this balancing act in the current regulatory environments present across Africa. It is important to point out that remedies differ from jurisdiction to jurisdiction, which poses many challenges for Vodacom as a big business. Robust, relevant measures are in place at Vodacom, yet it is difficult to convey just how complex it is for a smaller entity manoeuvring through Africa from a business perspective to grapple with these different laws as they change from country to country.

- The main barriers that Vodacom has identified with regard to rolling out data-rich technologies are data localization laws. Vodacom observes that when it is dealing with big data or AI technology and leveraging technology provided by Cloud service providers, those providers tend to take a regional approach. As such, to use these technologies Vodacom has to think about where it will centralise the operation of the tech in question: for example, would it use the Amazon Cloud in Cape Town or perhaps another Cloud in Kenya? However, because Vodacom wishes to move its businesses forward throughout those jurisdictions at the same time, it tends to use one hub - and that means data is always moving across borders. Thus, the first critical issue is data localization. The second is that many countries across Africa have data protection laws, but others do not yet have them in place. Some of those countries nonetheless have a constitutional right to privacy, which a business such as Vodacom obviously has to take into account.

- Vodacom conducts its own studies to determine how it develops and responds to emerging factors relating to data protection and privacy laws. Taking into account the rights protected from a constitutional perspective and under data protection-specific laws, it has reviewed a number of best practices contained in policies and in digital agreements. It has reviewed the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data (Convention 108+). Vodacom also looked at the EU’s digital transformation strategy and its data policy framework.

- From a standards perspective, which forms much of the business’s focus, it also looked at Mauritius, a good example of a robust data protection act that takes care of the rights of data subjects, and a signatory to the relevant conventions. From a bilateral perspective, the business also reviews preferential trade agreements: Singapore, for example, has robust bilateral agreements with Australia and with New Zealand. Vodacom also undertook to examine the African Continental Free Trade Area (AfCFTA). This approach reflects a broad effort by the business to evaluate policies and develop best practice: the recommendations the company makes are thus what it understands it needs in order to protect data subjects and respect the laws.

- Taking into account the technologies, it is important to look at how to take a regional approach. Thus, the business also takes policies and looks at them from a regional perspective. Another option that can be considered is regional cooperation by trade agreements, where provisions can be made for rights, and also for regulatory reform.

- Vodacom Group takes a regional approach in cases where there aren't any data protection laws in place. It also encourages the ratification of international conventions such as Convention 108+. Further, it understands that a foundation is the right to privacy, which exists in most constitutions. In addition, Vodacom has specific business measures in place such as privacy by design (PbD) - part of its approach whenever it is dealing with any data technology, and essentially a constant in the current operating environment. Privacy impact assessments (PIAs) are also used, including internally for jurisdictions that don't have laws in place: this has been developed as internal best practice. Whenever the business works with any kind of data processing, it starts with a privacy impact assessment, with adaptation to each jurisdiction as required given the different laws.

- Vodacom Group, when given the opportunity to comment on policies or laws that are still in draft, provides input - for example, on a new bill in Tanzania. Vodacom aims to take a robust and balanced approach in its activities: this is key to protecting consumers' rights from a privacy perspective. On the AI side, the picture is a little broader: other constitutional rights are impacted, such as freedom of expression, equality and non-discrimination, and one also has to consider how to deal with biases in data. Vodacom is therefore constantly thinking about these rights, whilst at the same time trying to cater for digital and financial inclusion.

 

The Role of National Human Rights Institutions – the Kenya National Commission on Human Rights

- The Kenya National Commission on Human Rights (KNCHR) is an ‘A’ status national human rights institution according to the Paris Principles. The Commission has a clear mandate to speak on matters of digital rights. With regard to emerging digital technologies, it has become very clear that even seemingly neutral technologies can replicate pre-existing inequalities and contribute to marginalisation. Technology impacts human rights positively, yet may at the same time have a negative impact - this is where oversight institutions, such as the KNCHR, can function alongside other bodies such as data protection authorities.

- As a national human rights institution, the KNCHR is very keen to provide oversight of online spaces to ensure that the milestones met in the physical world are not lost in digital spaces. It notes that many human rights concerns arise in online spaces, yet unfortunately most are often not regarded as human rights issues. For example, the KNCHR conducted a case study in Kenya after receiving many complaints on matters of freedom of expression, where activists have been arrested and charged (frequently with offences under Kenya's Computer Misuse and Cybercrimes Act 2018). This proved particularly the case during the COVID-19 pandemic, when human rights defenders took to expressing themselves online rather than going on the streets, given the familiar limitations on public protest.

- Censoring and blocking are also key issues. For example, public institutions with a Twitter handle or Facebook page have unfortunately taken steps to avoid criticism by seeking to censor negative comments about their actions and activities. In certain cases these authorities have blocked people from receiving messages or interacting further on particular platforms.

- Surveillance is also a key concern, including government surveillance and surveillance by businesses. The targeting of consumer decisions and gaining of insights into activities through processing personal data, such as by FinTech companies, is considered a huge problem in Kenya. The Central Bank of Kenya has been spearheading regulation of this sector, so as to ensure a sensible approach with respect to FinTechs. There has also been progress with regard to oversight of government surveillance activities targeting civil and political rights in Kenya, including voting rights.

- Kenya has also experienced a number of massive data protection breaches. Prior to the elections in August 2022, a large number of Kenyans found themselves registered as members of a political party with the Office of the Registrar of Political Parties even though they had not in fact registered themselves. This instance reflects a very interesting finding: political parties will go the extra mile to obtain very specific information on individuals in order to meet the membership threshold required by the Office of the Registrar of Political Parties for registration as a political party.

- Another concern in Kenya is that the country is seeing quite a lot of movement in terms of compliance in the private space: recently the Office of the Data Protection Commissioner set out regulations and compliance procedures for private companies. For the government, however, the situation is quite different. There also needs to be awareness in government of the need to follow data protection laws: government is in effect the largest data controller. There still exists a misguided belief that the public sector cannot infringe personal data laws, and this must be challenged. Fundamentally, in Kenya, state departments, agencies and the government in general should lead by example and implement data privacy programmes within their organizations. On the issue of access to remedy itself, national human rights institutions (NHRIs) are very independent and trusted entities and can thus be engaged successfully. The KNCHR already receives a lot of complaints and feedback from communities and from users of particular technologies. Providing legal advice and holding public awareness forums therefore continue as NHRI activities that help citizens understand their rights, especially digital rights.

Conclusions: How should digital accessibility issues be tackled so as to safeguard the digital rights and access to remedies that the different stakeholders are all working to achieve?

Response from the Kenya National Commission on Human Rights (KNCHR)-

  NHRIs, such as the KNCHR, can work to ensure that vulnerable and marginalised groups are not adversely impacted when it comes to access to online services. Unfortunately, what is happening right now is that technologies are often marginalising the vulnerable even further. Thus, for an NHRI it is important to think about how it can work with ISPs and other companies to understand the needs of specific areas that have been mapped out. Secondly, it is important to ensure that services are equitably distributed across the population. However, the business angle must be taken into account, including whether companies will be able to recoup their investment when they go into more rural areas. A key question is therefore how best government incentives can be used to ensure that such companies can reasonably reach out to these offline areas while mitigating the higher costs of doing so. This issue requires a multi-sectoral approach: it cannot be dealt with by one sector alone. It is a challenge that requires mapping, monitoring and reporting - all parts must be performed so that vulnerable and marginalised groups actually benefit from increased network connectivity. Stakeholders in the digital sector need to work very closely together, as human rights are interdependent and the roles that different actors play in their respective capacities complement each other. Working in silos doesn't work; the respective stakeholders must come together in their different capacities to impact positively on the protection of the rights of users of the technology under development.

 

Response from the Ugandan Personal Data Protection Office-

  Digital connectivity and access are a valid concern. For its part, the Ugandan DPA is trying to address the issue by creating awareness in the local languages of the country (there are over 50 tribes speaking different dialects). At the Ugandan DPA office only 3 or 4 of the dialects are spoken: this presents a large challenge for its digital literacy programmes - clearly, it is important to communicate in the languages most people understand. As such, this remains a challenge the DPA is working out ways to address. First of all, it interprets the laws and then develops work that can create awareness of the laws amongst the population. Secondly, in terms of access, whatever technology is developed, the Ugandan DPA makes sure it supports communication through current and future devices. For example, the complaint system is one that clearly interfaces with the population, and the DPA has also enabled SMS and other technologies that allow an individual, even with a basic device, to reach the authority and communicate. The issue of engagement and connectivity is a journey, and governments need to continue these efforts until the gaps are bridged.

 

Response from Cynthia Chepkemoi (Advocate)-

  Digital literacy is a broad challenge. Working with different institutions, it is clear that in creating awareness and improving digital literacy among marginalised communities - especially women and children - the best approach is to work through associations: that is where many people and institutions can be reached. For example, in Kenya classes have been provided to train children in digital literacy and cybersecurity and the skills they need to stay safe online. Also important is identifying the specific groups that are most marginalised in the digital space. At times one of the major challenges is the infrastructure itself: in trying to roll out services to marginalised communities, it becomes clear that they lack the infrastructure, which makes it even more difficult to enhance digital literacy. Working with associations and civil society organizations therefore calls for a multi-stakeholder approach: a collaborative effort is required to actually attain the digital literacy levels we need to see among our communities.

 

Response from Vodacom Group-

  Vodacom has a very robust social contracting programme, and a big part of its function when rolling out various products and services - for example, its mom-and-baby app (essentially a healthcare product that tracks pre- and post-natal development) - is that connectivity is considered. For such services to actually reach the market, users need a smartphone: smartphone penetration is key, as is the relevant digital literacy. As part of its social contracting programme, as it rolls out products that cut across different sectors (e.g., healthcare, education), Vodacom partners with Cloud service providers in areas such as education. Vodacom continues to look at specific issues and identify new areas, and this goes hand in hand with educating consumers and users of those products on their rights, what the business does with their data and how the company secures their data. It is also important to inform them how they can hold the business accountable if they are not comfortable with how their data is being processed, or if they don't understand what the business does with it. It is important that consumers have at their disposal a resource or various channels to approach the company so they can learn and be informed.

 

- - -

 

IGF 2022 Open Forum #61 Future of the Internet: Realising a shared vision

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Over the past year, various events have shown the Internet’s strength, its resilience, and why we must protect it, all of which should inform actions taken by the Internet community in realising a shared future-looking multistakeholder vision, building on existing successes.

Calls to Action

Over the next year, we need to continue to translate positive principles for the future of the Internet into practice in the areas of extending connectivity, role-modelling to build capacity and empower national-level communities, strengthening global governance institutions and their leadership, and collaborating towards shared solutions to key challenges including Internet fragmentation and Internet consolidation.

Session Report

The UK’s Department for Digital, Culture, Media & Sport convened stakeholders from across the Internet community to discuss a positive future for the Internet, and how to translate this vision into reality.

The session took stock of initiatives since the last IGF which have contributed to a shared positive Future of the Internet, and which have made progress towards realising this vision. 

  • At the national level, the UK has articulated its vision for the future of the Internet in the refreshed 2022 Digital Strategy, and UNESCO has worked with a range of other countries to conduct measurement and assessment against the ROAM indicators.
  • In the multilateral space, we have seen almost 70 countries sign up to the Declaration for the Future of the Internet. 
  • Participants noted that over the course of this year, three major ITU conferences occurred, including a successful Plenipotentiary where a new Secretary General was elected. 
  • It was discussed that in the technical community, ICANN has continued work to close the language divide for 3000 languages, and within the IETF work has continued to analyse and address Internet consolidation. 
  • Here at the IGF, progress has also been made, with the intersessional Policy Network on Internet Fragmentation starting its work, and the appointment of a new Leadership Panel.

Across these developments, a number of themes emerged from the discussion in the session.

  • There was a sense of awe at the evolution of the Internet through adversity. Participants noted that hundreds of millions of new users have come online over the past two years, and that governments as well as the rest of the Internet community have been able to keep working through the pandemic. There was a renewed optimism in Internet governance as global meetings across the community such as IGF, ICANN, IETF have been re-energised by a move to a hybrid of virtual and physical participation. However, through this time of change, participants urged that we remain conscious of the strength as well as the frailty of the Internet, and take action to guard against two trends in particular: Internet fragmentation on the one hand, and Internet centralisation on the other.
  • Much of the discussion focused on leadership, and the need to set a positive forward-looking and ambitious vision for the Internet, rather than focusing solely on threats, challenges, and other negative aspects. Participants discussed the importance of reinforcing forums and their roles, while also noting the room for fine-tuning of leadership and governance such as the new Leadership Panel of the IGF. In the vein of leadership, the participants also spoke about bottom-up approaches, and the power that each of us have as role models to spread awareness and build capacity in our local communities, and in doing so root global discussions in local context. There was also consensus around the importance of leadership making processes more inclusive to participation from a wider set of stakeholders, as well as working to extend access to the Internet itself.
  • On a more practical note, the discussion touched on the ways in which stakeholders can translate their commitments and principles into impact. Participants noted the trend at the national level of updating technology regulations, as well as the need to enhance ICT infrastructure. Capacity building initiatives at local and national levels were also highlighted, as were measurement initiatives as a tool for targeting action, and tracking progress.

The session also looked ahead at the next year, and surveyed the critical moments to come together and shape the future of the Internet by translating principles into action.

  • In 2023, the Japanese government will host the IGF and the G7 presidency, and IETF116 will be held in Japan. Discussions on approaches to the Internet and cyberspace will also be hosted by UNESCO, and other parts of the UN system like the OEWG. 
  • The Global Digital Compact was highlighted as an important ongoing process which could serve to build consensus on a range of issues relevant to the future of the Internet. However, the process for developing the Compact would need to allow for meaningful contributions from the multistakeholder community, beyond an initial consultative round eliciting responses.
  • In the longer-term, the WSIS+20 review process was highlighted as a juncture at which the global community can come together and reaffirm our support for a multistakeholder Internet for the benefit of all, as well as setting ourselves new collective goals for the future, for example on development.
IGF 2022 Open Forum #56 Enhance International Cooperation on Data-Driven Digital Economy

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

All parties, including governments, civil societies, private sectors, etc., should cooperate to jointly build a community with a shared future in cyberspace. Efforts should be made to deepen digital exchanges and cooperation, expand economic and trade exchanges in the digital field, promote the interconnection of digital infrastructure, enable digital access for all, and jointly promote the sustainable development of global digital economy.


The establishment of international rules for digital governance based on consultation and consensus will enhance the development of digital economy. More engagement in establishing international rules for digital trade helps reduce trade barriers and facilitate the sound and orderly development of international trade.

Session Report

During the session, most speakers emphasized the necessity and urgency of international cooperation on digital economy, digital connectivity and data governance. They shared practices of different stakeholders in boosting digital economy while ensuring data security. It is suggested that governments, civil societies, private sectors and other stakeholders should unite and cooperate to jointly build a community with a shared future in cyberspace. Efforts should be made to deepen digital exchanges and cooperation, expand economic and trade exchanges in the digital field, promote the interconnection of digital infrastructure, enable digital access for all, and jointly promote the sustainable development of global digital economy.

Experts also focused on the establishment of international rules for digital governance. They maintained that the establishment of international rules for digital governance based on consultation and consensus would enhance the development of digital economy and that more engagement in establishing international rules for digital trade would reduce trade barriers and facilitate the sound and orderly development of international trade.

Some expressed that, in order for all to develop and benefit globally from cyberspace, international cooperation must take place; they suggested that the Initiative on China-Africa Jointly Building a Community with a Shared Future in Cyberspace should become a global initiative because of its comprehensiveness, clarity, transparency and global mutual benefits.

In addition, some advocated more engagement in the international governance of digital economy such as joining the G20 Digital Economy Development and Cooperation Initiative, the Global Development Initiative (GDI) and other international cooperation agreements.

Reflection on Gender Issues (Gender Report):

The number of participants in the open forum was more than one hundred, and the percentage of women and gender-diverse people attending the session is estimated at over one third. The session managed to engage with gender as a topic. The onsite moderator and keynote speaker Ms. Yik Chan Chin delivered a speech under the theme of “Building Gender-Inclusive Digital Ecosystems”. She analyzed the challenges and difficulties facing women in the digital technology ecosystem and shared her insights into enhancing gender inclusion in the digital world, which were echoed by participants at the session. She maintained that woman-centered design could enable digital innovation hubs to tailor their services to women’s needs and that collaboration should take place between digital ecosystem stakeholders.

IGF 2022 Open Forum #75 Combating Hate Speech – Answering the H & 5 W's Questions

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

A definition of hate speech is key to building a global understanding of the problem and how to address it (taking inspiration from the definition in the Council of Europe Recommendation CM/Rec(2022)16 on Combating Hate Speech). But such a definition would always need to be flexible to accommodate regional and national contexts. Implementation of legislation to combat hate speech is essential (within a human rights compliant framework) but difficult b

Calls to Action

National, regional and global platforms, like the IGF and others, are needed to prevent and combat hate speech: to help build a common understanding around a definition, build capacity and improve cross-border coordination against hate speech affecting so many people in vulnerable situations. Newly adopted standards now need to be implemented.

Session Report

Summary report: Combating Hate Speech – Answering the H & 5 W's Questions

The Open Forum was attended by approximately 50 people on site and 10 people online.

Following a short introduction to the Council of Europe Recommendation on Combating Hate Speech and the UN strategy on combating hate speech, participants split into three break-out groups. The on-site participants joined the in-person break-out group, which discussed the definition and legal framework for combating hate speech. The online participants split between two groups, on ‘How to monitor hate speech online in order to address it’ and ‘How to use education and counter speech to prevent and combat hate speech’.

Summary from break-out group ‘How to use education and counter speech to prevent and combat hate speech’

  • Key to prevention is making users and the general public aware of what is and is not hate speech, and helping them understand the risks hate speech poses and how best to respond in a given circumstance.
  • Education should provide a safe setting to explore how to balance different rights essential to a democratic society, such as freedom of expression, which comes with responsibilities and limits, against the impact of hate speech on others and their rights to non-discrimination and safety. A safe educational setting allows us to learn how to have critical debates and to deal with different opinions and conflicts constructively, with respect for everyone’s human rights.
  • It was observed that youth seem more aware of hate speech, reduce their use of it, and act against it, while the generation aged 50+ is becoming a key contributor to toxic environments online, reposting hate and disinformation. This raises the question of whether internet and media literacy should be strengthened among that generation.
  • Building on experience gained from the Italian No Hate Speech Movement, the key to successful education and outreach is cooperation between different stakeholders. Civil society actors, universities and the cultural sector work and learn together to apply human rights education methodology and to raise hate speech as an issue in the public debate from their different viewpoints. This approach promotes human rights speech in public spaces, in education, in media and online, and makes for coherent and consistent messaging.

Summary from break-out group ‘How to monitor hate speech online in order to address it’

  • It was reaffirmed that data gathering on hate speech is essential to devising effective and meaningful policies and practices.
  • There are many similarities between international standards regarding the requirements for monitoring and reporting on hate speech.
  • UN agencies are tracking data on hate speech.
  • Tech companies have a responsibility to collect and make available data on online hate speech. These data should be used for public policy and for internal quality control of company systems, which must be open to external monitoring. Cooperation between companies and other stakeholders should be encouraged; international regulation can help achieve this.
  • The Council of Europe Recommendation calls on its member states to take appropriate measures to ensure that law enforcement effectively records and monitors complaints concerning hate speech, and to set up an anonymized archive of complaints. That body of information should be disaggregated and made available to relevant stakeholders for research, policy and monitoring purposes. It would be very important for states to establish a data access framework.

Summary from on-site Break-out group on ‘Legal framework’

  • Participants shared examples of national legal frameworks in place to address (online) hate speech. For example, South Africa, Kenya and Ethiopia have systems in place similar to the German Network Enforcement Act, but with some interesting differences, e.g.:
    • In Ethiopia, hate speech and misinformation are defined as crimes. If an account with more than 5,000 followers disseminates hate speech, the punishment is aggravated: the wider the hate speech is spread, the greater the punishment.
    • In Kenya, an independent institute was established to monitor hate speech, including on grounds of ethnicity.
  • Other countries, like Sri Lanka, have no laws on hate speech, but other laws, such as those covering discrimination, are used.
  • Participants found it necessary to address hate speech with comprehensive laws; as Ethiopian participants illustrated, hate speech can precede genocide.
  • A globally agreed definition of hate speech is needed, but it should accommodate national contexts and realities.
  • In drafting a legal framework, implementation needs to be considered: law enforcement finds it difficult to collect evidence of hate speech and build a case for prosecution.
IGF 2022 DC-Jobs Responsible Internet Usage

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

Focus on Decentralization, Localization, and Governance: Technologies of the future will see a significant intersection of the Internet and ESG principles. The next generation of social media will be more decentralized, providing more empowerment and local governance. The federated model of the Internet creates a lower digital footprint by building an internet that is more scalable and conscious in its power consumption.

Calls to Action

Quantify the carbon footprint of digital activities, create usage labels like those of the food and aviation industries, and embed the standards in platforms and Internet protocols. For example, emails could carry a label showing their environmental impact in carbon emissions. Mobiles currently have systems to show screen time and to apply parental controls; a similar mechanism could be introduced for the carbon footprint of our online activity.

Session Report

Session: Responsible Internet Usage

Date: December 2nd 2022

Time: 10:45- 12:15 UTC+3

Theme: Enabling Safety, Security, and Accountability

At Banquet Hall A

Session Chair: Dr. Rajendra Pratap Gupta, Chairman- Dynamic Coalition on Internet & Jobs, Internet Governance Forum (IGF).

Rapporteur: Ms. Smriti Lohia

 

The session started with opening remarks from Dr. Rajendra Pratap Gupta, Chairman of the Dynamic Coalition on Internet & Jobs, Internet Governance Forum (IGF), who released the report on ‘Responsible Internet Usage’. He discussed how digitalization is becoming an integral part of our lives and how it impacts the environment, offering a striking perspective on the topic: over an average lifespan of 70 years, a human sleeping around eight hours a day spends about 24 years asleep, and at an estimated nine hours a day on the web, we spend almost 21 years of our lives on the internet, the majority of our waking time. Spending so much of our working lives on the net will certainly have an impact on our carbon footprint.

 

Mr. Gunjan Sinha, Executive Chairman of MetricStream and a tech pioneer, has been closely involved with the Internet since the early 90s, when it was still a research network. The Internet has become foundational to our lives, society, and nations globally. He talked about the intersection of the internet and ESG principles: the environmental dimension focuses on carbon, while the social dimension focuses on the digital divide. The environmental, social, and governance aspects of the Internet need to be considered very carefully at both the policy and individual levels. According to him, social media will be more decentralized in the next generation, and decentralizing the internet creates more empowerment and more local control. So we have to move toward more decentralization, more localization, and more governance at the local level, even though the internet is assumed to be a global network. As we move to a federated model, we will create a lower digital footprint.

We have to label our digital activity properly. Labeling standards must be used in the digital world; for example, a label at the bottom of an email could state its environmental impact. As we label, we create awareness, which leads to the right change of behaviour and, in turn, to a more ESG-centric internet, something that has been missing from the overall architecture of the Internet of the future. This requires governments to come together, and standards bodies as well, to create labeling standards.
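As a sketch of the kind of label discussed here, the snippet below estimates an email's footprint from an assumed emission factor. The figure of 20 grams of CO2 per MB, the function name, and the label format are purely illustrative placeholders, not measured values or an agreed standard:

```python
# Illustrative sketch of a digital-activity label, under an ASSUMED
# emission factor. The value below is a placeholder, not a measurement.
GRAMS_CO2_PER_MB = 20.0  # hypothetical grams of CO2 per MB transferred

def email_footprint_grams(size_mb, recipients=1):
    """Estimated CO2 in grams for sending one email to N recipients."""
    return size_mb * GRAMS_CO2_PER_MB * recipients

# A labeling standard might append a line like this to the message footer:
label = f"Estimated impact: {email_footprint_grams(1.0, 10):.0f} g CO2 (1 MB, 10 recipients)"
print(label)
```

Analogous to nutrition labels in the food industry, the point is that once an agreed factor exists, the label itself is trivial to compute and attach at the platform level.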

Dino Cataldo Dell’Accio, Chief Information Officer at the United Nations Joint Staff Pension Fund (UNJSPF), talked about the Fund’s Digital Identity project, a move towards digital transformation. The system is now based on new technologies such as biometrics and blockchain, raising the question of how to provide governing bodies and other stakeholders with assurance about the reliability of these technologies.

He emphasized that a lot needs to be done to ensure technology is used in an accountable manner, and to provide assurance about whether and how its use takes into consideration all its implications, whether environmental sustainability, social responsibility, energy consumption, and so forth. Governments, international organizations, the private sector, and professional associations should work together to create a set of standards that provide assurance of, and reliance on, the responsible use of these technologies.

Mr. Erik Solheim, Former Environment and Development Minister of Norway, and former executive director of the United Nations Environment Programme (UNEP), touched upon the IT industry’s responsibility for solving the global environmental crisis. The industry should aim to go net zero, sourcing its data centers from renewable energies and buying carbon credits for emissions that cannot be abated. It should use its enormous outreach on various social media platforms to facilitate a broad dialogue on how to solve environmental problems.

Dr. Pooran Chandra Pandey, a senior visiting fellow at The Institute for Democracy in Taipei, gave a brief on the scale and impact of the carbon emissions we create by doing very small things, like sending an email, that nonetheless matter a lot. The serious problem with any technology arises when we use it without really knowing the consequences it will have for health, climate, and society; we undertake many activities without knowing how they impact the environment. Knowledge is therefore important, because most of us do not know the impact of a 1 MB email and the data created in its life cycle. In aggregate, these individual activities, from sending emails and texts to playing games online, contribute more than 300 million tons of CO2 annually, a negative impact we are largely unaware of creating for the people around us and for the environment, which is an intrinsic part of life.

A few things need to be considered: we have to educate children about these issues from school onwards, and companies producing technology need to be more transparent in their sustainability reporting, accounting not only for the value they create by selling technology but also for how technology affects people’s wellbeing.

 

Osama El Hassan, a senior digital health expert on Smart Health at the Dubai Health Authority, talked about excessive use of the internet and its association with people’s health. The consequences of excessive internet use are both physical and psychological, and can damage the economy as well as affect cognitive abilities. The key health issues clearly associated with excessive internet use are high blood pressure and obesity; sitting for long periods, or focusing on internet devices for long stretches, especially among people addicted to gaming, will impact blood pressure. On the psychological side, anxiety is becoming more and more prevalent, especially among adolescents, who now have more difficulty interacting with the real world. This area needs much consideration and raises many governance issues.

We also need to have legal governance around misbehavior, bullying, or shaming. This could be at local, regional, and country levels.  We need a framework to ensure that these interactions will not have a psychological impact on users.

 

Smriti Lohia, co-author of the Responsible Internet Usage paper, talked about how the internet will soon become a basic human necessity and why it is now important to understand responsible usage of it. People need to start focusing on their digital carbon footprint. She emphasized that while using the internet we need to consider what is necessary and what is not, for what purposes we are using it, what platforms we are using, and what kind of content we are accessing. In short, we need to consider our responsibility towards the internet at the most minute levels.

The session ended with a vote of thanks from the chairman, Dr. Rajendra Pratap Gupta, and with a promise to come out with more reports on the labeling of our digital activities, to create mass awareness of the carbon impact of our digital footprint, and to work towards making internet usage responsible.

To learn more about our work,  visit: https://www.intgovforum.org/en/content/dynamic-coalition-on-internet-jo…

IGF 2022 Town Hall #82 Sustainable Automation as SDG-18?

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Bridge the skill gaps for future jobs through lifelong learning. Technology is here to stay, and it is important that lifelong learning and upskilling are made integral to our education systems.

,

Sustainable Automation should be considered by UN as SDG#18

Session Report

Sustainable Automation as SDG-18?

 

Key Takeaways:

  • Disruption in labour market due to automation
  • Identify the gap between current and futuristic skilling
  • Control of the Private sector over different areas/sectors of automation
  • Focus on job creation
  • Bridging the skills needed for future jobs
  • Life-long learning process to life-long skilling
  • Balanced use of automation

Calls to Action

  • Bridge between job opportunities and academics
  • Continuous Lifelong Learning to Continuous and Lifelong Skilling
  • Focus on Up-skilling, Re-skilling, and Futuristic skilling
  • Balanced use of Automation
  • Basic social safety and security for people
  • Creating ideas for capacity-building of futuristic jobs
  • Sustain-Able Automation as SDG 18 or included in SDG 9 – Sustainable industrialization

 

Session Report

Session: Sustainable Automation as SDG-18?

Hosted by Dynamic Coalition on Internet & Jobs, Internet Governance Forum & Digital Health Associates ( https://www.digitalassociates.health/ ). 

IGF 2022 Town Hall #82

Press Briefing Room

13:15 IST (1 Dec) -14:15 IST (1 Dec)

Rapporteur: Ms. Rahatul Jannah

Session Chair: Dr. Rajendra Pratap Gupta, Chairman- Dynamic Coalition on Internet & Jobs, Internet Governance Forum (IGF).

 

The session started with opening remarks from Dr. Rajendra Pratap Gupta, Chairman of the Dynamic Coalition on Internet & Jobs, Internet Governance Forum (IGF), who released the report ‘Sustain-Able Automation as #SDG 18’. Dr. Gupta remarked that technology should not just focus on productivity, profits, and proliferation; there is also a need to keep people at the core. He discussed whether we should use automation indiscriminately or discreetly, and briefed the audience on the Sustain-Able Automation report, which looks at automation in countries large and small and in sectors including agriculture, manufacturing, and retail services. He then moved to the panel to discuss the effect of automation on jobs and whether Sustain-Able Automation should be #SDG 18.

Dr. Rishi Mohan Bhatnagar, President, Aeris India, and co-author of the book, Enterprise IoT, which covers Internet of Things (IoT) project management frameworks, talked about the time when the industrial revolution started in the western world and all the people who spun the wheel in India lost their jobs. He mentioned that whenever there is a technological advancement, there will be a transformation and a disruption. In 1986, when the computerization of railways was initiated in India, people were completely against computerization. However, advancement in technology will not stop, and we will have to adapt ourselves. He pointed out that policymakers need to come up with ideas for skilling resources and creating capacity-building for futuristic jobs.

Asish Thakur, Executive Director, Glocal Pvt. Ltd., talked about automation in Nepal, mentioning that a lot of work is going on in automation across different sectors. He cited research conducted by UNICEF in 2019, which found that by 2030 more than half of students would leave school without the skills needed for future jobs; so we need to adapt to technological advancements and focus on skilling. He also added that many young people have now started taking up upskilling and reskilling programs and are getting into coding, designing, and other activities, which at times provide them with equal or even greater opportunities and resources. Therefore, there needs to be a bridge between academics and job opportunities. He concluded that people will need to learn futuristic skills to adapt to technological advancements.

Mr. Suresh Yadav, Deputy Head, Secretary-General’s Office, The Commonwealth Secretariat, quoted from the Sustain-Able Automation report that the total market value of Apple is much higher than the GDP of many countries combined, which puts into perspective how much control companies have over automation and how they define and decide its direction and areas. He mentioned the shift of power from government to the private sector and how the private sector controls important assets and determines changes in policies. Mr. Yadav also remarked that technology has the power to disrupt, and neither governments nor individuals can stop this disruption; hence there is a need to look for the best ways to adapt and prepare ourselves for those changes. He emphasized shifting from ‘continuous lifelong learning to continuous and lifelong skilling.’

Mr. Pooran Chandra Pandey, former member of the Board of Trustees, United Nations World Food Programme, and currently International Visiting Fellow, Taiwan Foundation for Democracy, Taiwan (R.O.C.), discussed how to balance technology so that it does not get on the wrong side of human centricity. He elaborated on how technology is evolving and creating disruption at a very large scale, throwing people out of jobs and bringing upheaval to the labour market, a disruption waiting to be accelerated by excessive technological application in offices. He remarked that it is the joint duty of government and the private sector to identify the areas, sectors, and people to skill, and to give people basic social safety and security. He concluded by saying that automation is not only going to displace people from several sectors but may also create social disorder; therefore, the downsides of automation need to be mitigated urgently.

 

Toward the end of the session, the majority of participants opined that it is time to consider ‘Sustainable Automation’ either as SDG-18 or under the existing SDG covering sustainable industrialization, i.e., SDG-9.

The session ended with a vote of thanks from the Chairman, Dr. Rajendra Pratap Gupta.

For more information on Sustainable Automation, please visit the webpage https://www.intgovforum.org/en/content/dynamic-coalition-on-internet-jo…

 

IGF 2022 Open Forum #46 Strengthening MS collaboration on DNS Abuse

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

The multistakeholder model includes a role for users; it is important to reflect this in due process and consider the impact on human rights, for example through notification of and engagement with the user when their domain name is implicated in a form of abuse. Recourse mechanisms also form part of due process when prior notification of action is not possible, for example due to severe, well-evidenced concerns of harm to the public.

,

There is growing regulatory pressure to address content-related abuses at the platform level, and increasingly through the DNS. Given the technical limitations of the DNS in targeted interventions, there is a need for a multistakeholder process to develop principles, criteria, and thresholds demarcating the limited set of situations where the DNS may be used to remediate specific types of content. However, it is important to note that the purpose is not to legitimize content restrictions through the DNS.

Calls to Action

Continued multistakeholder dialogue on when to act at the DNS level and on what types of abuse, and on strengthening due process, towards ensuring an open, accessible, and safe internet for all.

Session Report

The DNS Abuse Institute and Internet & Jurisdiction Policy Network Open Forum on Strengthening Multistakeholder Collaboration on DNS Abuse took place on Wednesday, November 30, 2022, from 06:30 to 07:30 UTC. It engaged a multistakeholder panel comprised of industry representatives (both generic Top Level Domain (gTLD) and country code Top Level Domain (ccTLD) registries), government, law enforcement, and civil society representatives on the question: “when is it appropriate to act at the level of the DNS to address online abuse?”

The session was structured around the following key pillars:

  • What does DNS abuse look like? 
  • What does acting at the DNS layer mean? 
  • When do you think it's appropriate to act through the DNS? 
  • What is the role of multistakeholderism in DNS abuse? 

This session focused on the role of the multistakeholder model in relation to when it is appropriate to address online abuses through the Domain Name System (DNS). Domain registries and registrars are part of a centralized system of Internet infrastructure that provides an addressing system for the Internet. 

The Internet Corporation for Assigned Names and Numbers (ICANN) is a multistakeholder organization where consensus policies are developed by the community including the contracted parties (e.g., Registries and Registrars). These contracted parties who operate generic Top Level Domains (gTLDs) can be impacted and ultimately bound by these consensus-driven policies. Country Code Top Level Domains (ccTLDs) also fit into the ICANN ecosystem but have their own systems of policy development which can also include multistakeholder engagement on a national or regional level. 

There are various definitions of ‘DNS Abuse’. Sometimes the term can be used as shorthand to indicate ‘action is appropriate at the DNS level’. However, the conversation benefits from a more granular discussion and the consideration of context for specific types of online abuse.

The DNS is a tool that allows users to connect to specific addresses on the Internet (often websites), but it is itself separate from the content on those websites, which is not within the control of a registry. Acting at the DNS layer often entails deleting or suspending the entire domain name, which can have far-reaching, often unintended, consequences for the registrant and website users. For a registry, remedying abuse often means working with registrars or other service providers.
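As a rough illustration of why DNS-level remedies are coarse, the toy resolver below (all names, addresses, and function names are hypothetical, and a real registry operates nothing like this) shows that suspending a domain removes the whole name from resolution, with no way to target a single abusive page:

```python
# Toy model of DNS-level action: the resolver maps a whole domain to an
# address and has no visibility into individual pages or content.
zone = {
    "example-blog.test": "192.0.2.10",  # hosts both lawful and abusive pages
}

def resolve(domain):
    """Return the address for a domain, or None if it does not resolve."""
    return zone.get(domain)

def suspend(domain):
    """DNS-level remedy: the *entire* name stops resolving."""
    zone.pop(domain, None)

# Before suspension, every page on the site is reachable by name...
assert resolve("example-blog.test") == "192.0.2.10"
suspend("example-blog.test")
# ...after suspension, nothing under the domain resolves: the DNS layer
# cannot distinguish an abusive path from the lawful rest of the site.
assert resolve("example-blog.test") is None
```

This is the technical root of the "collateral damage" concern discussed below: the only lever available at this layer is the whole name.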

For governments and law enforcement around the world, it can be challenging jurisdictionally to address harm occurring on the Internet, with its transnational nature, when the actors and intermediaries are distributed globally and laws are not aligned across the world. 

There are more actors involved in the ecosystem beyond registries and registrars, for example, hosting providers, the registrant themselves (the user of the domain name). 

On what DNS abuse looks like, panelists delved into what different operators consider to be appropriate to address at the level of the DNS and the wide distinction between gTLD registries and ccTLD registries. It was identified that ccTLD registries are often much closer to national laws and may be required to follow national procedures. 

When considering action at the DNS level, it is important to differentiate technical abuse from content abuse, assess the evidence, and consider principles: is the ‘tool’ available to that operator effective in mitigating the specific harm, precise, proportionate, and limited in its potential for collateral damage? If action is taken in error, it can often be reversed (e.g., a domain name can be restored), but the consequences of the error may not be reversible. Those consequences could come with an unacceptable impact on fundamental human rights, for example by causing a loss of connectivity for critical health or informational services. The potential to cause more harm than the initial issue detected should be taken seriously.

It is essential that law enforcement in particular have respect for due process as they investigate and report harm. It is also essential that operators of infrastructure are aware of the potential human rights impacts of potential actions. 

In addition, it was identified that ​​the role of the multistakeholder model is important but is better suited to some tasks than others. There is currently a movement within ICANN by the contracted parties to request changes to their contract. In particular the request is for focused and targeted amendments to take reasonable and appropriate action to mitigate or disrupt malicious registrations when reports are properly evidenced. 

There is also a role for the multistakeholder model within ICANN to undertake further, more detailed work on the topic of DNS Abuse. In addition, there is also a need to engage with actors outside the ICANN DNS Community ecosystem. 

The multistakeholder model includes a role for users; it is important to reflect this in due process and consider the impact on human rights, for example through notification of and engagement with the user that their domain name is implicated in a form of abuse. Recourse mechanisms also form part of due process when prior notification of action is not possible, for example due to severe, well-evidenced concerns of harm to the public.

There is growing regulatory pressure to address content-related abuses at the platform level (where the content ‘lives’), and increasingly through the DNS. Given the technical limitations of the DNS in targeted interventions, there is a need for further work through the multistakeholder process to develop principles, criteria, and thresholds demarcating the limited set of situations where the DNS may be used to remediate specific types of content. However, it is important to note that the purpose is not to legitimize content restrictions through the DNS, which is neither technically possible nor recommended.

IGF 2022 WS #422 Toward a Resilient Internet: Cyber Diplomacy 2.0

Updated:
Avoiding Internet Fragmentation
Session Report

The workshop, organized by the Brazilian Internet Steering Committee – CGI.br, focused on discussing recent cyber diplomacy developments and how using a set of tools could boost digital resilience. It was moderated by Rafael Evangelista and had the following speakers:

  • Alexandra Paulus, Civil Society, Western European and Others Group
  • Koichiro Komiyama, Technical Community, Asia, and the Pacific Group
  • Veni Markovsky, Technical Community, Eastern European Group

The moderator opened the workshop by explaining that cyberspace dynamics can lead to uncertainties and new geopolitical configurations that, in turn, demand a new posture from states. This new posture would include more encompassing dialogues, including with other stakeholders. Thus, the idea of discussing cyber diplomacy as a valuable tool for opening dialogue channels was raised. After the short contextualization, each speaker took the floor to expose initial thoughts on cyber diplomacy developments.

Koichiro Komiyama raised three points in his presentation: (1) the power of cyber diplomacy, (2) how cyber diplomacy has changed over the years, and (3) where the diplomatic game is going. He started with a concrete example of a political negotiation that spilled over beyond the initial negotiating countries: the 2015 US-China cyber agreement. According to him, this agreement reduced, even if temporarily, cyber incidents in Japan, thus showing the power of diplomacy to enhance cybersecurity. His second point differentiated cyber diplomacy 1.0 from 2.0. In the first type, the states' central focus was national security in cyberspace: military capacity mattered most, the discussions were framed as West versus East, and the key players were the USA, China, Russia, and a few other countries. In cyber diplomacy 2.0, nations look beyond security to data control, broadening the discussion to economy and trade; the main focus revolves around resource competition, Big Tech becomes (along with states) a main player, and population matters most, as human activity is the largest data source. His final point underlined that only two countries worldwide master both cyber diplomacy 1.0 and 2.0, with the required military capacity and large populations: China and India. Thus, states need to engage with these countries to keep the diplomatic game moving forward.

Alexandra Paulus started her presentation by discussing the policy instruments available to respond to cyber operations. She explained that the first necessary condition for responding to cyber operations is to conduct internal attribution, which includes legal, technical, and political aspects. She then listed five commonly used policy instruments to tackle cyber operations: information sharing, public attribution, diplomatic measures, criminal indictments, and sanctions. She further noted two less commonly used policy instruments: military and intelligence operations. From this context, she explained that cyber diplomacy has three main challenges: the dual-use problem, attribution, and determining political responsibility. In the face of these challenges, she stressed that cyber diplomacy should be seen as a long-term investment, playing the "long game." It should also account for the fact that politics will be politics, meaning that broader issues can enter the discussions, and that there is dissent about cyber diplomacy's practical application. In this context, she proposed a way to think about cyber diplomacy: cyber resilience. Building on the NIST (National Institute of Standards and Technology) concept of cyber resilience, she gave examples of what a cyber resilience posture would look like at the domestic and foreign policy levels. At the domestic level, such a posture could include the creation of data embassies, threat hunting, and regular security incident exercises. At the foreign policy level, it could encompass improving the resilience of transnational critical infrastructure, conducting cyber capacity building aimed at resilience in other states, and a set of international norms.
As a non-escalatory approach, she called attention to four advantages of taking cyber resilience diplomacy forward: (1) it is threat-actor agnostic, (2) it offers a more realistic view of the threat landscape, (3) it improves cybersecurity abroad and at home, and (4) it contributes to international peace and security.

Veni Markovsky explained that he would not speak on behalf of ICANN, but that his initial thoughts would concern the organization's activities. He explained that conversations about cyber diplomacy are coming to focus on the United Nations (UN) and happening in different groups organized by the UN General Assembly: a group of governmental experts (UNGGE), an open-ended working group (OEWG), and an ad hoc committee discussing a cybercrime convention. ICANN brings technical knowledge to the diplomats negotiating cybersecurity, helping them understand how the Internet works and the organization's role as a technical body ensuring that the DNS and addresses work all the time. He also highlighted that ICANN engages with the UN to report to the broader community what happens amid intergovernmental discussions that inevitably impact the Internet, through papers and publications in a variety of languages. In particular, Mr. Markovsky pointed out that ICANN has been working on more country-focused reports, already published for Russia, China, and the Netherlands. He then stressed the importance of tracking international fora discussing Internet and cyber issues to stay on top of things, find out what is happening worldwide, and interact with people; in this sense, he pointed to the importance of the IGF and the World Summit on the Information Society review (WSIS+20) that will happen in 2025. Closing his initial remarks, he reiterated the invitation for people to stay in touch and subscribe to ICANN papers, since the organization will report what is happening at the UN and the International Telecommunication Union (ITU).

Following the speakers' expositions, the moderator asked the panel a set of questions about public attribution of cyber attacks, sanctions, and the opportunities and way forward to better protect the Internet. On the first issue, the speakers recognized that public attribution is a complex matter, not only because there is a gap between the information held by states and by the private sector, but also because providing proof for attribution can run up against limits on sharing sensitive information. It was also raised that proper knowledge sharing in the negotiating room could avoid discussions drifting toward Internet fragmentation, and that public attribution could work as a trigger for more meaningful talks on state responsibility and factors like the due diligence of non-state actors.

On the second issue, speakers agreed that cyber sanctions have limited effects on changing actors' behavior and that more must be done to make this tool more valuable. Examples of sanctions were raised, and the idea of a "long game," with sanctions causing a potential chilling effect, was featured. The speakers also explained that sanctions should be thought of differently to have a more tangible impact. Economic sanctions in response to cyber operations were raised as a possibility, with the reminder that, if going down this path, the technical community should work harder to tackle digital economic aspects such as cryptocurrencies.

On the third issue, speakers stressed the need to provide expertise to governments to better prepare them for negotiating on cyber-related matters, especially since the negotiations take a multilateral rather than multistakeholder format. It was emphasized that people need to ensure the Internet keeps functioning as a single interoperable network worldwide, so that society retains the benefits, virtues, and opportunities it brings. It was also pointed out that issues related to the unconnected, and to the lack of infrastructure, hardware, software, communications, and other needed skills, fall within cyber diplomacy.

After the discussions, the floor was opened to the audience's questions. The Brazilian Hub posed queries on the Global South's participation and civil society's role in cyber diplomacy discussions. In this regard, the speakers explained that civil society has a role in engaging with government, whether by providing expertise or in elections. It could further work on norms, such as prohibiting states from attacking election critical infrastructure, and push governments to have more open dialogues, as civil society did in 2003 and 2005 in the WSIS process, helping to shape it into its current design.

On Global South participation, it was pointed out that policymakers and scholars have overlooked its involvement in cyber diplomacy fora and that, previously, the debate was constrained to a few actors, such as Europe, the Five Eyes, and others. In this sense, the creation of the OEWG was a considerable step for Global South participation, along with other programs that allow people to participate in such discussions, such as the Women for Cyber Program. Still, it was noted that more needs to be done, and policymakers need to overcome resource restrictions to better engage in these fora.

To conclude the workshop, the speakers gave their final remarks. They converged on the relevance of multistakeholder participation in cyber diplomacy and Internet discussions, while calling attention to the need to avoid underrepresentation.

IGF 2022 WS #235 Dialogue on the 'Declaration for the Future of the Internet'

Updated:
Connecting All People and Safeguarding Human Rights
Session Report

On April 28, 2022, the United States announced the Declaration for the Future of the Internet (DFI), a commitment signed by 61 like-minded nations to reclaim the promise of the early internet in the face of 21st-century challenges. The IGF session, Dialogue on the Declaration for the Future of the Internet, was held on December 1, 2022, and gathered experts from signatory and non-signatory countries to debate the policy questions highlighted above.

Milton Mueller from Georgia Tech's Internet Governance Project opened the session by asking panel members whether their host countries signed the declaration and, if not, to explain why they didn't.

The Cyber Ambassador for Germany, Regine Grienberger, opened by stating that EU member states had all signed the Declaration. Grienberger highlighted how EU stakeholders are reaffirming their stance on ongoing digital transformations through other initiatives such as the European Declaration on Digital Rights and Principles for the Digital Decade and Germany's engagement in the Freedom Online Coalition.

BRICS-block countries did not sign the Declaration, taking issue with the process and substance. According to Dhruva Jaishankar from Observer Research Foundation (ORF) America, India did not sign the DFI because its drafting process did not include sufficient consultation with Indian officials or an emphasis on national security. Jaishankar noted that while nations like India may pay lip service to multistakeholder principles, digital nationalism will be India's predominant form of internet governance going forward.
According to Anriette Esterhuysen from Civil Society, African Group, South Africa did not sign the DFI due to a customary position of not signing international agreements they did not negotiate. That said, South Africa has broad alignment with most DFI principles, except for multistakeholder governance. Esterhuysen added that the DFI sends a geopolitical signal of alignment between like-minded and democratic nations. The DFI language makes it difficult for many other states to align themselves with the document. Esterhuysen hopes civil society will use the DFI to hold countries accountable.
According to Louise Marie Hurel from Civil Society, Latin American and Caribbean Group (GRULAC), Brazil does not typically sign international agreements where they are not part of the conceptual and negotiation phase. Further, Hurel highlighted how Brazil's foreign policy stance could best be described as strategic ambiguity, where cooperation with different geopolitical blocks is made on an ad hoc basis. For example, Hurel noted that Brazil cooperates with the West on the international counter-ransomware initiative while maintaining a working relationship with other countries such as China and Russia. Hurel said the DFI achieves its objective of sending a government-to-government political signal but is less sure about its adherence to a multistakeholder process. For Hurel, the DFI is about creating trust with countries in the middle geopolitical ground and a stakeholder mapping effort for the US to gauge a willingness to support from its allies.

Assistant Secretary of Commerce for Communications and Information and National Telecommunications and Information Administration (NTIA) Administrator Alan Davidson noted that the DFI's intent was to signal a shared vision and renewed commitment around its seven core principles, given the rising trend of digital authoritarianism. Davidson noted the DFI was conceived as an intergovernmental declaration because it started as a contribution to the Summit for Democracy, where governments were initially approached. Davidson hopes the document can attract more countries to become signatories and allow the multistakeholder community to advance the DFI vision and hold nation-states accountable.
Milton Mueller asked whether the Declaration will mean the US will be more willing to distance itself from forms of digital sovereignty. Mueller also emphasized the difficulty of navigating the tension between creating an exclusive geopolitical block of like-minded nations while allowing countries that cannot agree with the DFI principles the possibility of signing on and adhering to the document in the future.
Finally, the panel decided to bring DFI principles into future IGFs to gain more support and discussion of those principles. The Q&A portion of the session included the following interactions:

  • A question from an unnamed UK government representative asked what could be practically done with the DFI.
    • Esterhuysen answered that the DFI principles could be used to engage in public, open review processes of a country's recent online legislation.
  • Moira Whelan from the National Democratic Institute (NDI) in Washington, DC, emphasized civil society's engagement in developing the DFI. Whelan also noted that government colleagues were reluctant to provide civil society with opportunities to contribute meaningfully. Whelan asked the panelists for a detailed explanation of the mechanisms for civil society to participate.
  • An audience member noted that since DFI was a project early in the Biden administration, it suffered from an "objective creep" problem. He suggested that UN member states declare their aspirations, intent, vision, and commitments to the future of an open internet and engage in that process through the General Assembly or the Secretary-General.
  • Yik Chan Chin, from the Oxford Global Society and Beijing Normal University, noted that the DFI was a geopolitically driven initiative and questioned the intent behind the Declaration.
  • Matthew McNaughton from Kingston, Jamaica's SlashRoots Foundation indicated that a declaration of values by like-minded actors would not necessarily be the best vehicle for achieving an open, unfragmented internet. He noted the DFI might have the opposite effect of further highlighting distinct divisions and separate visions.
  • Izaan Khan noted that the Shanghai Cooperation Organisation, of which India has been a member since 2017, has made digital sovereignty the foundation of internet governance and is promoting it as an international code of conduct in information security.

IGF 2022 WS #494 Cutting Ties: Citizens caught between conflict and tech

Updated:
Avoiding Internet Fragmentation
IGF 2022 WS #260 Protecting Shared Computation (Cloud Security)

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

Data is the lifeblood that guides the decisions of most organizations, but old ways of thinking about data protection are not fit for the era of digital transformation.

,

Cloud security and protection start with complete visibility into the security and compliance posture of every resource you deploy into the cloud.

Calls to Action

It's time to devise a strategic plan to protect data so that our spaces can reap the benefits of working in the cloud without increasing the risk of exposure.

Session Report

Organizations and the public face security concerns regarding cloud environments. Although many organizations have decided to move sensitive data and important applications to the cloud, concerns abound about how to protect them. The technical community therefore needs to help reduce the risk of data exposure, while the private sector and civil society need to play key roles in educating the public and raising awareness of security concerns.

Data sovereignty and residency controls have created major concerns around data control. With protection regulations such as the GDPR limiting where EU citizens' data can be sent, the use of data centers outside the approved areas could place organizations in a state of regulatory non-compliance. Other regions, such as Africa, the Americas, Asia, and Australia, adopt different jurisdictions and laws regarding access to data for law enforcement and national security, which can also affect the data privacy and security of nations.

IGF 2022 WS #471 Addressing children’s privacy and edtech apps

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

The use of edtech apps by children and adolescents generates different risks, especially with regard to privacy and the protection of their personal data. Large corporations that create and provide these services, some of which are free, can collect massive amounts of data and use it to send personalized advertising and behavioral modulation based on their vulnerabilities.

Calls to Action

It is necessary that governments put children's best interests at the center of the debate, including hearing their opinions and experiences. Governments must also pass legislation to protect children's data and monitor and penalize any violations of children's data, privacy, or rights. The tech industry bears the primary responsibility for child data protection.

Session Report

 

  • Millions of students have returned or will return to a new academic year, and they will largely use technology that was adopted during the pandemic. Just a few months ago, Human Rights Watch published a report, “‘How Dare They Peep into My Private Life?’: Children's Rights Violations by Governments that Endorsed Online Learning during the Covid-19 Pandemic”, which investigated the educational technologies endorsed by 49 governments worldwide. The investigation covered the majority of children who had access to the internet and devices.
  • Every government, except for one, authorized the use of at least one online learning product that surveilled children online, outside of school hours and deep into their private lives. This was the first time evidence was collected showing that the majority of online learning products harvested data on who children are, where they are, what they're doing, who their family and friends are, and what kinds of devices their families could afford for them to use.
  • The critical point of HRW's report is that the products did not allow students to decline to be tracked. The monitoring happened secretly, without the child's or family's knowledge or consent. Since the online apps were mandatory tools, it was impossible for kids to opt out of surveillance without opting out of school and giving up on learning.
  • This pairing of the edtech industry with the attention economy and the targeted advertising industry, as is clear from Han's research, has been promoting a clear violation of students' rights to privacy and to the protection of personal data. On top of that, it promotes children's behavioral manipulation to an extent that we are still unaware of, in terms of present, future, individual, and collective impacts.
  • Children are human beings going through a developmental stage. They need to be able to make mistakes and learn from them, as well as to experiment throughout this development in order to understand and mold their own personalities.
  • The need for children to experiment with their personalities is completely undermined by the attention economy and its profiling and aggregation techniques. In the end, what we see today is that the content that reaches children online and therefore influences their personality shaping is, to some extent, dictated by private and commercial interests. So besides behavioral manipulation, this aggregation and specific content targeting can also reinforce discrimination.
  • In order to face the problematic current scenario of the edtech industry, we need to understand that the protection of children's rights will only be achieved once it is shared among all of society. Much is often said about families being responsible for educating children to use digital devices and services. Families should of course support children in their use of edtech apps as much as possible, but that can't be all. How do states choose edtech tools to be adopted in public education? How do schools themselves choose the tools to be adopted in the private education sector?
  • We need to address the responsibility of the private sector, both the edtech companies themselves and other companies from other sectors that are buying student data from them.
  • When addressing the responsibility of States, schools and the private sector, we need to bring the concept of the best interest of the child to the table, as determined by the UN’s Convention on the Rights of the Child, the most ratified international treaty in the whole world. All actions that directly or potentially affect children must be undertaken in order to fulfill their best interests.
  • The first and foremost way to protect children online is to be aware of what data they provide and whether the apps they use are putting their data in unwanted hands. Check the company's reputation and reviews, take advice from parents and teachers, and check online if you're in doubt before using them. Perhaps teachers should be trained in schools to help students understand how to keep their data safe.
  • The other way is to ensure that best practices are enforced. Companies that offer solutions to children have to be mandated to collect only relevant data, and violations should carry severe consequences. This is where the IGF can play a role and convince governments to enforce these rules universally. Governments should come together and make laws that ensure that children stay safe online and that their data is protected. Technology is not going away, and children are increasingly going to use the internet and online apps for their educational needs and other social media requirements. We should work collectively to bring laws across national boundaries, encouraging organizations, government agencies and international institutions like the United Nations to mandate rules that will help protect us online and our privacy.

IGF 2022 WS #369 Harmonising online safety regulation

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

Legislators around the world are increasingly engaging with online safety questions, and implementing novel regulatory regimes aimed at enhancing online safety and addressing various online safety risks. In this context, more and more independent online safety regulators are emerging, whose job it is to implement and enforce novel online safety regulations.

,

To ensure people are protected online and that regulation is effective and consistent across borders, international collaboration amongst regulators is essential. While substantive rules may differ across the world, there is significant scope for alignment around regulatory toolboxes and for the sharing of best practices and expertise. The new Global Online Safety Regulators Network will serve as a crucial vehicle for collaboration.

Session Report

Digital technologies are at once both global and local, and as a result international regulatory collaboration has always been essential to keep people safe online. With the global regulatory landscape for online safety rapidly evolving and more and more countries devising and implementing novel regulatory approaches to improve online safety, international collaboration will become even more important.

In this session which took place on Day Two of IGF, four regulators who are at the forefront of the new drive for online safety came together to discuss the latest online safety regulatory trends, how greater international regulatory cooperation can improve outcomes for everyone, and the role that initiatives like the new Global Online Safety Regulators Network can play. 

The panel featured senior representatives from the eSafety Commissioner (Australia); the Online Safety Commission (Fiji); the Broadcasting Authority of Ireland (Ireland); and Ofcom (United Kingdom). It was moderated by Matthew Nguyen, Digital Governance Lead at the Tony Blair Institute for Global Change.

During the discussion panellists shared updates on how their respective jurisdictions are approaching contemporary online safety policy issues. The major theme was international cooperation, and extensive discussion was devoted to how regulators need to work together to enhance enforcement capabilities and ensure they are benefiting from best practice globally. The panel also discussed the Global Online Safety Regulators Network, that was recently launched by the four regulators and which aims to be a vehicle for greater regulatory alignment and cooperation amongst online safety regulators across the world.

Stakeholders, both in the room and online, were engaged throughout the discussion. Participants expressed support for greater coordination amongst regulators and many were especially interested in understanding the Network’s plans to invite additional regulators to join in the coming year. Stakeholders also opined on the relationship between government and independent regulators, and the panellists noted that regulatory independence was a key principle of the new global network.

Conversations in the room continued well after the end of the panel discussion, with stakeholders highlighting the challenges from regulatory fragmentation and the importance of involving civil society and broader internet governance stakeholders in regulators’ work.

The four regulators were grateful for the opportunity to present and discuss their work with a diverse range of stakeholders at the IGF.

IGF 2022 WS #350 Why Digital Transformation and AI Matter for Justice

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

Judicial operators play an important role as guardians of justice in the digital age, and need to have the latest knowledge on how technology can help them strengthen access and delivery of justice while being mindful of the associated human rights, democracy and rule of law related risks of technologies like artificial intelligence.

Calls to Action

UNESCO’s Judges Initiative empowers judges, lawyers and policymakers to better fulfil their responsibilities as the duty bearers to protect human rights and the rule of law. UNESCO supports judicial operators worldwide in this endeavour through knowledge sharing, open educational resources, and capacity building efforts.

Session Report

As uses of AI proliferate, whether through the use of surveillance tools or algorithms that amplify disinformation, judicial operators as duty bearers play an important role in protecting human rights and the rule of law. Through partnerships, capacity building efforts, open educational resources, and standards related to new technologies like AI, UNESCO supports judicial operators in creating open and accessible justice systems.

 

As judiciaries worldwide face a large backlog of cases, they are working to make administration of justice more efficient, timely and people friendly. For instance, a panelist underlined that in a West African country, the number of cases filed is more than four times the number of cases closed. However, technologies like AI are helping address this challenge as electronic law reports, AI-powered document review, e-registries, e-payments, and a range of digital transformation measures have led to a 200% increase in the number of cases resolved by the Court of Appeals between 2009 and 2019.

 

Faster case resolution is necessary, but there are significant challenges to the inclusive, sustainable, and transparent use of AI. These include a lack of transparency and trust in the judiciary, challenges with digitization and infrastructural capacity, and privacy concerns.

 

At the same time, the role of judicial operators is evolving. They need to be aware of the use of technology in the justice system, technological bias, and varying levels of digital literacy in order to improve access to justice. In the courts, the focus is on building the infrastructure to support digitization, but more could be done in hiring for digital skills.

 

A key capacity-building finding of UNESCO’s AI Needs Assessment Survey in Africa is the need to create localized data sets to inform AI. Currently, AI systems are often trained on low-quality and unrepresentative data sets and are then deployed in the African context.

 

The discussion emphasized the role of interoperability within judicial systems and with law enforcement. The judiciary, law enforcement, and administration should have interoperable systems. Currently, discussions about interoperability are conducted in isolation: those responsible for digital policy in government are not working with those responsible for digital transformation in the justice system.

IGF 2022 Open Forum #50 Global Conference on CCB: Cyber Resilience for Development

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

The Global Conference on Cyber Capacity Building (GC3B) is needed to bring multi-stakeholders together and mobilize effective, sustainable and inclusive stewardship of international cooperation for cyber resilience, bridging international development with international cyber capacity building.

,

A priority for cyber resilience is making sure there is sufficient support and sustainability. Opportunities and challenges of financing cyber resilience through different sources need to be tackled, in addition to ensuring that the resulting global public goods remain sustainable.

Calls to Action

Sessions under the Operationalizing Solutions pillar will be opened up to the global multi-stakeholder community in an Open Call for proposals for session leads, starting 15 December, at gc3b.org. Interested parties should submit a proposal for session leadership.

Session Report

The Global Forum on Cyber Expertise (GFCE) held an Open Forum (#50) on the ‘Global Conference on Cyber Capacity Building (GC3B) 2023: Cyber Resilience for Development’ on Wednesday 30th November 2022, during the Internet Governance Forum (IGF) 2022 in Addis Ababa. This open forum session presented the concept, aims and foreseen outcomes of the conference while highlighting the opportunities for global cooperation through the GC3B.

Tereza Horejsova, Outreach Manager of the GFCE Secretariat, opened the event by summarising that the GC3B will be a key global gathering of leaders and experts to mobilize effective, sustainable, and inclusive stewardship of international cooperation for cyber resilient development and cyber capacity building. Its overarching aim is to catalyze global action to elevate and mainstream cyber resilience and capacity building in the international development agenda. She explained that this Open Forum was convened to consult with the global multi-stakeholder community on the conference program.

Chris Painter, President of the GFCE Foundation, outlined the conference’s aims and objectives. He noted that much has been done to promote best practices in cyber capacity building, but insufficient awareness among key decision-makers and a lack of resources and coordination sometimes hinders implementation. This is why the GFCE partnered with the CyberPeace Institute, World Bank, and World Economic Forum to work together in convening the Global Conference on Cyber Capacity Building: to advance, operationalize and collaborate on cyber capacity building. He affirmed that the need for cyber capacity building as a key enabler of sustainable and resilient digital development will be highlighted, reflecting the key theme of the conference for 2023: ‘Cyber Resilience for Development.’ Lastly, he highlighted the two main objectives of the conference: elevate and mainstream cyber resilience and capacity building as a first-order, strategic and operational priority in international cooperation and development, and to support middle- and low-income countries in incorporating cybersecurity and cyber resilience into their national strategic plans, including their digital and infrastructure strategies and investments. These objectives will be achieved through several concrete outcomes.

Theoneste Ngiruwonsanga, Project Manager in charge of Cybersecurity & Data Privacy at SmartAfrica, reiterated the importance of cyber resilience for development and the need for the GC3B, highlighting that cyber resilience requires a deeper understanding of risks and communities at the regional level and that developing countries should build resilience into their critical functions. He explained that this is also because individuals will always strive to live and invest in countries that are resilient. Finally, he spotlighted the importance of the conference’s aim to be multi-stakeholder and inclusive, to mobilise effective, sustainable, and inclusive stewardship of international development and CCB, and the necessity of being open to input from the global community in order to catalyze global action.

Dr. Towela Nyirenda-Jere, Head of the Economic Integration Division at AUDA-NEPAD, highlighted that a key outcome of the conference is a Global CCB Agenda that can be linked to Regional CCB Agendas. She zoomed in on the Africa CCB Agenda by stating that it is currently in the process of being written and finalized with contributions from the community such as through the GFCE Africa Regional Meeting 2022 which took place in the margins of the IGF 2022. Moreover, this process is being led by the Africa CCB Coordination Committee who represent key institutions with various stakeholder interests in Information and Communications Technology (ICT) and Cybersecurity in Africa. The Agenda relies on a demand-driven approach for the coordination and implementation of cyber capacity-building programs and initiatives on the continent. She also affirmed that a whole-of-society and whole-of-government approach to cyber capability building is needed. The key themes identified to be addressed in the Agenda are: political willingness from governments; revision of legal framework on cybercrime and technical capacity building for CERT and DFLs; coordination at national, regional, and international levels; and cyber awareness and skills development. Lastly, she explained the next steps, which include finalizing the proposed Africa Agenda on CCB together with the Africa CCB Coordination Committee, and subsequently submitting it for endorsement by the African Union, after which it will be presented at the GC3B.

Francesca Bosco, Senior Advisor at the CyberPeace Institute, presented an overview of the GC3B program, highlighting its four pillars: Making International Development Cyber-Resilient; Collaborating to Secure the Digital Ecosystem; Cyber Capacity Building for Stability and Security; Operationalizing Solutions. All the pillars will involve sessions and discussions on sub-topics and the 4th pillar, Operationalizing Solutions, will be further divided into four tracks: Empowering Better Program Management for Cyber Capacity Building and Cyber Resilient Development (Track A), Implementing Successful Cyber Capacity Building and Cyber Resilient Development Actions (Track B), Using Global Public Goods for Cyber Capacity Building (Track C) and Coordinating at the Regional Level (Track D). It was explained that the conference program has purposely left space for up to 12 session slots for members of the community to propose and/or lead sessions, under the cross-cutting theme of “Operationalizing Solutions”.  An Open Call for proposals for session leads or session topics under this pillar will be launched on December 15th 2022. Prior to this, the conference co-organizers are looking for feedback from the global multistakeholder community on the topics included per track (A-C).

Following the presentation of the GC3B and its program, participants were invited to partake in an interactive discussion regarding which topics should be prioritised under each of the tracks of Pillar 4. For Track A (empowering better program management for cyber capacity building and cyber resilient development), bringing cyber expertise into development programs and upskilling/reskilling development staff on cyber issues was identified as the preferable priority. Diversity and inclusivity were also identified as principles the Conference should aim at promoting and representing. Secondly, under Track B (implementing successful cyber capacity building and cyber resilient development actions), participants proposed that the track prioritise the opportunities and challenges of financing cybersecurity and cyber resilience in developing countries through different sources. In this way, the conference can serve as a launching pad for assessing the way in which resources are used and involve additional donors. Lastly, under Track C (Use of Global Public Goods for Cyber Capacity Building), participants highlighted the importance of giving wider access to existing resources and ensuring global public goods are designed and used sustainably.

Participants were thanked for their contributions and are invited to visit the website at gc3b.org for more information, or get in touch with [email protected] for any questions or to provide any further input.

IGF 2022 DC-Gender: Who's Watching the Machines? New Tech, Gender, Race & Sexuality

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

The idea that technology is a great equalizer is flawed, as technologies often amplify inequalities and harm against gender and sexual minorities, people with disabilities, and people from other marginalised communities. Queer and disabled people cannot access healthcare services, as these are built on identification systems that they cannot access. Technologies are built by a homogenous group of people (white, cis men) using biased data.

,

Holding tech companies accountable is a big challenge, as they have strong lobbies with governments and collective public awareness of digital rights issues is limited. Geographic-location dynamics also inform these tech accountability processes; tech companies mainly respond to policies in the global north, as there are big costs to defying them. Further, how and why given tech policies are being implemented needs to be considered.

Calls to Action

The interest of the people, their lived experiences and local contexts, and their visions for the technologies need to be front and centre. Tech companies and developers ought to be intentional not just about privacy, safety, transparency, representation, and benefits to the community; they should also centre people’s choice and agency. We should be able to decide the extent to which we want to participate in and engage with technologies.

,

To think of technologies as merely practical solutions to problems, or a means to an end, is a myopic understanding of technology. Emerging technologies should centre community, care, pleasure, and fun. People’s joy and community building should be the motivations for technological innovation.

Session Report

 

Smita Vanniyar inaugurated the discussion by highlighting the importance of bringing forth the experiences and visions of women and queer people in building a resilient internet and stressed the need to de-binarise discussions on internet governance. With governments bringing in emerging tech, digital ID systems, and surveillance tech, we need to think about the contexts in which these are being pushed and who will be most affected by them. They pointed to how facial recognition technologies and CCTV cameras are pushed for the ‘protection’ of women and children; however, facial recognition technologies have an inherent colour bias. When such tech is used for safety and is already biased, we need to ask: who are we throwing under the bus? Research shows that non-binary, trans, and black people are significantly misidentified. We need to ask: what determines what kind of policies govern tech? Are these being pushed by civil society, or are there other forces at play? A lot of the time, we get policies from the perspective of research and development, and they are inevitably connected to state security and the military. Tech policies are often informed by global relations; for example, a lot of privacy conversations came from the EU, often ignoring local contexts.

Liz Orembo reflected on how algorithms contribute to the lack of representation of women in governance and politics today. She highlighted how algorithms and social media business models amplify attacks on women seeking political seats. What needs to be done is to discourage such biased and harmful business models, and to reimagine them in ways that enable other communities. Technologies are biased because they are built on data that ignores the realities of women and gender minorities, especially in the global south, as they have limited access to digital technologies. On the other hand, women and gender minorities in the global south are not using technologies because the technologies are not built for them. It is a continuous loop. Liz also spoke of the many barriers to holding technology companies accountable. Civil society members act out of passion, while companies are driven by the profit motive. Companies have strong lobbies with various governments, as they have the money and power to maintain them. Moreover, it is a challenge to mobilise the general public to hold these companies accountable, as understanding tech and policy is a slow and difficult process. When people do not understand how tech works, it becomes a challenge to ask for accountability.

Chenai Chair deliberated on the concerns around surveillance and tech, noting that in many parts of Africa, surveillance cameras are being increasingly deployed for military and state security. Who has the data that is collected by these technologies? How are these technologies being secured? What is the data used for? There are also serious concerns around data surveillance, and there is a need for greater transparency. How does my device know where I am and who is collecting such information? What steps are being taken to ensure privacy? For example, the mental health apps that we share our most intimate thoughts and information with are often quite irresponsible with our data and do not prioritise privacy of data. Chenai also elaborated on the biases that exist in Voice Technologies. Voice technologies respond to European and American accents but misidentify African languages and accents, and even voices of women. These voice technologies are being pushed as solutions that enable greater access to information and engagement with digital technologies, but these biases need to be addressed. There needs to be diversity in datasets that these solutions are built on. 

When speaking of tech accountability, Chenai highlighted that geo-location dynamics play an important role in determining tech platforms' responses to accountability demands. These companies respond to political representatives from the regions they come from. For example, platforms take the EU more seriously because they will be charged a hefty fine, whereas fines imposed by African nations are seen as a small cost; the forex rate works against African nations. Community standards that govern these platforms are also used to silence people from many communities. Mass reporting is used to silence feminists, the LGBTQIA+ community, and activists, who rely greatly on these social media platforms for their work. Apps like Telegram are used to mobilise such mass reporting.

Chenai also emphasised the need to centre fun and joy in our approach to technology and innovation. People first used the internet to date. When TikTok came, a lot of people thought it would not work because people did not have to sign up to see content. However, TikTok has reach across classes because it centred joy and entertainment for as many people as possible. A lot of innovations have been driven by the SDGs and similar agendas, but how do we ensure greater participation in and awareness of technology among people because there is joy and fun in the process?

Srinidhi Raghavan examined conversations on data, surveillance, and privacy at the intersection of technology and disability. In India, there is a strong push toward the systematization and creation of digital IDs for easier and streamlined access to health services. Some of the central questions that need to be addressed are: Where is this data being kept? How is the system built to allow people with diverse disabilities to access these services? Access to these services is often linked to the extremely contested biometric ID card in India, the Aadhaar card, which was built with non-disabled bodies in mind. People with cerebral palsy, retinal detachment, or other kinds of physical disabilities cannot access the Aadhaar system. Technology is seen as a solution to bridge the gaps in access, but often it works the other way around. Disability needs to be front and centre when we create these technologies and services.

Srinidhi also highlighted how many of the emerging technologies are actively harming people with disabilities. Technologies are being built to detect disabilities, and to look for how disability presents itself, but as we know, these systems of identification are often used to discriminate against people with disabilities. While some might say that these technologies allow for streamlining of access to healthcare, these same technologies are also used to deny disabled people, like those with psychosocial disabilities, access to medical insurance. We also need to think about these technologies from the perspective of privacy – who gets to decide when a person’s disability is disclosed and identified?

Srinidhi also noted that the question of AI increasing accessibility, spaces, and possibilities for persons with disabilities is a complicated one. As activists, we realise we do not have enough data on disability issues. But if data is collected without paying attention to privacy or transparency, it can be weaponised against the groups it is collected about. For example, a study in the US showed that a company was collecting user data on disability, and one of the things it ended up doing was deprioritizing disabled persons' applications in the hiring process. Data collected is often not in the best interest of the community. How do we imagine good use of data? We often do not know what data is being collected. There is a power imbalance at play: how do you challenge something when you do not even know what is being collected about you? We would like data to build inclusive systems, but it is used against us. Our relationship to data is a very complicated one. Moreover, conversations at the intersection of tech and disability are always about fixing disability, and it is important to think about where this comes from. We need to think about people, community, and care, not just means to an end. We need to think about the human aspects of engaging with technology: how can it nourish human life and relationships?

Sheena Magenya reminded us to be wary of narratives that present tech as a great equaliser. Tech is seen as a big solution to all our problems, especially in the African context; however, these technologies are devoid of accountability, equity, and representation. In the technology landscape, community members have limited agency and choice. As we saw during COVID-19, technological 'solutions' are often imposed on communities, with little room to refuse or reject them. Communities that are criminalised in certain contexts and nations are denied access to spaces when these technologies are constantly tracking them or fishing personal information out of them. If you are a trans person and your identity is criminalised in a certain region, you can only participate in certain spaces by lying about your identity.

Sheena also asserted that we need to shift focus from what is horrible to what we want more of. We need to remember that tech and innovation are not new to the human experience. We have always been developing tech as humans; only the speed is now really fast. We are always trying to do more and better. But this inherent interest in technology is manipulated by corporations and states. Where is the choice over the extent to which we want to participate? Who gets to take up space? Young women and queer people face an incredible amount of violence simply for existing, for saying things that are not in line with societal, patriarchal standards. They are truly at the battlefront. There is no support, solidarity, or recourse. It is important to recognise that young people are defending not just their own freedoms but everyone else's as well.

IGF 2022 Town Hall #63 Enabling a Safe Internet for Women and Girls

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

- Online gender-based violence creates a negative feedback loop because of its silencing effect. Because of how widespread the problem has become, it creates social norms that enable this behaviour to continue over time.
- This problem is prolific and spreads to institutions such as national elections and even the IGF itself. More must be done to ensure that online spaces are safe spaces.

Session Report

Online gender-based violence is prolific. It affects women, girls, and gender diverse people across different countries and social environments.

 

People who call this problem out make themselves more vulnerable to further abuse. Honourable Neema Lugangira spoke about her experience as an elected official in Tanzania, the exposure to online gender-based violence that position brought, and how raising the issue made her even more of a target. Another activist based in South Asia noted the self-silencing impact of this abuse on targeted individuals.

 

Women are helping themselves overcome this problem. Irene Mwendwa gave the example of Pollicy’s peer-to-peer communities and local-level training for resilience among women policymakers at multiple levels of government.

 

Norms against this kind of violence already exist. Clear analogies to the illegality of this behaviour in the streets should be translated into the online world, said Onica Makwakwa of the Global Digital Inclusion Partnership. Governments are falling short of their mandate to address this: a huge amount of public education against gender-based violence is required, and failure to address the issue costs governments over $1 billion in lost productivity due to the digital gender gap.

 

Platforms are taking action, but transparency is missing, said Kat Townsend. This comes from the experience of the Web Foundation and its Tech Policy Design Lab, working with activists and major social media platforms on this issue. While the platforms are making changes that users can see, there is still an underwhelming amount of transparency for us to understand what the potential impact of this change might be.

 

Resources:

Amplified Abuse, from Pollicy https://pollicy.org/projects/amplified-abuse/

Tech Policy Design Lab, from Web Foundation https://techlab.webfoundation.org/ogbv/overview

Meaningful Connectivity, from A4AI https://a4ai.org/meaningful-connectivity/

IGF 2022 WS #283 Capacity Building for Safe & Secure Cyberspace: Making It Real

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

Even though there are various factors involved in CCB and countries and regions have different needs and issues, stakeholders involved in CCB can explore best practices and solutions implemented in different regional settings, as they will often be transposable and adjustable to different contexts.


Cybersecurity is a shared responsibility between governments and other actors involved in this space, such as the private sector, technical communities and civil society. CCB should thus provide a multistakeholder response to challenges, e.g. through calling on the private sector to provide inputs on trends and threat overviews, or empowering civil society to take an active approach in CCB, such as through the potential role of academia in filling in identified knowledge gaps.

Calls to Action

Stakeholders are encouraged to make use of the GFCE ecosystem and tools, such as the Clearing House, that not only matches needs with resources, but also supports stakeholders through clarifying their needs and developing their CCB roadmap.

Session Report

IGF WS #283 “Capacity building for safe & secure cyberspace: making it real” looked at cyber capacity building (CCB) as a priority on the international cooperation agenda. The session discussed regional dynamics and challenges as well as the role and intersections of different actors in a multistakeholder approach to CCB, particularly in workforce development.

APNIC mentioned that capacity building efforts faced three main challenges in the Asia-Pacific region, exacerbated by the COVID pandemic: the growth in the number of users and networks put additional pressure on existing operators; multilingual diversity required continuous translation of manuals and documents, such as updated best practices, into different languages (needed for maintaining the level of engagement); and the increased reliance on internet access for livelihoods and the delivery of government services made the internet a much more critical resource for organisations and businesses in remote areas. A reliable, accessible, and affordable internet has become essential for securing stable growth.

The OAS (Organization of American States) – CICTE, as focal point for the GFCE Liaison, highlighted its focus on identifying gaps in CCB. The challenges identified in the region related not only to the gap between decision makers and the technical community, but also to the disconnect between decision-makers, who often come from an age bracket that does not necessarily identify with the information required to make cybersecurity-related decisions, and the cyber domain they are called to rule on. On workforce development, it was noted that there is insufficient attention to education as regards the digital skills gap, and an insufficient offering of university courses to reduce it. Further, on the labour market, recent graduates face obstacles or are unable to get cybersecurity jobs because of their lack of practical experience. Moreover, gender parity in the workforce dropped after the pandemic, which highlights the need for policies geared towards including women in digital and cybersecurity roles. CCB is a strong focus of the GFCE Liaison at OAS–CICTE. Mapping ongoing projects and efforts is an important step undertaken by the GFCE Liaison through analysing information on the Cybil Portal. The mapping so far indicates increased interest from donors and implementers in the region. The OAS, as GFCE hub, is in an ideal position to seize this opportunity by coordinating efforts and developing a regional roadmap for CCB implementation. Potential overlap between projects can thus be deconflicted by steering efforts towards different identified priorities.

The GFCE's Clearing House process was illustrated through examples from the significant number of requests stemming from the Global South, particularly from African countries. The match-making mechanism goes beyond connecting members and partners who have identified CCB needs with resources within the GFCE community, be it expertise or financial resources. The process is in many cases a first step in a country's CCB journey. The Clearing House process facilitates discussion of CCB needs in an expert community, supporting countries in identifying and prioritising their needs, which results in the development of national CCB roadmaps for the medium to long term. This exercise is essential to mobilise the resources and expertise available.

Cross-stakeholder engagement in CCB is vital, as each stakeholder group is called to represent different viewpoints and play specific roles. From the outset, panellists focused on cybersecurity as a shared responsibility between governments and other actors involved in this space, such as the private sector, technical communities, civil society and academia. CCB should thus provide a multistakeholder response to challenges, be it through calling on the private sector to provide inputs on trends and threat overviews with predictions across a longer timescale, or empowering civil society to take an active approach in CCB, such as through the potential role of academia in filling in identified knowledge gaps. It was proposed that the shared-responsibility mantra should in turn shape the concept of CCB, broadening it beyond the state view to address local industries and civil society actors that need support and could benefit from CCB.

Further on the role of the private sector, it was noted that the industry is not only made up of large companies; it is in fact mostly composed of small companies that operate domestically and implement solutions locally. Even though they do not have the same resources as a major industry player, they are still essential partners and potential beneficiaries of CCB.

Regarding civil society and academia, it was mentioned that investing in domestic research-based academic programs and engaging with academic communities at national level will provide a better understanding of the national context and can also help countries develop the knowledge required to build their national cyber capacity through a bottom-up approach.

The two stakeholder groups collaborate in meaningful and practical ways, for example through the private sector offering placements, fellowships, and internships, as certification programs often need to provide practical experience. 

It was concluded that a capacity building approach connecting industry and educational institutes ensures that there is no supply-demand mismatch in workforce development. Panellists underlined that workforce development strategies should be comprehensive and connect education, government and the private sector. This ensures that skilling content is industry-aligned and based on a common set of needs, and that all stakeholders speak the same language, be it when describing university courses offered or when drafting job descriptions. However, it was stressed that these strategies should be country-specific, as the need for cybersecurity personnel varies according to a country's industrialisation and digitalisation level; it is therefore important to promote career paths in a country-specific way.

As a concrete example of a cross-stakeholder, country-specific approach, Microsoft mentioned the implementation across 23 different countries of cybersecurity skilling campaigns, aiming to bring in traditionally excluded or less represented communities in the cybersecurity workforce, including women. By partnering with local governments, education institutions and local businesses the campaigns aimed to ensure that the programs developed fit the unique needs of their own context.

Panellists reiterated the importance of having a cross-stakeholder approach to cyber capacity building, understood in the regional context and implemented at national level.

IGF 2022 Open Forum #57 Digital skills for protection and participation online

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Children are actors of change. But in order to become active digital citizens who are able to safely navigate the online environment, access support online and provide support to their peers, they need to be equipped and empowered with the necessary digital skills and digital literacy education.


Online digital skills and safety education through engaging trainings is one way to empower women and girls everywhere and allow them to access opportunities and rights online. International partnership is essential to deliver comprehensive and relevant education. The drafting of national strategies should complement these efforts and focus on privacy, security, the inclusion of girls and women, and learning digital technologies.

Calls to Action

We need to involve children in drafting policies and have an understanding of what children are doing and what they need in order to feel safe online. Young people should be at the center of collaboration and this will help to improve cooperation and partnership on digital skills and online safety for all.

IGF 2022 Open Forum #97 Adopting Data Governance Framework: From Silos to Ecosystem

Updated:
Governing Data and Protecting Privacy
Key Takeaways:
Speakers and participants at the Open Forum reflected on issues of data governance vis-à-vis national settings and individual experiences. Cutting across all interventions was the message that trust is the bedrock of data governance, and of digital transformation in general – without that trust, it is impossible for frameworks or user experiences to be genuinely useful.
Calls to Action
Data governance needs to be considered with respect to its processes and institutionalization, which should be nimble and tackle innovations head-on to harness their opportunities and address challenges quickly.
IGF 2022 WS #214 Blurred lines between fact & fiction: Disinformation online

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

Disinformation circulating on the internet and in private groups can interfere in democratic processes. All panellists pointed out that investment should be made in media literacy skills among the population as a way of empowering people through critical analysis of the information they receive; that is, giving people the instruments that can help them distinguish disinformation and misinformation and make informed decisions.


Apart from impacting democratic processes, disinformation also affects the mental health of activists and young people through the emotional effects it provokes. It was also mentioned that we should be very careful with legislation, because of the threat of suppressing free speech and pluralism in these conversations.

Calls to Action

One call for action is to support media literacy education and raise awareness about safety online. Particularly for youth, we have to come up with different ways of raising awareness than the usual ones, because young people tend to ignore the side effects of disinformation on democratic societies and only touch upon the surface of the problem.


Support journalistic media as a way to avoid ceding ground to disinformation, meaning supporting good practices that follow the journalistic deontological code.

Session Report

Great insights and discussions took place at #IGF2022 on 29 November during the workshop on online disinformation.

Panellists agreed that the internet is the first source people turn to when they need information on a specific topic, and that it has provided unprecedented amounts of information to huge numbers of people worldwide. At the same time, however, false and decontextualized information has also been disseminated. The rise of digital platforms has enabled more direct access to content, and has thus, in a way, replaced mediated professional journalism and editorial decisions with algorithms that prioritize clickbait content in order to maximize engagement. Anyone with a social media account can create and spread disinformation: governments, companies, other interest groups, or individuals. Research has suggested that human users, not bots, are the main amplifiers of online propaganda. Consequently, online influence operations are extremely fuzzy, as they largely depend on the broadcast of data by many private actors to reach their target audience. On top of that, inaccurate or misleading content has potentially damaging impacts on core human rights and the functioning of democracy.

The following speakers participated in the panel:

Sérgio Gomes da Silva, Director of International Relations and Communications at the General Secretariat of the Council of Ministers Presidency, member of the Board of Cenjor (Protocol Centre for Professional Training for Journalists), member of the Executive Board of Obercom (Communication Observatory) and member of the National Electoral Commission.

Rodrigo Nejm, Awareness Director at Safernet Brasil, PhD in social psychology from the Federal University of Bahia (UFBA), and coordinator of the Brazilian Safer Internet Day since 2009.

Samuel Rodrigues de Oliveira, PhD candidate in State Theory and Constitutional Law at the Pontifical Catholic University of Rio de Janeiro. Master in Law and Innovation. Attorney at Law.

Marina Kopidaki, an 18-year-old medical student at the University of Crete. Five years ago she joined the Greek Safer Internet Youth Panel, and since then internet safety has been one of her main interests. At the international level, Marina has participated in various activities such as the BIK Youth Panel, the Internet Governance Forum (IGF) 2019, and the Youth Summit, among others.

During the workshop, Sérgio Gomes da Silva gave some examples to demonstrate how fake messages circulating on the internet and in private groups can interfere in democratic processes. Regarding the balance between fighting disinformation and the right to free speech, he pointed to support for journalistic media as a way to avoid ceding ground to disinformation, meaning support for good practices such as the journalistic deontological code.

Another aspect pointed out was the investment that should be made in media literacy skills among the population as a way of empowering people through critical analysis of the information they receive; that is, giving people the instruments that can help them distinguish disinformation and misinformation and make informed decisions.

Rodrigo Nejm stressed how WhatsApp can be the only channel for accessing information for part of the population in Brazil.

Media literacy is a key point, but in Brazil basic literacy is not ensured, which raises the question of how people can go further to media literacy.

Brazil has good national-level guidelines on media literacy, which is a great approach, but the big challenge is how to scale these guidelines to the whole population.

It was also pointed out how youth participation and influencers are raising awareness about hate speech.

Disinformation is having an impact on democratic processes and also on the mental health of activists and young people, given the emotional effects it provokes.

Samuel Rodrigues de Oliveira addressed how policymakers tackle the problem and presented the civil rights framework for the internet, which includes provisions regarding hate speech and disinformation online. Reference was also made to the resolution that tackles disinformation affecting the integrity of the electoral process; as an example, following this resolution, some YouTubers' accounts were taken down after this year's elections. Brazil is discussing legislation that addresses all these issues, and the hope is that it will soon be approved.

Marina, the youth representative, recognized that disinformation is not a hot topic among youth.

Nevertheless, she mentioned that disinformation undermines democratic principles, something that is not easy for young people to understand, although she recognized how important it is.

Youth are very vulnerable to disinformation, which can be everywhere from advertising to news, but it is not easy to understand and spot; that is why Marina advocates for the importance of safer internet education.

Marina also gave the example of the Greek education system, which since 2022 has been more flexible, with a pool of themes that can be taught. Teachers and students can now access handbooks and manuals about some online safety topics, aiming to develop digital citizenship. Although this is not mandatory, it is already a step ahead. Marina also underlined the importance of the Safer Internet Centres in tackling this issue.

IGF 2022 WS #505 DNS Abuse: Where are we and where do we want to be?

Updated:
Enabling Safety, Security and Accountability
IGF 2022 Town Hall #53 Social Justice during Rapid Datafication

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Understanding the impact of datafication on people's lives and crafting data governance standards need to be predicated on critically examining power dynamics.


How data justice affects human rights largely depends on who holds the power to set governance mechanisms or influence the use of data. To ensure equity, the state, tech groups, academe, and civil society actors must be cognisant of power relations and include diverse perspectives.

Calls to Action

Increase awareness of the social justice implications of newly introduced technologies.


Be careful in considering technological advancement as a necessity for the progress of humanity.

Session Report

Rapid developments in technology and the datafication of society are seen as inevitable by most members of society. According to research conducted by EngageMedia among Indonesian and Philippine respondents as part of the Alan Turing Institute's Advancing Data Justice Research and Practice (ADJRP) project, people seldom question the advancement of technology. This reliance on technology fuels endless data collection – every click, scroll, and tap on digital devices generates massive amounts of data that define people's identities and inform their participation in and access to key everyday activities, from financial transactions to communications. The vast amounts of data being collected do not always lead to better policies or actionable knowledge. Should data be collected for its own sake, or should the intended outcomes guide its collection? Again, a question that is not often asked.

With the datafication of society, humans exist as both physical and digital beings. When it comes to that digital being, how just is society when regulating this online person? The positive view of technology and datafication glosses over critical questions, particularly about the purposes of the collection, use, and processing of data, and who benefits in this process.

The research by EngageMedia and the Alan Turing Institute is an attempt to expand research on data justice and critically question how people understand the concept of data justice and the impact of datafication on their lives. Based on the insights shared by the speakers and panel members during the IGF session, data collection does not always translate to actionable knowledge that improves people’s living conditions, especially those for whom datafication tends to facilitate their exclusion and exploitation further. One example is the case of an Indonesian research respondent who was unable to have her ID card reflect her gender because the system did not allow for it, until a government official stepped in and offered to change it for her manually, after seeing and confirming the individual's new gender identity in person.

At the core of the issue is the need to question and examine power dynamics critically, particularly concerning fairness and equity in the way people are made visible and represented in the production and analysis of data. In discussions on defining data justice or setting data governance mechanisms, whose ideas take precedence? Too often, the voices of communities in the Global South are not adequately captured in these discussions. During the IGF session, participants stressed the need to ensure a diversity of views and understand people’s different lived experiences. Their experience of datafication – its harms and benefits – may be different from the experiences of those in the Global North, and to ensure inclusion and equity it is important to be mindful of whose voices are being heard in the conversation. The dignity of every individual should be at the centre of determining these values.

Based on the research, and echoed during the IGF session, the correlation between data justice and human rights is dependent on power. One example raised during the session was about an individual whose digital data, and by extension, his digital identity, was wiped out in armed conflict, presenting a host of complications for how he can access essential services and fully exercise his rights. In this case, the loss of data and his online personhood is not an individual failing, but the fault of the institution holding power. Such institutions are no longer limited to the government: non-state actors, such as technology developers, now have the power to collect, use, and store data, impacting people's lives worldwide. But the question on accountability and the obligation to protect this data presents no straightforward answer.

What is clear, however, is the need to centre human rights and social justice in discussions on data justice, and ensure the diversity of views and critical examination of power dynamics in such conversations. These will be crucial in informing and shaping the body of knowledge to guide policymakers, technology developers, and other relevant actors in embedding fairness and equity in data governance mechanisms.

 

 

IGF 2022 DC-IUI Strengthening digital ecosystems through shared principles

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

The ROAM-X indicators are a framework based on values and principles such as justice, inclusion and meaningful access for shaping global digital landscapes. The framework is a process for change and improvement that relies on peer-to-peer learning and a multistakeholder approach to develop more inclusive digital ecosystems.

,

The Dynamic Coalition on IUI serves as a shared space for various stakeholders and UNESCO to exchange good practice and lessons learned from the ROAM-X assessment process, jointly advancing evidence-based policymaking and orienting more countries to conduct their voluntary assessments for more sustainable digital environments.

Calls to Action

UNESCO calls for international stakeholders as well as the IGF community to join the Dynamic Coalition of IUIs to advance the implementation of the voluntary national assessment of the ROAM-X indicators in more countries and engage in consultations for the update of the indicators framework.

Session Report

The session was centered around redefining the strength of the current ROAM-X framework and further advocating for the update of the indicators to better adapt to current socio-political contexts. 

Global partners and experts discussed revising the number of indicators to simplify the framework, adapting it to fast-changing digital development and making digital innovation part of the solutions for climate change and green transformation. The experts also called for adding new relevant dimensions in the context of the UN Global Digital Compact.

Dorothy Gordon, Chair of IFAP, opened the session emphasizing the link between Internet Universality and the core Information For All Programme mission. Marielza Oliveira, Director for Partnerships and Operational Programme Monitoring, introduced the session, by presenting the support of the ADG, and stressed the overall applicability of the ROAM principles to the digital ecosystem. Simon Ellis then gave a brief description of the Internet Universality ROAM-X indicators’ methodology, with particular reference to the growing number of impacts from the programme as well as the multistakeholder ownership of reports.  

 Anja Gengo from the IGF Secretariat indicated that the ROAM-X assessment should bring forward horizontal measures owned by a multistakeholder coalition including appropriate NRIs. She was followed by Damilare Oyedele from IFLA Africa who placed libraries at the center of access to information. 

Fabio Senne of CETIC.br spoke about Brazil, the first country to implement the ROAM-X indicators, and how the results have been translated into practice through changes to legal frameworks, digital technology and operational quality. He added that data are not just the realm of government but also of the private sector and civil society, making data creation and ownership a multistakeholder matter.

David Souter, the principal author of the IUI-ROAM manual, gave a strategic view of the Internet Universality ROAM-X indicators in relation to the emerging issues of the UN Global Digital Compact. He emphasized that the purpose of the assessment is the improvement of human development, both social and economic, by aligning with the Sustainable Development Goals. Following his report for the Commission on Science and Technology for Development, he reaffirmed the relevance of the ROAM-X indicators and principles to 'preserve what we value, promote what we want, and prevent what we fear'. Digital developments go as far as to alter political structures. In terms of the ROAM method, he stressed the importance of a solid evidence base, starting the work with skepticism, not with faith.

Questions from the floor covered localization of the ROAM-X assessment, the place of the ‘rule of law’ within the framework, its impact on local communities, and the barriers faced by countries in making progress. The panel responded by highlighting that the ROAM-X framework includes indicators on barriers to progress, noting the impact of legal process on national legislatures, and re-emphasizing the importance of the multi-stakeholder model in identifying the needs of disadvantaged communities.  

Marielza Oliveira concluded the session by saying that Internet Universality ROAM-X Indicators were more of a process for change than simply a suite of indicators and that, in this sense, it aligned well with the development of the Global Digital Compact. 

IGF 2022 WS #261 Perils and opportunities of data integration for security

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

The session agreed on the primacy of human rights when considering the intersection of security policies and digitization. Participants from other Global South countries, such as Senegal and India, shared their experiences and pointed to lessons on how Global North countries evaluate the acquisition of digital technologies for the delivery of public policies, security included.

,

A recognition of the non-uniformity of the State was discussed. Although sovereignty is indeed an issue of importance for States, the way such sovereignty is perceived in the context of digital technologies is not always sufficient to ensure it, so States need more capacity in this regard.

Calls to Action

There is a need for safe and honest spaces of conversation at the intersection of digital technologies and security policies. The responsibility to create these spaces falls on the entire multistakeholder ecosystem: civil society, academia, the private sector, governments and the technical community. Moreover, the international cooperation community working on these issues must be included as well.

Session Report

This IGF session was designed to create an honest space of conversation with different representatives from the multistakeholder model, in order to reflect on one of the biggest challenges in the digitization of public policies: what trade-offs must be considered between public security and human rights when deploying digital systems, with a particular focus on the right to privacy.

The panel was predominantly composed of representatives from civil society, along with one representative from government. The aim was to also include the voices of the private sector, but due to time constraints this was not achieved.

The panel produced numerous reflections useful to the matter in question. It began with the different panelists characterizing their work. The panelists presented case studies from the Triple Border Area (the border shared by Argentina, Brazil and Paraguay), Colombia, Argentina and Uganda, as well as case studies from a more bird's-eye view, particularly related to the African context. Among the main topics discussed:

  • The trend identified in the discourse on implementing technologies in public security contexts (particularly in the Triple Border Area) is that of technology as a synonym for efficiency; efficiency as a value opposed to rights; and opaqueness justified by public security

  • Regarding the collection of biometric data in border control, there is an ongoing process driven by the large flow of migrants from Venezuela and Colombia. The State collects more biometric data to grant basic rights to Venezuelan migrant communities than it collects from Colombian citizens, amounting to an experimental deployment of technologies on migrants

  • Regarding access to information about security programmes, there is quite a gap in general. In the case of Argentina, there is a lack of information on the acquisition and use of surveillance technologies. These technologies are bought from local suppliers rather than from the manufacturers, which makes practices even more opaque. There is a public-private trend in the implementation of these technologies

  • From a sovereignty point of view: surveillance technologies are not produced in Africa but imported there, which adds international human rights bodies and communities to the agenda, as they can strengthen the narrative that these technologies are not needed and are not synonymous with security

  • Data protection laws usually exempt public and national security matters from their application, which makes it easier for States to implement surveillance technologies - and they have been doing so without impact assessments of any kind (whether human rights or data protection impact assessments)

  • International cooperation, whether by international organizations or States, bears great responsibility in the deployment of security technology. The US National Security Agency has provided technologies to Ethiopia; China and the EU have also invested money in African countries for the deployment of surveillance technologies
    The EU has been equipping non-member States with surveillance technologies in order to stem migration flows before migrants actually reach the physical borders of the EU
    Private economic interests, lack of transparency and lack of legislation are some of the challenges in working against this movement. Going after the private companies that manufacture the technologies, and after the countries that provide them to others, are some of the possible paths forward.

  • From a government point of view, it is important to recognize that technologies are a tool for States to provide services, including public security. States engage with human rights (Paraguay is directly involved with international human rights systems). However, it is also important to reflect on the fact that States are not uniform bodies, and approaches to a given matter sometimes differ between public bodies. This includes public security as a policy.

     

Audience engagement

 

  • As long as countries from the Global South use technologies developed in other countries, they will not be free and able to protect their citizens. Africa lacks a strong political will to develop its own technologies, and some countries are only now moving from authoritarian to democratic regimes. Civil society is also not strong in this context. Other challenges are the lack of access-to-information laws and the lack of collaboration between strong international human rights organizations and local organizations (especially in francophone Africa)
    It is a multistakeholder ecosystem, and coordinated efforts are necessary to address it

  • More needs to be done when reflecting not only on the role of the State but also on international cooperation and private actors, who predominantly have shareholders' interests in mind. The multistakeholder process is, and should be, a tool for reflecting on this role and on the duty of these stakeholders regarding human rights compliance, particularly when considering Global South particularities.

  • Lastly, CSOs should be in touch with each other and with journalists and the media in order to unveil the current opaqueness, as well as with international human rights actors, who usually have the possibility to engage more directly and put more pressure on States

IGF 2022 Town Hall #55 Inclusive AI regulation: perspectives from four continents

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

There is still a participation gap in the AI regulation debate. There has been an increase in participation of different stakeholders of late, yet particularly significant groups such as youth and vulnerable groups seem not to be at the forefront of the discussion nor be particularly represented.

, AI regulation focusing on human rights is an emerging general trend. Discussions in terms of privacy and data protection, particularly in terms of automated decision-making, are gaining momentum.
Calls to Action

We need more involvement of youth perspectives in AI governance.

,

Within multi-stakeholder approaches to AI governance, civil society perspectives must be more strongly included.

Session Report

The town hall session #55, Inclusive AI regulation: perspectives from four continents, brought together views from across four continents to reflect on the status quo of AI governance and specifically the question of inclusion. After a short introduction to the session, four presenters briefly summarized the state of AI governance in their respective contexts, speaking for about 8 minutes each. The presenters were encouraged to set the scene concisely, so as to leave space for meaningful debate in the second half of the session, where questions from the audience were answered openly and deliberatively. In the following, the input presentations are summarized before the discussions are reported.

 

Celina Bottino (Project Director at the Institute for Technology & Society of Rio de Janeiro, Brazil)

Developments in the field of AI are enormous worldwide, yet regulatory challenges remain. In Brazil, advances in policy have been reflected in different instruments, from national AI strategies to more than 20 proposed AI bills.

Yet there are still complex infrastructure hurdles that have not been overcome. Celina Bottino cited a project that the Institute for Technology and Society of Rio de Janeiro has developed in partnership with the Public Defenders of Rio de Janeiro: an AI sandbox for exploring health issues litigated in the Judiciary. The project has the funding, the access to technology and the right minds, yet it still lacks structured, machine-readable, quality data.

The advances in AI regulation mostly do not address this infrastructure gap; data governance and open data are yet to be considered priorities. She mentioned the most recent bill presented in the Senate, the result of an inclusive process in which specialists from a wide variety of fields could take part in an open consultation; even there, this matter was not a significant element.

 

Samson Esayas (Associate Professor at BI Norwegian Business School, Norway)

He noted that he was representing the European perspective but is originally from Ethiopia, where IGF 2022 took place. He mentioned that he is from the north of the country (Tigray), where there have been many communication gaps, and he called attention to the fact that inclusion means communities such as his being taken seriously and taking part in regulatory processes as well.

As for the European regulatory process, in his view there are four main drivers of the discussion:

  • The first driver is the protection of fundamental rights, particularly privacy, freedom of expression, freedom against discrimination and the protection of vulnerable groups.
  • The second driver is protection of the integrity of elections and against disinformation and other systemic dangers;
  • The third driver is accountability and allocation of liability; and,
  • The fourth driver is data control and access. This is a similar concern as the one noted by Celina Bottino in terms of Brazil and the Global South, as Europe seeks to address the concerns of data quality, and the asymmetries of access to data.

Samson Esayas noted that one of the main purposes of the AI regulation is to foster an environment of human-centric AI development. Thus, Europe has identified specific cases that increase the risks of posing danger to humans and the regulation addresses such cases.

Additionally, he noted that the EU is developing other regulations to complement more general issues not directly addressed by the main AI regulation. The example he mentioned was platform regulation, which focuses on the vulnerabilities of platform workers.

 

Shaun Pather (Professor at the University of the Western Cape, South Africa)

He mentioned the digital divide that still exists in the world and noted recent ITU studies, which show facts and figures: 2.7 billion people are still offline. Additionally, he highlighted that affordability and costs of connectivity services add to the digital gap.

Another significant shortcoming refers to skills. A study of the state of AI in Africa shows that:

  • There was no dedicated AI legislation in the continent;
  • There were only 4 national AI strategies; and
  • Data protection regulation is one of the most developed issues, particularly in terms of personal data processed for automated decision-making.

Moreover, there is an important question of whether populations and groups are involved or represented in data collections for the AI tools that may be offered to them. Not being a part of data sets may lead to ineffective AIs or unequal and even potentially discriminatory results.

As a solution, there should be an integrated effort to involve more of the population and foster greater participation. There should be more coordination and coordinated responses, and both the projects and the frameworks should involve broader international participation.

In terms of ethical principles and responses, even if ethics may have a local component, we should strive to find more universal frameworks that are still accommodating to regional and local realities and policies. It is significant that more research, especially on practical tools and mechanisms for checking software in all stages, should be developed to provide a more inclusive AI.

 

Sandra Cortesi (Fellow at the Berkman Klein Center for Internet & Society at Harvard University, USA, Senior Research and Teaching Associate at the University of Zurich, Switzerland)

Reflecting on the state of AI governance in North America, and the United States in particular, Sandra Cortesi showed that many conversations are taking place to develop a diverse range of norms for governing AI. Thus, in terms of inclusion of stakeholders there is a wide variety of participants in the development of AI issues. Significant issues are picking up in municipalities, such as facial recognition limitations and bans, and standard organizations, such as the ethical standards of the Institute of Electrical and Electronics Engineers (IEEE).

There is no national AI law in the US. Governance is not approached through an overarching framework but through sector-specific agencies, such as the Food and Drug Administration for medical AI.

There are still participation gaps. One example is the participation of young people, who do not seem to be part of the wider AI governance agenda. UNICEF, for instance, has documented how rarely youth are mentioned and represented in the debate.

 

Questions from the audience and responses from the presenters

How can we deal with the different ethical considerations at regional level?

Shaun Pather: There should still be a more universal framework, even if we allow for regional and continent-wide responses to be developed.

Celina Bottino: Conversations are being led by Global North countries; other countries and regions should be called upon to participate.

 

What would be the practical approach for AI regulation and AI-based spaces like the metaverse to ensure public safety, security, health, data sovereignty and accountability of global AI actors, given that different societies have different ethical and legal frameworks? And what should be done for AI related cybercrimes which are borderless?

Sandra Cortesi: There is no silver bullet. All such conversations should happen at the global level and include civil society and different actors. We may not yet agree on cross-border solutions, but this does not mean we should not strive to find common ground.

Samson Esayas: Fundamental rights and concerns about safety are the focus of AI regulation. The metaverse may not be covered exactly in the AI Act, yet certain important issues that were mentioned in or implied by the question are; disinformation and systemic concerns are an example.

Shaun Pather: The metaverse implies a different level of globalization than what we have today. A practical approach should be cooperation and convergence. It is possible to find consensus on universally acceptable ethical principles.

Celina Bottino: The Berkman Klein Principled AI Report showcases convergence in ethical principles amongst Ethics principles from different global stakeholders. These may be starting points in our endeavors.

 

We face questions of digital imperialism or digital colonialism stemming from a huge market concentration, where users are in the Global South and developers in the Global North. Examples can be found in the processing of health and education data from the South in the North. The question is whether there is any room for a global agreement or coalition-building so that AI infrastructure for education and health is developed on open standards and not only closed and for commercial purposes.

Samson Esayas: Local communities should be engaged in the first place.

Celina Bottino: UNESCO is increasingly becoming a focus for discussing such topics and may be an important forum for such discussions.

Sandra Cortesi: In such cases, the majority world should definitely be included. This may not be easy, but a lot of good work on the matter is being done worldwide. One group we should note is youth: they are 1 in 3 Internet users, yet do not have a seat in the discussion.

Shaun Pather: It is important that we band together and coordinate efforts with all continents represented.

IGF 2022 Open Forum #16 Commonwealth Hard Talk for Action

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

The importance of political will from governments to ensure connectivity and access in all areas of the country. Governments should look at the Internet as a public utility.

,

Energy is a crucial factor in ensuring technology access and bridging the digital divide. It is worth diversifying the sources of energy used to power technology in rural and hard-to-reach areas; alternative sources like solar, wind and other renewables can play a significant role.

Calls to Action

A fit-for-purpose regulatory framework can attract investment into ICT. Governments should be regulating for development, not for tax collection. It is crucial in this era to ensure that policies and regulations create an enabling environment to spur investment into ICT and bridge the digital divide.

,

Protecting and guaranteeing human rights in the use of modern technology should not only be the work of policymakers in government but also of ICT industry service providers. Governments should ensure adequate regulations for online protection, especially for children, persons with disabilities (PWD) and girls.

Session Report

The session started on time with welcome remarks from the Secretary General of the Commonwealth Telecommunications Organisation (CTO). She welcomed the participants and gave a quick update on the CTO's work on meaningful connectivity to enhance digital transformation. She reiterated that meaningful connectivity encompasses the affordability and availability of appropriate content to advance citizens' lives. The SG noted that universal affordable access to broadband networks and devices, the ability to use the network facilities and devices safely, and the availability of appropriate content, applications and services are the essential components of meaningful connectivity. She also underscored the importance of human rights as a must-have for online access that contributes to the wellbeing of societies.

The provocateur introduced speakers who discussed the need for political will to provide affordable universal broadband connectivity, leveraging infrastructure and technological innovation while addressing energy requirements for connectivity. The speakers also discussed the need to protect human rights in the digital environment and how to deal with online harm such as cyberbullying, privacy issues, cybercrimes and negative effects of social media.

The panel was composed of five speakers, two of whom were women.

Key points of discussion included: 

  1. The need for political will to realise universal broadband connectivity. Importance of Governments to plan, set targets and allocate budgets for connectivity nation-wide.
  2. Governments should view Internet as a public utility and increase access through proper use of universal service and access fund to cover rural and hard to reach areas.
  3. There is an urgent need for a new generation of human rights activists to address online harm and protect human rights in the same way as offline.
  4. Availability of connectivity does not equal adoption; to increase adoption it is essential to:
  • Focus on digital skills development
  • Local content promotion
  • Provision of devices for use
  • Develop specific programmes that focus on vulnerable groups (PWD, girls & women)
  • Enhance connectivity through community projects like smart villages and digital centres
  • Increase market competition to bring about affordability
  5. Energy goes hand in hand with connectivity and can impede access. Therefore, it is crucial to diversify sources of energy, incorporating renewable sources to facilitate access to the digital economy.
  6. There was a call for a new regulatory philosophy to drive investment and bring about affordability.
IGF 2022 WS #229 Dark patterns: an online challenge in consumer protection

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

In order to reduce consumer detriment from dark patterns all actors must act and cooperate for effective consumer protection.

Calls to Action

One of the next steps should be trying to establish consensus among actors on which dark patterns are most harmful.

Session Report

A summary of the main takeaways of the session:

The discussion focused on the challenges posed to consumer protection by dark patterns, through the lens of various sectors. The speakers introduced their comments one by one in a round-table format. Guided by the moderators, they examined the issues from different angles depending on their backgrounds. The ultimate conclusions thus incorporate the combined vision of policymakers, business and academia.

The first challenge the panelists touched upon was that dark patterns are not all the same, making it difficult to draw the line between dark patterns and common marketing practices. This led to a discussion on how we can differentiate, with some panelists claiming that common marketing practices have fair and neutral designs rather than manipulative and detrimental ones. These can be classified as choice-architecture nudges instead of dark patterns, which are seen to negatively impact autonomous decision-making. However, other panelists argued that it is not so simple, as no practice can be seen as wholly neutral, and even marketing that tries to be unbiased affects consumer decision-making in one way or another.

The panelists then moved on to say that in terms of terminology we need to be careful and analyze which words we use. One example given was the use of the term “sludges” instead of “dark patterns”. Another issue surrounded consumer detriment – are consumers aware that dark patterns can cause them harm, and from a consumer protection perspective does it matter or are dark patterns that consumers are aware of equally grievous as those that they are not aware of?

Next, one of the panelists gave two case study examples of companies that were caught using dark patterns. The first was Amazon, which made it difficult to cancel its Prime subscription through many stages of cancelation. This was seen as a deliberate attempt to retain customers, and Amazon was ordered to drastically reduce the stages of cancelation to make it more accessible to consumers. The second was Booking.com, which had misleading scarcity claims: these were found to be vague or to refer to a different time range than the one the consumer was interested in. This led to the key issue of corporate responsibility: businesses must hold themselves accountable for the use of dark patterns and decide whether they want to engage in activities they know cause consumer detriment. However, in order to achieve corporate responsibility, the panelists agreed that consumer protection authorities need to provide adequate guidance and raise business awareness and education, so that businesses can develop and abide by 'fair marketing'. The panelists argued this is important as many consumers have a 'fear of missing out' and can therefore make decisions they know may not be in their best interest. One panelist even stated that the business community needs a shift: there is currently too great a focus on turnover and not enough on consumer welfare.

There was some audience engagement in the chat. The first comment was that consumer awareness is the first step, after which consumers will protect themselves, just as with data protection and cybersecurity. One of the panelists replied that whilst this is the right direction, the difference between, for instance, cybersecurity and dark patterns is that dark patterns are more subtle and less discernible. Another audience member observed that we can make consumers aware of what influences their choices and decisions (education) and educate them on the remedies they can take (like the right of withdrawal).

Lastly, one participant asked a specific question on how to protect minors in online gaming without excluding them from such games or making 'lite' versions that usually are not as entertaining as the full game. Our panelist from the European Commission began answering by explaining that work on this is being done in the Fitness Check of EU consumer law. The European Commission has concerns about certain interface designs and elements, such as "loot boxes" and the use of in-game currencies that could distort the consumer's decision-making. She finished by inviting everyone to respond to the public consultation on EU consumer law that the European Commission has released.

IGF 2022 WS #500 Role of Community to Achieve Universal Acceptance

Updated:
Connecting All People and Safeguarding Human Rights
Session Report

Summary and key takeaways:

UA requires a joint effort by multiple stakeholders, and UA Day is a good opportunity to mobilize them all. The main tasks for each stakeholder in adopting UA are:

    • UA reforms are needed in governments, academia, the domain name industry and technology organizations in collaboration with the language communities.
    • Open-source communities and developers should update their products and systems by deploying UA-ready tools and UA-ready code samples. UA should be part of every developer’s skill set.
    • Academia should include UA in their research, study and curriculum for subjects like software engineering, computer science and informatics. To support hands-on experience, workshops or hackathons should be organized. (Hackathon case study)
    • As governments in Africa accelerate the rollout of new generations of digital assets to support the digitalization of many services, there is a need for UA courses in education to ensure sustainability.
    • End users should demand more UA-ready systems, and report bugs about non-UA ready systems and applications (report issues at https://uasg.tech/global-support-center/).
    • IDNs should be considered a public good to promote digital economic and social inclusion.
    • Awareness is needed to promote the use of domain names and email addresses in local languages by end users.
    • ICANN and IGF should keep UA in the top five priorities to discuss and implement.
    • Local communities should be encouraged to create UA Local Initiatives in their regions and join the UA Ambassador Program to spread UA awareness to all relevant stakeholders.
    • Local communities need to mobilize their services and work to develop label generation rules for their local languages in order to have them in the Domain Name System.
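As an illustration of the kind of UA-ready code sample the tasks above call for, the sketch below uses Python's built-in "idna" codec to convert an internationalized domain name (IDN) between its Unicode form and the ASCII (Punycode) form used in the DNS. The domain "münchen.example" is a placeholder chosen for this sketch; note that the standard-library codec implements the older IDNA 2003 rules, so production UA-ready software should prefer a library implementing IDNA 2008.

```python
# Minimal sketch: converting an internationalized domain name (IDN)
# between Unicode and ASCII (Punycode) forms with Python's built-in
# "idna" codec. Caveat: the stdlib codec implements IDNA 2003; a
# UA-ready application should use an IDNA 2008 implementation.

def to_ascii(domain: str) -> str:
    """Unicode domain -> ASCII-compatible (xn--) form for the DNS."""
    return domain.encode("idna").decode("ascii")

def to_unicode(domain: str) -> str:
    """ASCII-compatible (xn--) form -> Unicode for display to users."""
    return domain.encode("ascii").decode("idna")

ascii_form = to_ascii("münchen.example")   # placeholder domain
print(ascii_form)              # xn--mnchen-3ya.example
print(to_unicode(ascii_form))  # münchen.example
```

Accepting both forms everywhere, and always displaying the Unicode form to end users, is the kind of small change that makes a system UA-ready.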

The UASG will announce the date for UA Day soon, which will be a day to celebrate UA milestones and progress, as well as to raise awareness among stakeholders at both the local and global levels. The community is encouraged to participate. There will be a page on www.uasg.tech with more information.

 

Key Takeaways in Detail

Question to Ajay Data, Technical Community, Asia-Pacific Group: What role can business and governments play in achieving UA? Answer: It’s an important question because government makes policy, and businesses are supposed to execute those policies and support the government in achieving solutions. For example, in Rajasthan in India, the government worked with local technology firms to provide 70 million people with Hindi email addresses. Business and government must work together to make an impact when it comes to bringing the next billion people online.
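The gap described here often shows up at the application layer: legacy input validation silently rejects internationalized addresses. The sketch below is a hypothetical illustration (the Hindi address is invented for this example) of how a common ASCII-only regex rejects an address that is valid under the internationalized email (EAI, RFC 6530) framework.

```python
# Hypothetical illustration: a legacy ASCII-only email validator,
# of the kind still common in web forms, wrongly rejects a valid
# internationalized (EAI) address. The Hindi address is invented
# for this example.
import re

LEGACY_PATTERN = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def legacy_is_valid(address: str) -> bool:
    """Return True if the address matches the ASCII-only legacy rule."""
    return LEGACY_PATTERN.match(address) is not None

print(legacy_is_valid("user@example.com"))    # True
print(legacy_is_valid("सूचना@उदाहरण.भारत"))    # False: valid EAI address rejected
```

Making such validators accept Unicode local parts and IDN domains is one of the concrete "UA-ready" fixes developers can make.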

Comment from Raymond Selorm Mamattah, E-Governance and Internet Governance Foundation for Africa (EGIGFA): It’s worth noting that in India, all government entities will be UA-ready within the next 18 months.

Question to Hadia Elminiawi, Civil Society, African Group: What role does academia play in achieving UA? Answer: Raising awareness of UA and the need for UA-related skills is important. The academic community must enhance their training and coursework to instill the need for supporting UA in their students to make the Internet more inclusive and teach them how to implement this multilingual computing technology. Academia needs to highlight the socio-economic implications on communities and how policy and governance can impact UA adoption.

Same question to Dessalegn Mequanint Yehuala, Technical Community, African Group: Academia has social responsibilities; UA is the one critical element missing in the localization of the Internet in the truest sense. He described the various UASG working groups, to which academics can contribute. Educational institutions can also tailor their curricula toward UA so they produce students, particularly engineering students, with a UA orientation.

Question to Maria Kolesnikova, Coordination Center for TLD RU: What role does the domain name industry play in achieving UA? Answer: It’s good news that ICANN and APTLD have developed guidelines for registries like ours. We can’t achieve UA by ourselves. We have launched a special project for developers and system administrators – documentation standards and manuals around how to set up UA-ready systems. We provide training, webinars and even hackathons to engage the technical community.

Question to Satish Babu, ISOC-TRV: What role does the technical community play in achieving UA? Answer: The global population has just crossed 8 billion, and Internet users around 4 billion. We have a long way to go toward bringing everyone online. How do we persuade the technical community to prioritize UA? The technical challenges are non-trivial, and we need a concerted effort from them as well as the language communities. ICANN and the UASG have developed a number of resources to help check UA compliance and become UA-ready.

Question to Hadia Elminiawi, Civil Society, African Group: How can we grow UA adoption in a sustainable way? Answer: UA is a target that is relevant to all countries and can be achieved through a common approach. We need coordinated efforts and policies to ensure that systems work with the common infrastructure of the Internet. Digital transformation also helps accelerate adoption.

Same question to Anil Kumar Jain, Technical Community, Asia-Pacific Group: To make UA more sustainable, we need an integrated approach, starting with the government and services that impact citizens directly. Next, all players (the technical community, businesses, academia, etc.) should agree that UA is a priority. Another element is capacity-building: once the whole ecosystem is UA-ready, all players should understand how to use those systems effectively.

Question: As UA requires a broad contribution, what mechanisms are needed to increase UA acceptance, taking into consideration our global UA Day that we’ll be celebrating?

Answer: UA affects all of us, and we all benefit from it. We need to reach a critical mass of support in systems, services and applications. UA Day is a way to get the global community talking about UA. Just as we celebrate IGF, UA Day is a way to celebrate UA. It’s a chance for, once a year, all the stakeholders to come together and create synergy.

Question to Dessalegn Mequanint Yehuala, Technical Community, African Group: How can we ensure UA is globally successful? Answer: Again, the role of academia is really important. Educating the next generation of developers, programmers and technology enablers on UA helps future-proof their careers and ensures that they are equipped to help build a truly global, inclusive Internet for all.

IGF 2022 WS #66 Reassessing Government Role in IG: How to embrace Leviathan

Updated:
Avoiding Internet Fragmentation
Session Report

REPORT

The workshop's discussions focused mainly on the role government should play in internet governance, the major changes governments need to make in the new situation, and what governments should, and should not, do.

Regarding governments' role in internet governance, we should first be aware that government is not one monolithic group in its understanding of Internet governance; using the term "governments" as the name for a single stakeholder group is not entirely correct. We should identify which part of government or which department should be responsible. Before that, we should understand the deal between government and citizens: governments have to deliver on the basic social contract, ensuring the security of citizens and a society that flourishes economically, politically and socially. If we return to common sense about the role of government, many questions are simply answered. Lao Tzu once argued that if governments followed the principle of wu‐wei (non‐action), social and economic harmony would naturally emerge and people would prosper.

As digital technology becomes deeply involved in daily life, the government, which has a monopoly on public power, will undoubtedly need to play a more active and critical role; otherwise, order will be unsustainable. At the same time, as seen in events such as the tech conflict and the Russia-Ukraine conflict, the enormous power of government will have a more destructive effect if it is not restrained.

We shouldn’t overcomplicate the questions and answers: the role of government is primarily to create an environment for internet development, while promoting and respecting human rights. Governments may overreach; it is their duty to address harms, but not necessarily by regulating the internet. Big countries are currently not trying to accept that differences exist, build more commonality, and establish common ground.

A major source of disorder in cyberspace today actually comes from governments. One of the most important things a government can do in its current role is to leave certain forms of global governance that require global compatibility to the private sector. Governments can foster an open and competitive market in internet services, cloud and software. They can prosecute cyber criminals effectively, which means they have to cooperate and harmonize their rules with other governments. Governments should stop trying so hard to control content and should allow end-to-end encryption to protect the privacy and confidentiality of internet users.

Some governments think that consulting with stakeholders already makes a process multistakeholder, but consultation is not really a multistakeholder approach; a multistakeholder approach goes far beyond consultation with other stakeholders. Regulatory intervention needs to be proportional, feasible, and designed to fulfil a legitimate purpose.

Big players like the US, China, Europe and India should talk to one another and bring other stakeholders into the conversations. Instead of every big player developing its own set of principles and its own declaration for the future of the Internet, they should work together. They need to get together and recommit to free trade in internet services, and to the de-securitization, or pacification, of Internet Governance among them.

It won't be an easy process; it will require a lot of diplomacy, a lot of listening, compromise and delicate trade-offs. All views should be brought to the table, and only in that way will we ensure the future of this great network, this great result of human creativity that the internet is. We need a new quality of interaction between the various stakeholders.

Proposal for a global digital compact: government leadership and the United Nations should identify the issues, then create multistakeholder drafting teams. In the final stage the draft could go to the governments, and, because the United Nations is an intergovernmental body, governments would have to adopt it. But it should not be left solely in the hands of governments.

There is a need for new fundamental theories to structure a new theoretical system of global internet governance. The IGF, WSIS, GGE, G20 and other mechanisms should play a better role, and more in-depth communication mechanisms are needed. The interdisciplinary academic community should play a more active role; the scientific community is the creator of the Internet and also the best guardian of cyberspace.

IGF 2022 Town Hall #69 The Amazon is online, and it is not the Prime

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Youth play an important role in activism for creating public policies that guarantee accessibility for riverside, quilombola, and indigenous populations in the Amazon region since accessibility and interconnectivity are powerful weapons against misinformation.

Calls to Action

It is crucial for organizations, institutions, universities, companies, and governments to invest time and financial resources in the Northern region of Brazil, with a special focus on Amazonian population groups and ethnicities. It is our responsibility, as specialists in the field of internet governance, to ensure diversity, inclusion, and equality in projects, meaningfully mobilizing local stakeholders so their voices can be heard.

Session Report

Youth play an important role in activism for creating public policies that guarantee accessibility for riverside, quilombola, and indigenous populations in the Amazon region since accessibility and interconnectivity are powerful weapons against misinformation.

Intending to discuss how internet governance can create solutions capable of building a fair and sustainable future for all people in the world, the Town Hall session #69, “The Amazon is online, and it is not the Prime," set out to find solutions for connectivity and accessibility for the indigenous, “quilombola” and riverside populations of the Brazilian Amazon forest.

The session featured testimonials from people who live in the Amazon region and are activists for connectivity and respect for the land. Gustavo Souza (Brazil), a resident of Acre state, for example, highlighted an exciting initiative from the Federal University of Acre to tackle the large number of illegal forest fires: IoT devices were installed to monitor the burning in the region, which reaches new records every year.

Another exciting initiative, brought by Lori Regattieri (Brazil), aims to research internet infrastructure and connection points through local Amazonian communities, creating bridges between the different communities so that they can work together in favor of the environment. In addition, the connection would bring a regional perspective from the point of view of the populations who inhabit the territory.

There are even successful cases in the region. For example, in the state of Amazonas, in the north of Brazil, the Internet has been under implementation for some years now and can therefore bring education opportunities to more distant, low-access regions, allowing students to learn and share their knowledge. Mariana Filizola (Brazil) emphasized during the session that talking about the environment is not just dealing with its physical issues but also understanding how people relate to nature, their roles, what they have been producing, and their impacts on others. Thus, the question of connection and access is a challenge. Nevertheless, there are important cases, mainly concerning the construction of an environment in which students understand their role within the Amazon.

CETIC.br, a Regional Center for Studies on the Development of the Information Society in Brazil acting under the auspices of UNESCO, has been highlighting evidence that the Amazon region is experiencing a significant increase in internet access. For example, according to Fabio Senne (Brazil), the north region currently has 83% of its population connected, which is a great success in some ways, although it still lags behind other regions. Furthermore, he emphasized that when we talk about connectivity and access, we also have to talk about meaningful connectivity, as access to the Internet by itself is of no use: this access needs to be effective and has to give a voice to these communities. In that sense, he highlighted how many factors need to be considered to understand meaningful connectivity.

Karla Braga (Brazil), a resident of the northern region of Brazil, said that, despite celebrating that connectivity in the region is growing, there is still a lot to evolve. In 2020, more than 3 million people still did not have access to the Internet, which directly impacted even public policies involving the issue of COVID-19 and the pandemic. For her, the Internet must become an ally of youth, can foster prosperity and can save lives. She displayed a documentary created by the Institute where she works, which talks about the difficulty of accessing the Internet in the Amazon region. She said that the youth in the region are very active and are fighting to ensure equal connectivity in the region, but not only that, they are also fighting for their rights and the environment.

Entering the topic of connectivity, disinformation, and global health, with particular attention to the COVID-19 pandemic, Siena Frost (USA) brought some perspectives on how the Internet can help with health. During the pandemic, with everyone isolated in their homes, the lack of internet connection made life even more difficult for people. For this reason, for her, students need to start getting more into activism on the Internet. Therefore, young people must develop skills such as leadership and empowerment when discussing the Internet.

From a more academic perspective, Rodger Richer (Brazil) said that misinformation in the Amazon territory is a considerable problem in Brazil. He spoke about some research results of the Protocol Ipê project, carried out by the Vero Institute, where he is part of the research team. Based on these results, he emphasized that some Brazilian politicians are among the actors most responsible for spreading misinformation about the Amazon rainforest; therefore, we need to think about solutions to mitigate this problem. In addition, Rodger stated that environmental and climate issues are linked to an ethnic-racial perspective, so the fight against disinformation in the Amazon must consider the voices of indigenous and local communities. He also spoke about the importance of co-creating, with the local population, counter-narratives about environmental and climate problems in the Amazon.

Finally, Victor Durigan (Brazil) closed the debate by talking about how environmental misinformation becomes a barrier to creating public policies: it hinders the legislative debate, distorting the discourse in favor of the environment. For him, the fight against misinformation has to gain priority within the platforms, which must place the defense of the environment on their priority agenda. On the other hand, society has to take the agenda to the creators of public policies, defending scientific grounds. The countries that have the Amazon in their territory must unite to create laws against environmental and climate misinformation and in favor of user protection, democratic discourse, and social values.

It is crucial for organizations, institutions, universities, companies, and governments to invest time and financial resources in the Northern region of Brazil, with a special focus on Amazonian population groups and ethnicities. It is our responsibility, as specialists in the field of internet governance, to ensure diversity, inclusion, and equality in projects, meaningfully mobilizing local stakeholders so their voices can be heard, especially the youth. There is a myriad of challenges that contemporary societies face on a daily basis, and most often they connect to climate change, environmental depredation, and digital inequality. However, the youth can be key game changers within their communities, and they bear the creativity and hope to solve problems, find solutions, and create sustainable dynamics for future democratic societies.

IGF 2022 DC-Environment Internet & the Environment: Beyond Connectivity

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

A knowledge gap still exists between Internet governance stakeholders and the environmental sector, which has to be bridged further through a multi-stakeholder approach. The DCE works in tackling such cross-cutting issues across communities.

Calls to Action

Best practices from the session showed various landscapes, dimensions, and perspectives through which to dissect the aforementioned knowledge gap, whether with a bottom-up or a top-down approach. The DCE is still growing, and although it faces challenges such as a lack of resources and no concrete follow-ups yet, it will choose a main topic to work on in the future. Other dynamic coalitions are keen to follow up on the DCE's work.

Session Report

PNE Updates

The session continued from previous discussions that resulted in the publication of the Policy Network on Environment (PNE) report, which provided guidelines for high-level diplomacy and advocacy. It encompassed tackling cross-cutting issues with a multistakeholder and inclusive approach. It also acknowledged the knowledge gap that has to be bridged between the two big sectors of the dynamic coalition: the internet governance (tech) sector and the environmental sector.

Case Studies

Best practices from the communities were discussed, showing various landscapes, dimensions, and perspectives. These included:

  • The UN Joint Staff Pension Fund (UNJSPF) shared its greener, more sustainable Digital Identity solution, which uses blockchain and biometrics technology to manage the administrative process for 84,000 people in 193 countries.
     
  • The work of CITAD & APC in Nigeria, which has pushed for e-waste management through a reuse-repair mechanism and ensured the ecosystem supports the practice holistically, for example by guaranteeing that spare parts for repair are available and affordable.
     
  • UNDP shared the CODES publication of the Action Plan for a Sustainable Planet in the Digital Age, which addressed three important systemic shifts: aligning digitalization with sustainable actions, mitigating negative impacts, and accelerating digital innovations.
     
  • The EcoInternet research project by DotAsia, which has looked into the carbon footprint of internet use, especially during the pandemic.

What's Next?

Other Dynamic Coalitions agreed that more discussions are needed to address the environmental impact of digital technology. Whether through a bottom-up or top-down approach, the work has to be inclusive of multiple stakeholders, especially those who are most impacted.

IGF 2022 WS #240 Pathways to equitable and safe development of AGI

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:
Currently there is an inadequacy in the policies and regulatory frameworks in force for Artificial (General) Intelligence. The technology infrastructure is also dominated by the private sector, usually leaving states dependent on the solutions provided. There is also a lack of representation of minority groups and youth in policymaking discussions.
Calls to Action
Global standards should continue to be used, as AGI is not limited regionally, and global solutions are needed to address the deployment of AGI by global companies; these can be adapted at the local level. Efforts should be made to encourage transparency and maintain a human rights-based focus in policymaking, while working towards a multistakeholder approach for more inclusive solutions.
Session Report
REPORT

Question 1

Promise and perils of Artificial Intelligence. AI is seen by some as one of the most strategically important parts of the third-millennium economy, possibly warranting massive private investment and state support. 

How can ethically responsible AI developers and policymakers ensure that their innovation and regulations capture the needs of all stakeholders globally, and contribute to shared prosperity rather than exacerbate existing inequalities?

  • The question of ethics will always pose a challenge for any technological development, and AI is no exception. That said, before drafting any new regulation or ethical framework for AI, we need to determine whether a country, or the world, needs to change the rules of the game entirely, or whether a set of adaptations is enough, turning the challenges that appear into solutions that meet both the human rights demands of society and the economic conditions that allow social progress. This is where turning ethical aspects or principles into laws or regulatory frameworks becomes so intricate. One possible solution is regulatory sandboxes and/or dashboards for the different aspects and principles of AI ethics and governance. These kinds of best practices could allow us to develop policies with a human-centered perspective while adapting regulations to the digital and economic rights we need to ensure, so as to avoid deepening existing inequalities and to fulfil a minimum of ethical principles in AI, all under the rules of fairness, accountability and transparency.
  • Ensuring that AI will be ethical is the responsibility of policymakers and regulators. It is unwise to trust that business will follow any rules other than those that facilitate gains and profits. Thus, it is regulation that can, and should, support the development of ethical AI. Firstly, we should develop approaches that take into account the fact that AI-based solutions have to comply with already existing laws. In the context of their ethical dimension, we should think about how a human-rights-based approach can be used to form the requirements regarding AI. Secondly, we need more consideration for solutions which support transparency. This is important in light of, e.g., the inclusion of provisions that limit transparency in international economic agreements (provisions that limit the possibility of demanding access to source code). Transparency, as a precondition for accountability, is necessary for developing mechanisms that would allow the broader public to scrutinise the solutions that are implemented. Thirdly, we have to ensure that the new rights that are introduced are easily enforceable, both by individuals and by the organisations that focus on human rights and, more specifically, digital rights protection. Last but not least, states should be more actively involved in developing solutions that could actually improve the quality of life of their citizens and support the achievement of goals, e.g., in the area of environmental protection. AI development is not an achievement by itself: what matters is what we can achieve using AI and what kind of improvements it can support.

Question 2

Uncontrolled pursuit of Artificial (General) Intelligence. 

How will socio-economic, geopolitical dynamics, and historical factors affect the design and deployment of AI technology?

  • As a person from the Majority World, also known as the Global South, I think we can see the effects and disproportions of AI through the following aspects. First, data colonialism: the datasets of the technologies we are implementing carry the biases of a population that does not represent the Majority World, yet are presented as universal, replicating the era of European colonialism. This leads to problems that affect the quality of life of those in the Majority World and/or force them to adapt to the implementation of these technologies, when the technologies should be widely inclusive by design. Second, the lack of tools to develop on our own, or the belief that we do not have enough resources to make our own AI designs and developments that include not only what the Global North is doing but also our own perspectives and solutions. For this to become a reality, we need more support from our local and regional governments, because we have the people to make the changes; we just need to believe in them and break this structure of being only consumers, or only implementers of something that was designed and developed in a context far from ours that sees us as the other. Third, and most importantly, we ideally need to come to agreements where everyone's perspectives and principles about AI are embedded by design, so that when a model is deployed, the biases related to historical, geopolitical and socio-economic factors are minimal or zero.
  • Firstly, if we do not have requirements which would steer the development of AI to take into account historical (as well as contemporary) discrimination, my fear is that the use of AI can only strengthen these mechanisms. Thus, developing requirements concerning the representative character of the data used to build algorithms seems to be a good solution. Factors that are broadly recognised as protected characteristics, on the basis of which discrimination is prohibited, should be taken into account when testing certain solutions. Additionally, we can think about the potential of AI-based tools not only from the perspective of avoiding discrimination, but also from a perspective focused on promoting fairer and more just solutions. However, this will not happen by itself; we need to ensure that steps are taken toward developing such requirements. Secondly, there is the question of the resources needed for the development of AI-based innovations. When these solutions are developed mostly or solely by the private sector, states become dependent on the solutions provided by private companies. Thus, we come back to the issue of the investments needed to develop useful solutions that states could implement to achieve societal benefits (e.g., higher energy efficiency, better health services). Thirdly, to develop inclusive solutions, we need representation among the people who actually develop AI. Thus, there is a need to develop, e.g., programs which promote and support women who work as developers.

Question 3

Redesigning AI governance. There have been calls from multiple parties to adopt a multistakeholder model of governance, but its adoption is still far from reality. 

What could help enable representation, as well as transparent and fair policymaking processes which mediate between the interests of all stakeholders, especially those of the Global South, minorities, youth and future generations?

  • Around the world there are currently efforts to strengthen AI governance based on the idea that we can all adopt a set of economic and ethical principles from the OECD and UNESCO and implement them in regulation at the local level, as several countries in the Majority World have done, or at the regional level, as in the EU or the African Union. These efforts show us that a multistakeholder approach to AI governance is the most transparent and fair way to arrive at policies that actually represent a country's vision of AI, now and in the future. As I mentioned before, one of the exercises that caught my attention while studying the implementation of the OECD and UNESCO principles is how some countries, not only in the Global North but also in the Majority World, are implementing sandboxes and dashboards as a strategy open to the public, to monitor and demand that the design, development and deployment of AI, not only the policies but other aspects as well, are actually in line with what society needs and with human rights. And I say human rights because this is a matter of rights: the last pandemic made clear that our digital world and the so-called real world are two sides of the same coin, which means it is important to defend the rights of the whole population in both in the same way. In conclusion, if you ask me for a good practice that could serve as a mediator between policymaking and the rest of the stakeholders, my answer is AI regulatory sandboxes and/or dashboards.
  • Firstly, I think that one of the problems in this regard is the tendency to treat the digital environment as somehow separate from the offline world. This kind of approach makes it more difficult for many stakeholders that deal with, e.g., human rights more broadly, to be involved in these issues. Thus, I believe that it is important to show that the digital becomes inseparable from the material, and, thus, that if one cares about, e.g., human-rights or environment, the digital solutions could become helpful in regard to the fights concerning these issues. On the other hand, digital solutions can also make it more difficult to protect human rights (e.g., when they are not transparent), thus, there is a need to fight also for digital rights, when one cares about the offline world’s issues. This can be illustrated with the role that short-term rental platforms play: the fact that they are digital does not change the fact that they cause problems in regard to the issues such as housing, which is very material in its core. Secondly, we need more transparency in regard to the law-making. The case of Uber files, as well as other investigative journalism works and non-governmental organisations reports show the scale of lobbying which results from big-tech companies’ activities. Considering the differences in resources possessed by these companies and the resources that, e.g., non-governmental organizations or the unions have, it is impossible to match these kinds of efforts to influence the adoption of legislation. Thus, it should be the institutional and procedural solutions that would, on the one hand, more effectively protect the inclusion of the representatives of the society in the law-making process, and, on the other hand, limit the impact that the companies have. 
IGF 2022 Open Forum #29 GPAI: A Multistakeholder initiative on trustworthy AI

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

- To build a safe, healthy and prosperous future, we need to discuss how humanity can live and work harmoniously with AI. It requires collaboration and coordination of diverse stakeholders, such as governments, industry, civil society and academia to unfold the full potential of AI to serve the good of society.


- GPAI's value lies in demonstrating powerful synergies by embracing multidisciplinary perspectives from governments, institutions and experts to promote efforts to develop and deploy human-centric and responsible AI.

Session Report

The session was a 30-minute dialogue moderated by two speakers.

Elizabeth Thomas-Raynaud, Head of the GPAI Secretariat, introduced GPAI’s mission, structure, Experts Working Groups and the different forms of participation in GPAI. Yoichi Iida, Director of International Research and Policy Coordination at the Ministry of Internal Affairs and Communications of Japan (MIC-Japan), stressed GPAI’s growth and diversity in promoting its work toward shaping a responsible AI society.

Summary of talking points

  1. GPAI’s mission is to bring governments and experts together to support and guide the responsible adoption of AI grounded in the shared principles of the OECD AI recommendation. Founded in 2020 with 15 Members, GPAI has almost doubled its membership, reaching 29 Members in 2022. On 22 November, in the context of the GPAI summit in Tokyo, GPAI welcomed four new Members: Argentina, Senegal, Serbia and Türkiye. GPAI produces project outputs under four topic areas – Responsible AI, Data Governance, Future of Work, and Innovation and Commercialisation. The projects are the responsibility of the GPAI Experts, who come from a wide range of sectors and a diversity of countries.
  2. Membership in GPAI is open to countries, including emerging and developing countries, that endorse and share the values reflected in the OECD Recommendation on AI and play a proactive role in advancing responsible AI. Experts wishing to participate in GPAI can do so through a nomination by GPAI Member countries or through self-nomination. The call for self-nominated Experts opens once a year, and experts from all around the world are welcome to apply.
  3. AI has progressed to the extent of bringing innovation to industries, transforming our daily lives and providing technology solutions to the most pressing issues of our time. However, this transformative potential can come with challenges when left unchecked. GPAI, despite its infancy, plays an important role in strengthening the shared values and efforts to shape a broader responsible AI community across the globe. To do this, GPAI gathers leading AI experts to produce impactful and useful AI projects in view of helping Members ensure AI development and deployment that elevates humanity.

Summary of interaction

  • Potential risk of overlapping with other initiatives: GPAI was not intended to duplicate activities taking place in other organizations on policy making. GPAI focuses on leveraging insights and expertise from the experts for the applied AI projects that aim to benefit governments in pursuing their AI discussions and implementations.
  • Criteria for Membership: GPAI is open to developing and emerging countries and its strong importance lies in like-mindedness based on the OECD AI principles.
  • Domestic priorities vs. global priorities: There is a strong willingness among the majority of GPAI Members to contribute expertise and insights from their national AI institutes and their own experiences. Their domestic priorities and interests are shared for the development of GPAI activities and projects. Similarly, insights from international cooperation contribute to domestic work on AI. There are mutual benefits from that exchange.
  • A Voluntary initiative: GPAI is a voluntary organization and there is no treaty requirement. GPAI promotes project-based efforts for the development and deployment of responsible AI, while the OECD focuses on discussions of policymaking and governance framework around the topic. They should both go hand in hand to promote the OECD Recommendations on AI which are non-binding. The discussion of binding regulations is left at the government-level policy forums. GPAI distinguishes its work from binding AI regulation and emphasizes the collaborative efforts to facilitate responsible and human-centred use of AI across the globe.

IGF 2022 WS #342 Protecting a Global Internet in an Age of Economic Sanctions

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

Sanctions regimes, while legitimate policy tools, may affect the Internet connectivity of citizens and diminish their online presence. // It is important to consider measures to prevent unintended consequences affecting connectivity and to prevent Internet fragmentation. // A multistakeholder approach should be used to consult, help evaluate impact, consider exemptions and monitor unintended consequences.

Calls to Action

Bilateral and multilateral arrangements for governments to coordinate and prevent adverse effects of sanctions on access to the Internet.


Licenses, exemptions, derogations, and processes to review and revise sanctions. Also immunity considerations for technical organisations, such as ICANN and the RIRs. Maybe a treaty-based solution?

Session Report

Chris Buckridge from the RIPE NCC introduced the topic of Internet sanctions. He was followed by a presentation by Farzaneh Badii, with a historical overview of how sanctions regimes have become more sophisticated and targeted. Specifically, she mentioned the impact of sanctions on: a) Regional Internet Registries; b) equitable access to number resources; c) network operators; d) the DNS. Nathalie Jaarsma, Cyber Ambassador of The Netherlands, explained that sanctions are a legitimate policy tool. She also explained the concept of the public core, which her country has been advocating in the UN to foster responsible State behaviour and prevent attacks on public core infrastructure, and she advocated for the governance of the public core to be conducted in a multistakeholder fashion. Sanctions are dealt with in a different area from the cyber silo, and they come under a lot of time pressure. Sanctions are, in the view of The Netherlands, a means to change the behaviour of another country; their impact is not always assessed in a complete or comprehensive way, and sometimes citizens might be deprived of access to the Internet. Jane Coffin from Connect Humanity shared a story about equipment for Internet Exchange Points being donated to developing countries. This equipment cannot be shipped to countries under sanctions regimes without penalties for the not-for-profits that donate it. Coordinating these shipments of equipment to sanctioned countries is very onerous, as is inviting participants from these countries to events, which becomes daunting and off-putting for organisations that mean well. Dawit Bekele from the Internet Society also shared stories about technical support to countries like Sudan and the consequences for the Internet. He also mentioned that sanctions may have the unintended consequence of fostering Internet fragmentation and sovereign national intranets. 
Alexander Isavnin talked about the impact of sanctions on users and further Internet fragmentation in countries in conflict. Nathalia Foditsch talked about the impact of the US embargo on Cuba, which results in low-quality Internet connectivity and also affects freedom of expression in the country. Moving on to potential solutions, Farzaneh mentioned immunity as a consideration, for example for ICANN and the RIRs, so that they can operate globally without being affected by sanctions regimes. In response to a question in the chat from Marilia Maciel about not only economic sanctions but also trade restrictions, Farzaneh talked about security updates for equipment and how these restrictions can also affect availability of service and connectivity. Farzaneh also mentioned the treaty protections for national post offices that protect the traffic of parcels, although she did not advocate a treaty-based solution for Internet sanctions. She asked which Internet services, if sanctioned, can reduce the online presence of individuals. There was a question in the room about the humanitarian impact of unilaterally imposed measures. There was another comment in the room in support of sanctions on Russia, arguing that misinformation is not a consequence of an Internet fragmented by sanctions but a failure of civil society and digital rights activists. There was a counterargument that sanctions affect innocent citizens and the flow of information. The conversation continued on how resilient the Internet is and how difficult it is to prevent content blocking, as network operators are over-compliant with local regulations. Nathalie Jaarsma explained that sanctions are always political, but they need to consider proportionality; their impact needs to be thoroughly considered, as well as how exemptions should be handled, and help from the multistakeholder community should be welcomed. 
The session concluded with a suggestion to have a multistakeholder approach to map the chain of effects of sanctions.    

IGF 2022 Open Forum #86 The video games sector at the time of NFTs and the Metaverse

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

The metaverse, NFTs and AI are frontier technologies of increasing public interest. They are all closely linked to the born-digital video game industry, which is forward-looking and transnational, or borderless, and therefore well placed for predicting future technology trends. Frontier technologies raise significant policy questions, notably owing to the divergent approaches to frontier technologies in the field of IP.


This is why the work of the World Intellectual Property Organization (WIPO) in collaboration with other actors is important so as to increase the awareness of the IGF community about these issues, and also equip people and businesses to take advantage of the opportunities available.

IGF 2022 Open Forum #78 Guaranteeing Universal Digital Rights for All

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

The digital space mirrors and amplifies our physical world, which is rife with discrimination and inequality that profoundly and disproportionately affects people with intersecting identities such as gender, disability and migrant status. Yet existing laws mostly focus on the physical body and not the digital aspects. This needs to change to acknowledge online harms. If we are not secure online, we cannot fully exercise our human rights.


Addressing the continuum of violence against women and girls, which manifests online and offline, requires a holistic approach, like the Istanbul Convention’s four pillars: Prevention, Protection, Prosecution and Co-ordinated Policies. This includes laws and supporting measures preventing all types of violence, protocols preventing stigmatisation and revictimisation, and accessible pathways to justice, including removal of online content.

Calls to Action

We are calling for the adoption of a universal digital rights framework, rooted in human rights law and underpinned by an intersectional feminist, anti-discrimination analysis. The Global Digital Compact provides an opportunity for this approach to be embedded in globally shared principles for an open, free and secure internet for all.

IGF 2022 Town Hall #45 Town Hall: Internet & Jurisdiction Policy Network

Updated:
Avoiding Internet Fragmentation
Key Takeaways:

The Secretariat of the I&JPN is pleased to announce the launch of the new Internet & Jurisdiction Regional Status Report on "Framing, Mapping and Addressing Cross-Border Digital Policies in Africa". The report explores why policy coordination is important to building an inclusive and vibrant digital economy in Africa. It identifies key trends taking shape across the region and presents opportunities and challenges.


To spur the development of innovative cross-border data governance frameworks in Africa, the Datasphere Initiative is launching a region-wide effort to build an Africa Forum on Sandboxes for Data. The Forum will leverage the know-how and community of regional policymakers curated through the I&JPN’s Cross-border Digital Policies for Africa Project, which led a mapping of data policy approaches across the continent.

Session Report

The Internet & Jurisdiction Policy Network & Datasphere Initiative Town Hall took place on Thursday, December 1, 2022, from 5:30 pm to 6:30 pm EAT. It provided stakeholders with an update on the work in 2021/2022, shared announcements of upcoming projects, and explained how IGF participants can join the efforts.

The session presented the Internet & Jurisdiction Policy Network Regional Status Report "Framing, Mapping and Addressing Cross-Border Digital Policies in Africa", which has been produced with the support of the Deutsche Gesellschaft für Internationale Zusammenarbeit (GIZ) on behalf of the German Federal Ministry for Economic Cooperation and Development (BMZ). 

Participants discussed findings from the Report and why policy coordination is important to building an inclusive and vibrant digital economy in Africa. Panelists identified key trends taking shape across the region and presented opportunities and challenges for government, private sector, and civil society actors to consider.

Marking the 10th anniversary of the Internet & Jurisdiction Policy Network, members of the Contact Groups shared updates on the work of the Internet & Jurisdiction Policy Network Policy Programs (Data & Jurisdiction, Content & Jurisdiction, and Domains & Jurisdiction), which are developing concrete proposals to support legal interoperability in specific policy areas related to domain name system abuse, content moderation and access to electronic evidence. 

The Domains & Jurisdiction Contact Group has been working on addressing some cooperation and coordination challenges arising in the mitigation of botnets at a global scale and identified key questions that need to be addressed. Participants discussed recent outcomes including the Framing Brief on Improving the Workflow of Fighting Botnets: Handling Algorithmically Generated Domains (AGDs) which provides a brief outline of the way tackling AGD botnets is currently undertaken.

Speakers also shared updates on the work of the Content & Jurisdiction Program including work on the complex question of the geographical scope of online content restrictions and how this issue emphasizes the fundamental tension between the cross-border internet and territorial national laws.  

Regarding the Data & Jurisdiction Program, the Contact Group Coordinator shared updates on the work and how the discussions have been leveraged in policy processes around cross-border access to electronic evidence. The I&JPN Toolkit on Cross-border Access to Electronic Evidence outlines the ways in which data flows and privacy can be reconciled with lawful access requirements to address crime. The Toolkit intends to inform public, private, and civil society actors in their own activities and interactions in developing and implementing alternate practices for cross-border access to electronic evidence.

Finally, the I&JPN shared news of the new organization it has incubated, the Datasphere Initiative, which seeks to build agile frameworks to responsibly unlock the value of data for all. 

Data is growing at an accelerated pace and increasingly underpins, affects, and reflects most human activities. However, legitimate concerns have emerged regarding security threats, economic imbalances, and human rights abuses that can impact a society increasingly dependent upon data. The Datasphere can be defined as the complex system encompassing all types of data and their dynamic interactions with human groups and norms. Approaching the ecosystem in which all digital data exists as the Datasphere provides the fundamental perspective shift needed to govern data for the well-being of all. The Datasphere Initiative is a global network of stakeholders fostering a holistic and innovative approach to data governance. The speakers shared recent outcomes of the Datasphere Initiative and how IGF attendees can join as a partner or a friend. 

The session also launched a new paper by Datasphere Initiative Senior Fellow 2021/2022 Tim Davies, which provides a preliminary mapping of academic literature on data governance and explores how the conceptual framework of the “Datasphere” can help bridge research silos. Participants discussed how there is no single academic field of data governance to speak of; the literature related to data governance is growing and covers many different disciplines, from computer science to health.  

The session also announced the launch of the Africa Forum on Sandboxes for Data (2023) which will build a pan-African community to enable innovative cross-border data governance solutions. The multistakeholder process will invite local, regional, and global experts to explore the ways in which regulatory and operational sandboxes could facilitate responsible data flows and exchange.

The Forum will leverage the know-how and community of regional policymakers curated through the Internet & Jurisdiction Policy Network’s Cross-border Digital Policies for Africa Project, which led a mapping of data policy approaches across the continent with contributions from regional actors including the African Union Commission and Research ICT Africa. Self-paced learning modules and certification will be developed to equip policymakers, and in particular Parliamentarians, on data policy issues.

IGF 2022 Open Forum #44 Enhancing cybersecurity of the National Administration

Updated:
Enabling Safety, Security and Accountability
Calls to Action
Cybersecurity affects us all differently: governments create the ecosystem, the private sector creates the technology, and end users, as citizens, absorb the technology.
Session Report

This open forum focused on how to enhance cybersecurity and tackle cybercrime, as well as the efforts different governments are making on this important issue.

In the multi-stakeholder model, the private sector and governments are singled out as the main stakeholders. But today there is no single model for cyberspace governance, as it depends on the issue at hand: the governance model for cybersecurity will differ from that for domain name system management, where governments act in an advisory capacity because all the work is done by technical experts and registries, with many diverse capabilities on the non-state side.

One concrete proposal is the creation of multi-stakeholder drafting teams to work on certain texts, while decision-making capacity would remain in the hands of governments, i.e., an intergovernmental body.

 

Latin America

One of the challenges for developing countries in matters related to cybersecurity is keeping up to date with technology, since they tend to acquire it from other countries, depend on their budgets, and find it difficult to follow up on progress once the security operation has stabilised.

Another important aspect is the lack of human resources: not many universities offer training courses in cybersecurity. Especially in Latin America, many of the best professionals go to work in developed countries, which is another big challenge to be faced.

On the other hand, Latin America, on its path towards digital transformation, also faces challenges in terms of infrastructure, digital economy and e-government. It requires the promotion of proactive public policies that, for example, enable the deployment of broadband services, both nationally and regionally, social inclusion and sustainable development, and finally, the use of ICTs for environmental protection.

Specifically, to improve cybersecurity in the national administration, Argentina has a national cybersecurity strategy, a national cybersecurity committee involving several ministries, and a public innovation security directorate, which issued a policy called "minimum information security requirements for national public sector organizations". This policy sets out the minimum information security requirements that are mandatory for these ministries and other parts of the national administration, which are the main recipients and producers of information in the country.

In the Dominican Republic, for example, alliances have been made with the private sector to bring telecommunications services to remote communities that no provider would otherwise reach. Engaging the regulator and NGOs is part of the effort to bring connectivity to the people who need it most, so that the Internet can be an enabler of their transformation.

 

EU

The EU is the world's largest provider of development aid. Through the CyberNet project it tries to support efforts by raising cybersecurity awareness among both donors and recipients.

For CyberNet, the big challenge comes when designing trainings or selecting experts, as the result must be something the recipient feels they have really contributed to; otherwise the work will be quickly forgotten, and the policies that have been drafted, and even adopted, will never be implemented.

In Germany, they have a national cybersecurity strategy that describes the tasks and responsibilities of each and every part of the public administration.

The strategy also describes the second level, which is a society-wide approach.  Therefore, the technology sector, telecommunications providers, critical infrastructure providers, academia, civil society actors, etc. are also included. Thus, the multi-stakeholder approach is reflected.

The third layer is embedded in an EU framework, where the network and information security directive gives instructions on what to do at the national level. There is an EU cyber defence posture, and an EU cybersecurity strategy that describes tasks carried out jointly, such as capacity building.

A big challenge for Germany is that this multilevel, multipronged perspective on cybersecurity creates for the government a sense of loss of control. 

Another big challenge is to understand the impact that developing technologies will have on national security, due to their high speed of development. 

The third challenge has to do with this delicate balance between privacy and individual rights.

The fourth challenge is misinformation, since in the German system the mechanisms dealing with this problem are completely separate from cybersecurity.

 

Conclusions:

Cybersecurity affects us all differently: the government creates the ecosystem, the private sector creates the technology, and end users, the citizens who absorb the technology, need different levels of knowledge and different levels of information available to them. 

IGF 2022 Town Hall #98 Launch of the Coalition for Digital Africa

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Working in partnership to grow African Internet infrastructure to support the development of Africa’s digital economy, the Coalition for Digital Africa is timely. More multistakeholder partnerships are needed on the African continent to ensure a stable, resilient and secure Internet, which is the backbone of growth.

Calls to Action

The Coalition is open, and new projects can be developed along with new partners that fit the Coalition's Guiding Principles.

IGF 2022 WS #247 It is for all: Meaningful access and affordable Internet

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Meaningful access and affordable Internet have a gender component that we cannot set apart from the way we address, develop, apply and measure the results of policies aimed at women and gender-diverse people in ICTs, and the Internet in particular.


Access to devices or infrastructure to connect is not enough; beyond that, we need education, one that can articulate with industry and promote full inclusion of women and gender-diverse people at all stages of using, developing and implementing ICTs.

Calls to Action

Think of policies that can articulate and increase the presence of women and gender-diverse people online, beyond just education, access to devices and the building of infrastructure.


Take into account the diverse cultural aspects involved in defining gender, and how these assumptions actually play a role in the empowerment and enjoyment of all digital rights for women and gender-diverse people globally.

IGF 2022 DC-DDHT Community Connection to Ehealth, Telemedicine & MIoT

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

Accessibility issues are critical matters that must be addressed to enable universal access to ehealth

Many stakeholders at different stages of the ehealth development life cycle make up the ehealth ecosystem; sharing information amongst them is critical for attaining the universal SDG #3.
Calls to Action

Ensure all ehealth systems have addressed accessibility issues, and provide ehealth operators with the education to enable effectiveness.


Develop easily accessible spaces for stakeholder conversations on ehealth development.

Session Report

Session: IGF 2022 DC-DDHT Community Connection to Ehealth, Telemedicine & MIoT, presented by the United Nations Internet Governance Forum’s Dynamic Coalition on Data Driven Health Technologies.

The overall context of the session was the opportunities available for communities, from the international space to the village, to support each other in the development of ehealth systems. Fair and equitable internet access is an opportunity to support a resilient ehealth service, so as to reach United Nations Sustainable Development Goal #3, Health and Well-being for All. The support of "good samaritans", who may be a neighbour with internet skills or a compassionate internet philanthropist, is sought to enable this end goal. The session commenced with a status update on ehealth, followed by a discussion of issues pertinent to onboarding a broader group of users to the medical internet of things. The DC DDHT onboarding to MIoT tool kit was launched, with an invitation for everyone to contribute to developing this online information tool. A general question and answer period followed. The session was moderated by Ms. Amali De Silva-Mitchell (DC Coordinator) online and Dr. Amado Espinosa onsite.

Dr Joao Rocha Gomes reflected on the use of emerging technologies for ehealth and spoke about the role of patient self-monitoring (internet as a tool). There is the need to increase the active engagement from the patient (often through the use of the internet, for patient specific care) for this self-help activity, which will lead to the resilience and opportunities for greater capacity for the ehealth system. He spoke of the use of AI to screen and profile patients for treatment options, saving time at patient intake (more patients could be triaged), again increasing the capacity and efficiency of the ehealth system.  

Mr. Sean Dodge, Royal Bank of Canada, RBC Capital Markets US, highlighted the tremendous growth in financial investment within the healthcare technology sector. He pointed out that non-traditional healthcare entities, such as a large consumer brand company and a large database software technology company, are now investing in the space. He noted the changing behavioural profiles of patients, who are now rapidly seeking telemedicine options (40+%), which came into active use during the COVID pandemic, as their preferred way of engaging with their healthcare provider.  

Mr. Gerry Ellis of Feel the BenefIT and Ms. Lidia Best, President of the European Federation of Hard of Hearing People (EFHOH), spoke of their work with the ITU/WHO working group, which in June 2022 set up a universal standard for enabling accessibility to the internet for all persons with visible and invisible disabilities. An inclusive internet is one that is open for use by all peoples (diversity) of the world in a fair and equitable manner. The medical internet of things (MIoT) can only be effective if its ehealth systems (including devices) are compliant with universal accessibility standards. They also noted that disabled persons should be given additional time to navigate internet-based technology, for example by extending time-outs for web page access and so forth.

The general discussion that ensued was active and covered a broad set of topics. It was noted that the opportunity for conversations and information sharing amongst stakeholders is important for the development of the ehealth ecosystem. The full-length YouTube recording of the session is available for viewing on the IGF 2022 website. The webpage with access to the DC online book and tool kit is available at: Dynamic Coalition on Data Driven Health Technologies (DC-DDHT) | Internet Governance Forum (intgovforum.org)

Reported by Ms. Amali De Silva-Mitchell, Coordinator UN IGF DC DDHT

IGF 2022 Town Hall #92 Ethics and Regulation of Emerging Technologies & AI

Updated:
Addressing Advanced Technologies, including AI
IGF 2022 Open Forum #94 Privacy Risk Management in AI

Updated:
Governing Data and Protecting Privacy
Key Takeaways:

Privacy and data protection are central to the governance of AI, and all stakeholders in the AI supply chain have a role to play in upholding privacy rights.

Calls to Action

Reinforcing the role of national and sub-national privacy authorities, they should be engaged and consulted by national governments and international institutions as AI frameworks are created.

IGF 2022 Open Forum #70 Open Forum: Shaping digital platforms for the public good

Updated:
Addressing Advanced Technologies, including AI
Key Takeaways:

UNESCO is developing a draft regulatory framework for digital platforms that will address harmful content such as mis- and disinformation and hate speech, while safeguarding freedom of expression and other human rights. The framework will focus on processes and will be guided by the principles of transparency, content management policies consistent with human rights, user empowerment mechanisms, accountability, and independent oversight.

Calls to Action

The draft framework should provide a set of principles for social media platforms to fulfil their due diligence obligations regarding management of content that damages democracy and human rights. It should be a contribution to the global conversation on online content moderation to empower users, in particular the most vulnerable groups as well as users of minority languages.

IGF 2022 Open Forum #68 Our Action Plan for a Sustainable Planet in the Digital Age

Updated:
Connecting All People and Safeguarding Human Rights
Key Takeaways:

CODES is going forward with its nine Impact Initiatives and presented its current status on Impact Initiatives 5, 7 & 8.


Looking forward to the Global Digital Compact, CODES is eager to set the topic of digital and environmental sustainability on the GDC's agenda.

Calls to Action

Please feel free to join and connect with other CODES members via our Sparkblue page: https://www.sparkblue.org/CODES

IGF 2022 DC-Blockchain: Model Law on Decentralized Autonomous Organizations (DAOs)

Updated:
Key Takeaways:

DAOs face significant legal uncertainty that can be detrimental to their development and utilization. The DAO Model Law seeks to provide legal certainty for DAOs and their participants and, unlike other extant and proposed regulatory frameworks, accommodates flexibility for unregistered DAOs and their unique features, enabling further innovation.

Calls to Action

During the session it was identified that the issues of taxation, insolvency, and regulatory sandboxing are of growing interest in countries that want to provide legal recognition to DAOs, as is the question of how this can be assimilated within the wider permissionless blockchain ecosystem.

IGF 2022 WS #71 The Good, the Bad and the Ugly; online gender violence

Updated:
Enabling Safety, Security and Accountability
Key Takeaways:

More data is needed to train AI tools to help tackle online gender-based violence. The problem is that society is normalising men as perpetrators; more investment needs to be made in educating boys and men on this matter as well. Why does one gender need to make an extra effort to be safe online? Many women and girls are tired of the advice society is giving them and the language that has been used. We are also putting an extra burden on women and girls to feel empowered online and offline.

Session Report

IGF workshop: Online gender-based violence

Online participants: 27, including Bangladesh remote hub

Onsite participants: 50+

Opening statements

Women and girls are disproportionally victimized. Gender-based violence is mainly due to power imbalance between women and men. (Roberta Metsola)

Online gender-based violence has become a driver for gender inequality when it comes to internet access. However, with the support of digital technologies, the process to identify victims of gender-based violence and their abusers has been simplified. (ITU)

It’s frustrating to see that mainly women and girls are victims of online violence. This leads to them leaving online spaces, and being less empowered alone. This is a very disappointing trend to see. (Youth IGF)

Through the work done by the Digital Rights Foundation it is clear that there is a huge lack of awareness among women and girls about their rights and the corresponding policies. Resources to educate women and girls on this matter are still missing. A multi-stakeholder approach needs to be followed to collectively tackle this trend globally, while making sure it fits needs and policies at the national level. (Digital Rights Foundation)

Gender stereotypes are still very widespread, which in many cases leads to online gender-based violence. To counteract this, in recent years Meta has actively put reporting mechanisms (machine learning) in place to react better. However, collaboration with civil society organisations is vital in order to properly educate citizens about this matter. (Meta)

Question to panel: Women and girls have the right to participate equally, how are we going to achieve this?

Youth IGF: Three key pillars: Industry, law enforcement and empowerment of women and girls: Industry and law enforcement have the power to actually act in order to achieve this goal. In addition, women and girls need to be more empowered, especially when it comes to their online presence and the pressure of showing an “ideal” image of themselves.

Digital Rights Foundation: Political will is the most important element here. A multi-stakeholder approach is key; however, governments need to do more, especially when it comes to implementing laws, as these are often misread and misunderstood. In this regard, capacity building of judges is fundamental.

Meta: Funding civil society is vital! Industry and governments need to invest more in the work of civil society organisations, which have proven, evidence-based solutions but so often need funding to continue their great work.

David: The normalization of gender-based violence is concerning; it is a cultural issue we are facing.

ITU: More capacity building (especially in the Africa region) needs to be done; girls and women need to be better equipped with skills for staying safe online and protecting their privacy. Educating about cyber hygiene is fundamental. ITU is making great efforts in this regard across Africa, where the issue is especially worrying.

  • More data is needed to train AI tools to help tackle online gender-based violence.
  • The problem is that society is normalising men as perpetrators; more investment needs to be made in educating boys and men on this matter as well.
  • Why does one gender need to make an extra effort to be safe online? Many women and girls are tired of the advice society is giving them and the language that has been used. We are also putting an extra burden on women and girls to feel empowered online and offline.
  • Resources, community guidelines and reporting mechanisms on platforms need to be made available in local languages.
  • Inclusivity is key, taking intersectionality and vulnerable groups into account.
  • Educating people (mainly men) about their privilege needs to be enforced. But who can be made responsible for this task?

Final statements by speakers:  

  • ITU: Many men are working alongside women and girls and are keen to support them on this matter.
  • Digital Rights Foundation: Treating women as human beings, and not only as mothers and daughters, will eventually close the gender equality gap.
  • Meta: Efforts on this matter will continue in the best way possible.
  • Youth IGF: Inclusivity and making sure vulnerable groups feel represented is key.