
IGF 2018 TECHNICAL & OPERATIONAL TOPICS

    Room
    Salle I (Main)

    Main session on Technical and Operational Issues
    Content Blocking and Filtering: a challenge for Internet growth

    1.     Title/Date and Time/Length of the Session

    Title: Content blocking and filtering: a challenge for Internet growth

    Length: 80 minutes

    Format: Panel discussion

    Venue: Nov 14 / Salle I / 10:00 am to 11:20 am

    2.     Brief Description/Objective

    As RFC7754 states, the Internet is structured to be an open communications medium. This openness is one of the key underpinnings of Internet innovation, but it can also allow communications that may be viewed as undesirable by certain parties. Thus, as the Internet has grown, so have mechanisms such as content blocking and filtering to limit the extent and impact of abusive or objectionable communications. The approach to blocking and filtering that is most coherent with the Internet architecture is to inform endpoints about potentially undesirable services, so that the communicants can avoid engaging in abusive or objectionable communications. Technical implementations of filtering and blocking techniques misuse Internet protocols (such as the DNS) and can cause unintended consequences for third parties. These technical implementations have limited efficacy and can be easily circumvented, compared to the alternative, which requires dialogue, collaboration and due process, supporting how the Internet actually works. Because all communication over the Internet is facilitated by intermediaries (such as Internet access providers, social networks, and search engines), and the policies governing their legal liability for the content of these communications have a large impact on users’ rights, it is important to balance the needs of stakeholders, incorporating baseline safeguards and best practices based on international human rights instruments and other international legal frameworks.

    The session is linked to SDG9.

    3.         Agenda

    • Set the scene and session focus
    • Understanding Internet architecture and content filtering/blocking and adverse effects
    • Why content filtering/blocking? Why not?
    • What is the impact on the operations of a network/content provider? Risks, costs, technical
    • What can we do about it? Open microphone  

    4.     Policy Questions

    • How important for policy development around content filtering/blocking is a multistakeholder, bottom-up, consensus-building approach? Which stakeholders should be included, and how?
    • Any blocking order of unlawful content must be supported by law, independently reviewed, and narrowly targeted to achieve a legitimate aim. How is that process conducted in practice?
    • What steps should be taken to minimize negative side-effects for service providers (risk/compliance, cost, technical expertise) and for users/consumers?

    5.     Chair(s) and/or Moderator(s) (Confirmed)

    Organizers:

    • Sylvia Cadena, Female, Technical Community, APNIC Foundation. Colombia/Australia
    • Sumon Ahmed Sabir, Male, Technical Community, Fiber@Home. Bangladesh

    Moderator:

    • Bill Woodcock. Male, Technical Community, Packet Clearing House. United States

    Remote Moderator:

    • Carolina Aguerre, Female, Academia, Argentina

    Rapporteurs:

    • Sylvia Cadena, Female, Technical Community, Colombia/Australia
    • Bhadrika Magan, Female, Technical Community, New Zealand

    6.     Panelists/Speakers

    Set the scene by MAG members (confirmed)

    • Danko Jevtović, Male. Technical Community, ICANN Board, Serbia
    • Sumon Ahmed Sabir, Male, Technical Community, Fiber@Home. Bangladesh

    Panelists (confirmed):

    • Peter Koch, Technical Community. DENIC. Male
    • Andrew Sullivan, Technical Community. ISOC. Male
    • Sébastien Soriano, Government. ARCEP (Autorité de régulation des communications électroniques et des postes). Male
    • Irene Poetranto, Academia. CitizenLab - University of Toronto. Female
    • Alexander Isavnin, Civil Society. Internet Protection Society. Male
    • Mariko Kobayashi, Youth Representative. Master’s student, Keio University. Female

    7.         Plan for in-room participant engagement/interaction?

    Comments and Q&A will follow each segment of the session, making use of the facilities in the room (microphones) to enable the audience and remote participants to join the conversation. An open-mic session will be conducted at the end.

    8.     Remote moderator/Plan for online interaction?

    There will be a remote moderator to assist with the online conversation on the WebEx platform. The plans for the session will be posted prior to the event. Questions and comments will be gathered via Facebook/Twitter to enrich the on-site participation. The organizers will liaise with the IGF Secretariat to engage remote hubs to gather input prior to the event, in case the real-time options are too difficult to handle.

    9.         Connections with other sessions?

    WS 180 Net neutrality and beyond: ensuring freedom of choice online

    https://www.intgovforum.org/content/igf-2018-ws-180-net-neutrality-and-beyond-ensuring-freedom-of-choice-online

    10.       Desired results/outputs? Possible next steps?

    • A shared understanding of the technical design and architecture of the Internet, how it works, and how important/relevant it is to adhere to standards and the appropriate use of Internet protocols

    • A shared understanding that fast and hard approaches to dealing with harmful/unwanted content on the Internet can have unintended consequences and an impact on its operations

    • A shared understanding that the best way forward to tackle the challenge posed by harmful/unwanted content requires dialogue, collaboration and due process


    IGF 2018 Long Report

    - Session Type (Workshop, Open Forum, etc.):

    Panel

    - Title:

    Content blocking and filtering: a challenge for Internet growth

    - Date & Time:

    Nov 14, 10:00 AM to 11:20 AM

    - Organizer(s) by MAG members:

    Sylvia Cadena (Female, Technical Community, APNIC Foundation. Colombia/Australia)

    Sumon Ahmed Sabir (Male, Technical Community, Fiber@Home. Bangladesh)

    - Chair/Moderator:

    Bill Woodcock (Male, Technical Community, Packet Clearing House. United States)

    - Rapporteur/Notetaker:

    • Sylvia Cadena (Female, Technical Community, Colombia/Australia)
    • Bhadrika Magan (Female, Technical Community, New Zealand)

    - Remote moderator:

    • Carolina Aguerre (Female, Academia, Argentina)

    - List of speakers and their institutional affiliations (Indicate male/female/ transgender male/ transgender female/gender variant/prefer not to answer):

    • Danko Jevtović, (Male. Technical Community, ICANN Board. Serbia)
    • Sumon Ahmed Sabir (Male. Technical Community, Fiber@Home. Bangladesh)
    • Peter Koch (Male. Technical Community, DENIC. Germany)
    • Andrew Sullivan (Male. Technical Community, Internet Society. Canada)
    • Sébastien Soriano (Male. Government, ARCEP. France)
    • Irene Poetranto (Female. Academia, CitizenLab - University of Toronto. Canada)
    • Alexander Isavnin (Male. Civil Society, Internet Protection Society. Russia)
    • Mariko Kobayashi (Female. Youth Representative, Master’s student, Keio University. Japan)

    - Theme (as listed here):

    Technical and Operational Issues

    - Subtheme (as listed here):

    Content Blocking and Filtering

    - Please state no more than three (3) key messages of the discussion.

    Understanding the architecture of the Internet and how it works, and the relevance of adhering to standards and the appropriate use of Internet protocols, is a cornerstone of any discussion around the stability and growth of the Internet.

    Fast and hard approaches to dealing with harmful/unwanted content on the Internet can have unintended consequences and a negative impact on its operations.

    The best way forward to tackle the challenge posed by harmful/unwanted content requires dialogue, collaboration and due process, respecting fundamental civic rights.

    There is great need for careful, responsible, peer-reviewed and evidence-based research around content blocking and filtering and its real implications for censorship and surveillance.

    - Please elaborate on the discussion held, specifically on areas of agreement and divergence.

    The session started by setting the scene, discussing the focus of the session around content blocking and filtering, its technical implementations and operational implications.

    Mention of any technology does not imply any judgment about its fitness for purpose or any endorsement of its use.

    Two MAG members, Danko Jevtović and Sumon Sabir, opened the session.

    Mr. Jevtović stated that as the Internet developed and grew, the application of technical implementations for content blocking and filtering became almost a necessity. However, it has also become clear that the use of such technical deployments, without following due process or considering the operational implications, has negatively impacted access to legal content, as well as increasing the burden on network operators to comply with filtering and blocking requests. At the same time, the implementation of fast and hard approaches to dealing with harmful/unwanted content on the Internet seems to minimize the space for dialogue in which the interests of all stakeholders are discussed and due process is observed.

    Mr. Sabir described the different ways content filtering has been used for many years, both at home and at work. At home, many families use gateway filtering to safeguard children. At the office, employers might block access to entertainment or social networks to avoid distractions and improve productivity. ISPs also implement filters and firewalls to protect their infrastructure. The growth of what might be considered harmful content has led some countries to require ISPs operating in their territory to apply blocking and filtering rules for the whole country, at unprecedented scale. At the same rate that filtering and blocking techniques are implemented, ways to circumvent them become available. Techniques such as filtering or blocking IP prefixes identified as the source, URI/URL blocking, and DNS blackholing are becoming less effective as more content is transmitted via HTTPS/TLS encryption and as DNSSEC implementation progresses. Requests from law enforcement to take down content have also increased significantly. The transparency reports from Google and Facebook show that almost every country in the world has submitted such requests, most of them citing national security as the main reason.

    The agenda for the session continued with two panellists from the Technical Community, Peter Koch and Andrew Sullivan.

    Mr. Koch started his intervention with a basic concept about how the Internet works, referring to how anything transmitted over the Internet is chopped into small pieces called packets, which travel independently of each other. As such, no content appears at any place in the network in its entirety, and therefore, with some exceptions, it cannot be inspected or judged in total at any point in time, except, of course, at the end point, which is either the source of the traffic or the device in front of the consumer. This very characteristic has a direct correlation with how effective content blocking and filtering methods can be. There is a variety of technical implementations for filtering/blocking, from the simplest option of blocking IP addresses, to blocking the resolution of domain names (DNS), DNS over HTTPS, traffic interception, blacklisting, transport layer security (TLS), and content delivery networks (CDNs).

    For example, stopping employees from accessing a website, or protecting them from a phishing website, botnets or malware, can easily be done by blocking the IP address of the website in question at their first route to the Internet; the traffic will then not flow back and forth, and employees will not be able to access that website. However, content can be replicated, and the technologies that help manage load spikes also help distribute content, so IP blocking can be easily circumvented.
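    As a purely illustrative sketch of the decision described above (in Python; the addresses come from the RFC 5737 documentation ranges, not from any real blocklist):

    ```python
    # Minimal sketch of destination-IP blocking at a network perimeter.
    # The blocklist entries are placeholders from documentation ranges.
    BLOCKED_IPS = {"192.0.2.10", "198.51.100.25"}

    def allow_packet(dst_ip: str) -> bool:
        """Drop any packet whose destination address is on the blocklist."""
        return dst_ip not in BLOCKED_IPS

    print(allow_packet("192.0.2.10"))   # False: the blocked site is unreachable
    print(allow_packet("203.0.113.7"))  # True: all other traffic flows normally
    ```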

    Another method is called blackholing. Instead of doing the filtering close to the user, as in the example above, the blocking of an IP address or a range of addresses is done at the ISP or IXP level. This affects all the packets going back and forth between anyone using the Internet and the address range in question. The further away from the source the filtering is applied, the more organizations have to interfere with the packet flow. That also means that some process should be in place to determine the ranges of addresses to block, and how many organizations should be involved for it to be effective.
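    A minimal sketch of the same idea applied further from the user, where a border router discards everything destined for a blackholed prefix (the prefixes are hypothetical documentation ranges):

    ```python
    import ipaddress

    # Hypothetical blackholed prefixes installed at an ISP/IXP border router.
    BLACKHOLED = [ipaddress.ip_network("203.0.113.0/24"),
                  ipaddress.ip_network("198.51.100.0/25")]

    def next_hop(dst_ip: str) -> str:
        """Send traffic for a blackholed prefix to the discard interface."""
        addr = ipaddress.ip_address(dst_ip)
        if any(addr in net for net in BLACKHOLED):
            return "null0"            # silently dropped for every host in range
        return "default-gateway"

    print(next_hop("203.0.113.99"))   # null0: collateral for the whole /24
    print(next_hop("192.0.2.1"))      # default-gateway: unaffected
    ```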

    Blocking a website can also be done by blocking its domain name. This type of interference at the ISP level stops the resolution of a particular domain name, suppressing the translation or mapping of the domain name to an IP address. However, if the user knows the IP address from which the content originates, and that IP address is not blocked, they will still be able to reach the content. Another method at this layer is called domain name takedown, which means removing the domain name from the DNS so that it no longer resolves. As in the previous example, the content will still be available on its server, although unreachable by name, as the domain name will not resolve.
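    The limitation described here can be shown with a short sketch: suppressing name resolution stops the lookup, but a client that already knows the server's address needs no lookup at all (the names and addresses below are illustrative):

    ```python
    import socket

    def fetch_via_name(hostname: str) -> str:
        """The step a resolver-level block or a name takedown interferes with."""
        try:
            ip = socket.gethostbyname(hostname)   # the DNS lookup being blocked
            return f"resolved {hostname} -> {ip}"
        except socket.gaierror:
            return f"{hostname}: resolution blocked or name taken down"

    print(fetch_via_name("blocked.example"))

    # A client holding the server's IP address skips the DNS entirely, so the
    # content stays reachable unless the address itself is also blocked:
    # socket.create_connection(("203.0.113.80", 80))
    ```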

    In the examples mentioned, the issue of “judging” content as harmful or illegal came up. There is no technical solution to apply such judgements; human intervention is required.

    Mr. Sullivan agreed with Mr. Koch’s statements and took a few minutes to reiterate that the blocking and filtering techniques described do not remove content from the Internet but block access to the point the content is coming from, which can include other sources of content not identified as harmful or illegal. The Internet is a network with a common numbering and naming space, formed from many different networks. This means it is not possible to block endpoints end-to-end in one place: the identified content would have to be blocked throughout all the cooperating networks, getting every network to block it, as there is no centre of control.

    RFC7754, written by the Internet Architecture Board, describes blocking and filtering mechanisms, but it is an informational document, not a manual on how to implement them. Such technical mechanisms are not in line with the architecture of the Internet, which is intended to be neutral.

    One of the challenges of inspecting content is that it is very hard to say whether intercepting traffic is legitimate, as interception puts into question the validity of the exchange, for example in banking transactions where Transport Layer Security (TLS) is used to encrypt the information. Where TLS is applied, filters do not work, and content cannot be blocked outside of its host.
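    A brief sketch of the mechanism behind this point: a standard TLS client verifies the server's certificate chain and hostname, so an in-path interception box presenting its own certificate causes the handshake to fail rather than the traffic being silently read (the hostname is illustrative):

    ```python
    import socket
    import ssl

    def open_tls(host: str, port: int = 443) -> str:
        """Client-side TLS handshake with default certificate verification."""
        ctx = ssl.create_default_context()    # verifies chain and hostname
        try:
            with socket.create_connection((host, port), timeout=5) as sock:
                with ctx.wrap_socket(sock, server_hostname=host) as tls:
                    return f"verified session using {tls.version()}"
        except ssl.SSLCertVerificationError as err:
            # An interception middlebox substituting its own certificate
            # fails verification here instead of silently reading traffic.
            return f"handshake rejected: {err.verify_message}"

    print(open_tls("example.com"))
    ```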

    When blocking content, the challenge is that, due to the large scale involved, blocking techniques are proving to be incompatible with policy goals at the macro level, like trying to prevent a house from being flooded by blocking the water at the molecular level.

    Take-down notices have no technical component. They are a legal notification to a content publisher or producer to stop publishing on the Internet, with a penalty applied if the specified content is not removed from the server that hosts it. That is the spirit of the legislation; there is no technical feature for doing it. To remove content from the Internet, it has to be removed at its source by the content publisher or producer.

    In response to a question from the audience about how to ensure there is no misuse of the techniques for content blocking and filtering, Mr. Sullivan answered that there is no “evil bit” identifying malicious packets, so in principle there is no way to make sure you are getting it right. He referenced RFC 3514, written as a joke, about the evil bit. Mr. Koch provided an analogy: if access to the streets of a red-light district is blocked, ambulances and postmen will also not have access. He offered another analogy about the inability to inspect packets, equating it to demolishing or closing down a whole building just because something inside that building was identified as bad: since it is impossible to know exactly where at that address the bad thing is, everything has to go.

    In the following block of the agenda, the session looked at the approaches of regulators in Europe to content blocking and filtering, presented by Sébastien Soriano.

    Mr. Soriano started his intervention by quoting Stewart Brand: “information wants to be free”. This brings great opportunities around the creation and exchange of information. Sadly, some individuals use this power to cause harm. These threats are defined differently by every country, according to its cultural sensitivities, balancing freedom vs. protection.

    Mr. Soriano referenced President Macron’s speech at the IGF Opening Plenary, where cyberattacks, fake news (especially around electoral processes), copyright infringement that threatens business opportunities, and hate speech were mentioned as unacceptable uses of the Internet that are harmful and should be dealt with.

    In that respect, Mr. Soriano highlighted that the strategies with which independent regulatory bodies in Europe deal with such threats are based on two very important principles: network neutrality (defined as non-discrimination in the management of traffic and freedom of choice for end users) and limited liability for hosting companies (defined as no legal responsibility for hosting illegal/harmful content if the company is not aware of it). In this context, if a public authority with a legitimate objective requests a hosting company to withdraw content, the company has to comply with the request. These requests are managed under a very clear process, respecting fundamental civic rights.

    France is studying the creation of a new status for hosting companies that accelerate the propagation of content. A pilot program will be implemented in 2019 in which teams from different French regulators will work with Facebook to explore this proposal in more detail. Responding to a question from the audience seeking clarification about this approach, he explained that the idea is to protect content producers (such as bloggers) from being treated under the same regulations covering content-accelerator platforms such as Facebook, as they have a very different reach and impact on society. The big companies accelerating content should be covered under a different definition. He highlighted the importance of rethinking regulation in the Internet age: not to micro-manage content, but to audit and verify how the content accelerators handle it.

    A member of the audience highlighted the delicate balance between determining what harmful/illegal content is and freedom of expression, in the context of how much the Internet can amplify access to such content. He asked whether this is the first attempt to redefine freedom of expression as it applies to different media.

    Mr. Soriano replied that the balance does not depend on the technology used but on the message being communicated; however, the capacity the Internet has to propagate information freely and widely requires specific responses. Such tensions have led many countries to defend freedom of expression more forcefully.

    Mr. Woodcock, the session’s moderator, also drew distinctions around technical implementations that threaten the neutrality of the network by allowing preferential treatment of some content providers while degrading performance for others, as well as implementations that provide access to certain content providers without restrictions or additional costs while charging for others (zero-rating). He observed that misbehaviour among unregulated, market-dominant companies based in the United States has an overflow effect, as regulators around the world have a difficult time enforcing regulations on them. In Europe, where privacy and individual-rights regulators are more actively discussing these complex issues, mostly generated by US companies, more ways to address these problems and find solutions are emerging.

    In the next block of the agenda, two speakers from civil society shared their views.

    Ms. Poetranto started her intervention by referencing the work of the CitizenLab of the University of Toronto, done in collaboration with the Berkman Center at Harvard University, to investigate and expose Internet filtering in over 70 countries around the world. The Open Net Initiative research found that countries censor the Internet in different ways and intensities, often under arguments as compelling and powerful as those mentioned before by Mr. Sabir and Mr. Soriano (copyright protection, false news, national security). She added the preservation of cultural norms and religious values to the list of reasons mentioned.

    Their research has documented widespread use of Western-made software solutions such as BlueCoat (a US company) or NetSweeper (a Canadian company). They have also found relevant issues with the accuracy of such deployments, described as under-blocking or over-blocking. Under-blocking occurs when the technology fails to block access to all its targets, while over-blocking refers to blocking content that was not intended to be blocked. One of the main problems identified is the incorrect classification used to apply such filters. The other problem identified was the lack of transparency and accountability when automation is applied. As governments partner with such companies, the danger posed by any of these problems when such content filtering schemes are applied nationwide is very explicit.
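    To make the two failure modes concrete, here is a deliberately naive sketch of keyword-based category filtering (the rule and the URLs are invented for illustration, not taken from any real product):

    ```python
    # One crude "category" rule, of the kind that produces misclassification.
    BLOCK_IF_URL_CONTAINS = ["lgbt"]

    def is_blocked(url: str) -> bool:
        return any(word in url.lower() for word in BLOCK_IF_URL_CONTAINS)

    print(is_blocked("https://lgbt-health-charity.example"))  # True: over-blocking
    print(is_blocked("https://adult-site.example"))           # False: under-blocking
    ```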

    In their most recent publication, from April 2018, they documented a pattern of mischaracterization in NetSweeper’s use of a category called “alternative lifestyles”, which results in over-blocking of LGBTQ content of a non-graphical nature, such as media and cultural groups, HIV/AIDS prevention organizations, and non-pornographic websites; this has serious human rights implications. The filter can also be set for specific countries.

    The CitizenLab continues its research in this area despite great risk. In 2015, it was sued by NetSweeper over a report it published about the use of such tools in the state-owned and operated YemenNet. The lawsuit sought over 3 million in damages and was discontinued in its entirety in 2016.

    Ms. Poetranto concluded her intervention by highlighting the great need for careful, responsible, peer-reviewed and evidence-based research around content blocking and filtering and its real implications for censorship and surveillance.

    Mr. Isavnin focused his intervention on his experience analysing content blocking and filtering in Russia over the last six years. There is a set of federal regulations under which content producers are required to take content down if requested; if content is not removed, ISPs are obliged to restrict access. He warned that once content filtering becomes an accepted practice in a country, it never stops, and the resources required only increase. In Russia, what started with a limited set of reasons to filter or block content has only continued to grow, in the reasons invoked as well as in the number of agencies involved. However, there is no evidence to support that filtering and blocking have worked for the purposes intended: no reduction in suicides, nor prevention of terrorist or drug-related crimes. A statistic from the European Union indicates that taking down sites increases revenues for content producers.

    The ambiguity and lack of clarity about what to block/filter leads to real operational expenses to get blocked resources removed from the lists. One of the most popular French video hosting sites, DailyMotion, is blocked in Russia without a clear way to get it off the list.

    When huge IP ranges were blocked in an attempt to control access to the Telegram messenger, it turned out that, because of shared hosting arrangements, over-blocking affected up to 95% of the sites hosted on those IP ranges. As strict enforcement and massive fines are applied by the government, ISPs have resorted to over-blocking. Huge resources are also devoted to further inspections. This has a direct impact on the technical capacity an ISP requires to legally operate in Russia: more computational power, more auditing power, more special equipment. It has also led to misuse of blocked IP addresses, appropriating them for internal infrastructure or to redirect other content for display as advertising on those blocked systems. The other risk for operators is playing an active role in the censoring of the political opposition.
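    A small sketch of why range blocking on shared hosting over-blocks: every site behind an address in the blocked prefix disappears along with the intended target (the prefix and the site-to-address mapping are hypothetical):

    ```python
    import ipaddress

    # Hypothetical /24 blocked to stop a single service on a shared host.
    blocked = ipaddress.ip_network("198.51.100.0/24")

    sites = {
        "target.example":  "198.51.100.10",
        "shop.example":    "198.51.100.11",
        "school.example":  "198.51.100.12",
        "library.example": "198.51.100.13",
    }

    collateral = [name for name, ip in sites.items()
                  if ipaddress.ip_address(ip) in blocked and name != "target.example"]
    print(f"{len(collateral)} of {len(sites)} sites blocked as collateral damage")
    ```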

    Mr. Isavnin stressed that RFC7754 is used in some countries as an IETF-approved manual for blocking content, and because of that it is important that the IETF and the IAB take a stronger position against the use of these techniques.

    Mr. Isavnin answered a question from Mr. Woodcock about how the costs of such implementations are covered. He stated that the blocking laws were passed in Russia without any further spending from federal budgets, so all costs are incorporated into what ISPs charge end-users, at least doubling the price and making access less affordable.

    A member of the audience raised a question about DNS over HTTPS. Mr. Koch explained that with this technology the mapping of domain names is done by the browser over HTTPS (like accessing a web page) instead of via the DNS, which circumvents interception. A technology called “safe browsing” allows users to identify the real content circulating over the HTTPS session. DNS over HTTPS is rather complex and is endorsed as an “anti-censoring” mechanism, just the opposite of blocking. However, the concentration that might result from its adoption by a small number of providers could enhance blocking opportunities at that smaller number of points of influence. This is not naïve or idealistic architectural purity, but a clear risk for Internet growth. DNS over HTTPS is considered a form of encryption. Mr. Woodcock mentioned the battle between the standards for DNS over HTTPS and DNS over TLS and how, in his view, it is more likely that DNS over TLS will prevail: it is more widely deployed and commercially offered through content distribution network providers, with privacy treated as part of what the user can choose to pay for in such a service. Mr. Sullivan highlighted that many other popular services on the Internet use the same approach, transferring over the web a certain type of content that is actually not web content: one protocol inside another. It is a very effective way to circumvent filtering and blocking, passing content over the most-used protocol, which makes it very costly to filter everything: using a protocol that we cannot afford to filter.
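    A minimal sketch of a DNS-over-HTTPS lookup using the JSON API that some public resolvers expose (the endpoint shown is Cloudflare's documented dns-query interface; its availability and response format are assumptions about that service):

    ```python
    import json
    import urllib.request

    def doh_lookup(name: str, rtype: str = "A") -> list:
        """Resolve a name over ordinary HTTPS instead of classic DNS."""
        url = f"https://cloudflare-dns.com/dns-query?name={name}&type={rtype}"
        req = urllib.request.Request(url, headers={"accept": "application/dns-json"})
        with urllib.request.urlopen(req, timeout=5) as resp:
            answer = json.load(resp)
        return [record["data"] for record in answer.get("Answer", [])]

    # To an in-path filter this query is indistinguishable from web traffic,
    # so blocking it means blocking the resolver's HTTPS service outright.
    print(doh_lookup("example.com"))
    ```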

    To finalize the panel interventions, Ms. Kobayashi brought to the session the impact that the adoption and deployment of these blocking and filtering techniques have on the youth. She mentioned three main consequences: 1) young Internet users are growing up with a limited view of the world, which may limit their ability to effectively articulate informed opinions and to participate in political processes; 2) these limits reduce their possibilities to develop businesses, to innovate and to create; and 3) they restrict their ability to share their own ideas. The youth, by asking questions of all stakeholders, can help to establish dialogue. Young engineers have also been involved in supporting the standardization of end-to-end encryption on the Internet.

    To wrap up the session, Ms. Cadena, one of the organizers of the session, invited the participants to continue the dialogue because the Internet is built by an ecosystem of organizations that are actively engaged in this process, and the more we continue the dialogue, the better the Internet is going to be.

    In summary:

    • The Internet is a network with a common numbering and naming space, formed from many different networks. The information that flows through the Internet wants to be free. Content blocking and filtering are a reality today, but it is not possible to control everything in the network. There are clear tensions between freedom of information and protection against security threats, as requests to network operators to remove or take down content have increased. As such, it is key that due process (a transparent and accountable process with unambiguous and clear rules and guidelines) is carefully followed, to address the concerns of users and governments while continuing to support the development and growth of the Internet.
    • Authoritarian governments as well as democracies filter and block Internet content in different ways. These requests are increasing, for reasons ranging from national security concerns to controlling the dissemination of phishing websites, malware, and botnets. However, there is no factual evidence that blocking or filtering actually works to lower suicide rates or terrorism, while there is clear evidence that collateral blocking hinders access to valuable information and has been used as a tool to repress freedom of expression and to quash dissent and opposing political views. There is also evidence of the additional cost of establishing government agencies to manage blocking and filtering.
    • There is a variety of technical implementations to filter/block, such as blocking IP addresses, blocking the resolution of domain names (DNS), DNS over HTTPS, traffic interception, blacklisting, transport layer security (TLS), and content delivery networks (CDNs), as well as the not-so-technical solution of a notice-and-takedown system. Such technical mechanisms are not in line with the architecture of the Internet, which is intended to be neutral.
    • Consideration of intermediary liability is part of the framework to consider when addressing issues around content blocking and filtering, to understand the burden placed on network operators to have the skills, the computing power and the budget to cover the legal costs of complying with such regulations. It is worth noting that these costs can double operational costs, which in turn can reduce affordability for users. Reviewing the existing regulatory definitions that characterize content producers can offer avenues to explore, and alternative solutions, for ways to protect and accelerate content provision that supports the growth of the Internet.
    • There is great need for careful, responsible, peer-reviewed and evidence-based research around content blocking and filtering and its real implications for censorship and surveillance.
    • As a result of these practices, young Internet users are growing up with a limited view of the world, which may limit their ability to effectively articulate informed opinions, participate in political processes, develop businesses, innovate and create, and share their own ideas.

    - Please describe any policy recommendations or suggestions regarding the way forward/potential next steps.

    The challenge posed by harmful/unwanted content to preserving an open and stable Internet that continues to grow requires an understanding of the architecture of the Internet and how it works, and of the relevance of adhering to standards and the appropriate use of Internet protocols, when reviewing the design and adoption of regulatory frameworks, definitions around content production, etc.

    Fast and hard approaches to dealing with harmful/unwanted content on the Internet have unintended consequences, negatively impact its operations, and limit its potential.

    Consideration is required of intermediary liability, of the burden placed on network operators and how that can reduce affordability for users, and of the unintended negative impact on Internet users, in particular the youth, who may grow up with a limited view of the world, limited in their ability to participate in society, innovate and share ideas.

    France is studying the creation of a new status for hosting companies that accelerate the propagation of content. As such, a pilot program will be implemented in 2019 where teams from different French regulators will work with Facebook to explore this proposal in more detail.

    There is great need for careful, responsible, peer-reviewed and evidence-based research around content blocking and filtering and its real implications for censorship and surveillance.

    - What ideas surfaced in the discussion with respect to how the IGF ecosystem might make progress on this issue?

    As the best way forward to tackle the challenge posed by harmful/unwanted content requires dialogue, collaboration and due process, the IGF offers a unique space to bring together the views, knowledge and expertise of different stakeholders and disciplines to understand the problems and come up with solutions, together.

    The IGF has a key role in facilitating the discussion around all these issues and bringing to light the perspectives of various stakeholders.

    The IGF is the best venue to discuss current practices and the possible consequences of different approaches, for the benefit of the Internet and, of course, of the UN Sustainable Development Goals.

    - Please estimate the total number of participants.

    Around 100 people were present in the room.

    - Please estimate the total number of women and gender-variant individuals present.

    Around 25 women were in the audience.

    - To what extent did the session discuss gender issues, and if to any extent, what was the discussion?

    The session addressed the technical and operational issues around content blocking and filtering. No specific discussion took place around gender in this session.