The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> ANASTASIA VLADIMIROVA: So I'm going to start, but please do let me know if the audio is still not working okay.
Hello, and welcome, everyone. It's so good to see people arriving still onsite, there are many people. And also those joining us online, thank you very much.
So yeah, unfortunately, we do have just one hour today for the session due to a confusion in scheduling. So I'm afraid we will have to skip the first part which was the poll, but if we have time we will do it at the end of the session.
So I will begin, I will dive right in with introductions. And I just wanted to start with introducing myself and my colleagues who are here with me today. My name is Anastasia Vladimirova. I'm a security leader at Huradox (?) and also a digital security trainer with the Cyberstar Project, a private sector foundation.
But today I'm actually not wearing any of those hats, I'm here in my personal capacity. And I'm very, very grateful to have this opportunity to speak and to share this platform with you at the IGF to moderate and to speak about the importance of encryption in the work of human rights defenders.
Because this is an issue that I care deeply about myself, and I witnessed as part of my work in the field firsthand.
I want to also introduce the outstanding and amazing speakers who have joined me here today. Namrata Maheshwari, who is Asia-Pacific Counsel at Access Now, an international nonprofit that works on digital rights. She also coordinates the organization's global work on encryption.
Mallory Knodel who is Chief Technology Officer at the Center for Democracy and Technology. Mallory is a public interest technologist who takes a human rights people-centered approach to technology implementation with a focus on encryption, censorship and cybersecurity.
And last, but not least, Neeti Biyani, Policy and Advocacy Manager at the Internet Society, a global nonprofit organization with the vision that the internet is for everyone and that works towards an open, globally connected secure and trustworthy internet. She's based in New Delhi, India.
Human rights defenders all over the world rely on encryption to protect themselves against an array of threats from authoritarian governments to businesses as well as private and nonstate actors.
Encryption helps human rights actors to ensure their own and their colleagues' security and safety, to protect personal, valuable and sensitive data, and to express themselves freely without fear of intimidation and repercussions from adversaries.
However, ever since encryption has made its way into the daily lives and work of human rights defenders, both non-democratic and democratic governments have been trying to restrict and undermine the use of encryption through a variety of means.
This deliberate targeting of encryption does not only affect the exercise of fundamental human rights such as the right to privacy and the right to freedom of expression, but it oftentimes indirectly targets human rights actors and disproportionately affects their work.
Here we wanted to show you a quick video that was produced by the Global Encryption Coalition as part of Global Encryption Day 2022. Unfortunately we don't have time for it now, so we'll share the link in the Zoom chat, and we definitely encourage you to check out those videos later on your own.
So without further ado, I actually want to dive right into the first question which I would like to address to Namrata and Neeti.
Namrata and Neeti, your organizations are at the forefront of the fight for digital rights and open internet globally, supporting human rights actors in many different ways from advocacy to direct digital security assistance.
Could you please speak about the role of end-to-end encryption in enabling privacy and security, particularly as it relates to the work of human rights actors? Thank you.
>> NAMRATA MAHESHWARI: Sure I can get us started and then hand it over to Neeti. Thank you so much, Anastasia, and thank you all for joining us both onsite and online.
Just to -- just so we know, can you hear me clearly? Yes, you can. Perfect.
Encrypted platforms enable privacy and security in an online environment where there are several blurred lines about how much of your information is accessed, how much of it is retained and by whom. And I think it is important to contextualize end-to-end encryption. Offline and online spaces are by no means directly comparable; the threat models are vastly different. But in the offline space, to a great degree, one has the option of secure spaces for communication.
And end-to-end encryption makes that possible in an online environment where communications data is otherwise generated, accessed, stored and even disclosed in ways that are not completely transparent or even comprehensible, especially in today's environment with the increasing use of surveillance technology and spyware. So on a daily basis encryption allows people to voice unpopular opinions freely, have private conversations with family and friends, share confidential business information, or have private conversations with their doctor, their lawyer or a journalist.
When such end-to-end encrypted safe spaces are not easily available, every individual is affected. But as Anastasia pointed out, it has a disproportionate impact on human rights defenders, activists, dissidents and vulnerable communities who are often targeted for what they say and who they speak with.
In regions hit by crisis or repressive regimes, online safety has a very tangible connection to physical safety as well. For example, communications data or location data online that is not secure is used to track down individuals and to persecute them. And here I will try to quickly share some observations that our 24/7 digital security helpline has had in terms of attacks on privacy and the need for secure channels for communication.
There has been a significant increase in cases from regions going through social unrest or some kind of social or political crisis. There is a high level of cases related to the account security of arrested activists. And we have also seen a huge increase in the adoption of encryption tools, especially by civil society and human rights defenders, and in the context of communication platforms for messaging and e-mail. This is true particularly in regions where there is a crisis of independent media safety, for example.
This tells us a bit about the kind of attacks we are seeing and people who are affected to varying degrees and the need for encryption especially in today's context.
Neeti, over to you.
>> NEETI BIYANI: Thanks so much, Namrata.
You know, very much like what Namrata has already laid out for us, if I were to add or draw a parallel between what an end-to-end encrypted space would look like if we were physically imagining it, it would be two people standing on a very isolated hill somewhere, where there is nobody to listen to their conversation and they can be sure that there are no eavesdroppers, absolutely no surveillance and no intrusion into the conversation.
And, you know, the privilege of having that closed door, or a conversation on a hilltop with someone that you want to share that information with and only with that person, that is the offline parallel of an end-to-end encrypted communication or data transfer system.
You know, one of the misconceptions that I have often come across is that end-to-end encryption is only limited to messaging and therefore it's extremely user facing. That is not true at all.
In fact, end-to-end encrypted spaces exist on e-mail, cloud sharing and data transfers as well. And, you know, end-to-end encryption is very, very closely connected with our physical safety also. It is the strongest digital security shield and leads to a safer internet.
Anyone who connects online does so with the expectation that their connection and their experience will be safe and trustworthy. And encryption is exactly that security technology, playing a vital role in our day-to-day lives, many times without us even realizing it. Most of us take for granted that our transactions online, or our ability to shop online, are encrypted, and that our data and our transactions are therefore safe from anyone trying to maliciously access them.
So very much like how you would lock your doors, or how you would ask your child to lock their room, encryption gives us that security shield against being eavesdropped upon, against being attacked by malicious actors, against being surveilled. And there has been enough research by technologists, companies and civil society to show how the privacy and security that encryption offers keep marginalized populations safe: LGBTQ+ communities, journalists, activists, dissidents, children, women, the elderly and especially human rights defenders. All of these communities in one way or another depend on encryption to make sure that they have safe spaces to work and interact online.
Now, with the lines getting blurred more and more, our online safety also determines whether we are safe offline. With location tracking and the ability to be identified based on our digital footprint, all of this has bigger and bigger ramifications for our physical safety as well. So on the security aspect I'm going to pause there and hand it back to Anastasia.
>> ANASTASIA VLADIMIROVA: Thank you very much, Neeti and Namrata, for those comments.
I want to go straight to my next question because I think it will build on what you just said.
Could you please -- based on your experience, could you please share what are the most worrying threats to encryption that you have seen most recently maybe in the past year? And has anything changed in the narratives of the governments that aim to undermine or break encryption?
This is also a question for Namrata and Neeti. And of course, Mallory, if you want to jump in you are welcome to do that.
>> NAMRATA MAHESHWARI: I just wanted to check, Neeti, do you want me to take that first? Sure. Great. Thanks for that question.
I think there is a great deal of commonality in the justifications underlying policies that threaten encryption across regions, whether in the Asia-Pacific, the Latin American region, the EU or elsewhere.
For example, the traceability mandates in Brazil and then India and then Bangladesh, or efforts to undermine encryption to allow law enforcement access for a number of, I'd say, purported goals, including, for example, keeping people safe online. And, of course, that is a grave issue, but we'll go a little bit more into how breaking encryption is not the way to achieve it.
All of these proposals essentially push the idea that giving up privacy online is essential to achieving online safety. But that precisely is among the reasons why such proposals that threaten encryption are misplaced. Privacy versus safety is, of course, a false binary. Sorry, the video has frozen for me so I wanted to check if you can hear me fine?
>> MALLORY KNODEL: We can, yeah.
>> NAMRATA MAHESHWARI: Perfect. Okay.
Privacy versus safety, I think, is a false binary. As a lot of security experts have demonstrated through research, the two are mutually reinforcing and one cannot meaningfully exist without the other. So I think this is one very common thread between all these threats we are seeing from various governments and sometimes even the private sector.
The second thing is that there is no such thing as end-to-end encryption that allows access when necessary. We don't have such technology. It is a zero-sum game: if a platform introduces the slightest possibility of circumventing encryption, it loses all of its privacy and security promises. This is bad for individuals, businesses and even national security.
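The point that any circumvention mechanism undoes the guarantee entirely can be illustrated with a deliberately simplified sketch. This toy one-time-pad code is an editorial illustration only, not real cryptography and not any platform's actual design; it shows that an "escrowed" copy of a key is simply the key, so whoever holds it can read everything.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy one-time pad: XOR each byte of the data with the key.
    # Illustration only -- never use this as real cryptography.
    return bytes(d ^ k for d, k in zip(data, key))

message = b"private message (32 bytes long!)"
key = secrets.token_bytes(len(message))
escrow_copy = key  # "exceptional access" means someone else holds the key

ciphertext = xor_cipher(message, key)

# The intended recipient can decrypt...
assert xor_cipher(ciphertext, key) == message
# ...and so can any holder of the escrowed copy. There is no key
# that works only for "lawful" access: access is access.
assert xor_cipher(ciphertext, escrow_copy) == message
```

The same logic applies to real ciphers: a decryption capability cannot be scoped to one kind of user, which is why security experts describe exceptional access as a backdoor for everyone.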
The third, I would say, is that there are alternative means of identifying miscreants, preventing crime and keeping people safe online that don't come at such a great cost to fundamental human rights, and this is where the focus should be. Weakening encryption is ultimately just going to make the problem much worse and make everybody unsafe, including the more vulnerable sections of society that we are trying to keep safe online.
And fourth, most of these proposals are not backed by any demonstrable research on how breaking encryption will in fact increase online safety. So intentionality by itself, good intentions behind policies, is not enough.
We need to look more at the tangible impact before destroying the tech that, as Neeti said, is currently among our best bets, our best digital security shield for online privacy and security, in pursuit of goals we are not even sure these proposals will achieve.
So I think all of these threats coming from governments are ultimately just going to jeopardize internet freedom and online safety and privacy in a big way.
>> NEETI BIYANI: Thanks. Namrata, a big plus one to everything you said.
I think we are seeing similar trends across countries, where they are asking for encrypted content, or end-to-end encrypted content, to be made available to law enforcement in one way or another.
And I'm going to quickly touch upon three ways in which governments are primarily asking for this to happen. There are other ways, and there could be many, many other short-sighted proposals that governments come up with in the future as well.
But these are the three big strands of these attempts to gain access. The first would be exceptional access. This is law enforcement telling service providers to provide law enforcement access in a lawful and targeted manner. Now, Namrata already talked about how this is not possible at all on an end-to-end encrypted platform, and how it is both infeasible and unethical to let law enforcement do this. It is the same thing as us giving the key to our door to law enforcement just because they say, trust us with your key. Nobody would do that.
The second method would be client side scanning, where services and platforms offering encryption are asked to scan content on their platforms to check for child sexual abuse material, unlawful content, terrorism-related content, et cetera.
But that is the same as having a giant eye in your room all the time and never having a private moment when you are sharing personal thoughts, or a stranger standing right behind me as I'm entering my credit card details or transacting online. So client side scanning is very much like having that huge giant eye in my room all the time.
And the third, which we have seen and which is quite worrying in a lot of countries, is the demand for traceability. This is law enforcement asking service providers and platforms to be able to make the originator of a certain piece of information, message or data on their platform traceable.
And the way they suggest doing that is by hashing and so on; we don't need to get into the nitty gritty of how it might happen. But all three of these methods have one thing in common: they put the security of billions of people, and of nations worldwide, at risk.
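The hashing idea alluded to here can be sketched very roughly. This is a hypothetical simplification of how a traceability mandate might fingerprint messages, not any platform's actual scheme; it also hints at why the approach is brittle, since any trivial edit to a message produces a completely different hash.

```python
import hashlib

def fingerprint(message: str) -> str:
    # Hypothetical traceability fingerprint: a SHA-256 hash of the
    # message text. Identical messages hash identically, so a platform
    # could match a flagged hash against hashes it has stored for
    # every message -- which is exactly the mass-retention problem.
    return hashlib.sha256(message.encode("utf-8")).hexdigest()

original = fingerprint("meet at the square at noon")
forwarded = fingerprint("meet at the square at noon")
edited = fingerprint("meet at the square at noon.")

assert original == forwarded  # exact copies are traceable
assert original != edited     # one added character defeats the match
```

Even in this toy form, tracing requires retaining a fingerprint of every message ever sent, which is why critics argue traceability is incompatible with the privacy promises of end-to-end encryption.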
Whether or not this material is actually prohibited, whether or not this material is harmful, you cannot go through everyone's content online in order to catch criminals.
And just my last point before I hand it over: there is absolutely no way for these security vulnerabilities, which governments are asking platforms to build in, to serve only law enforcement purposes. At some point the keys, the backdoors or the exceptional access mechanisms are going to become available to malicious third parties, who will find quick and easy ways to exploit these vulnerabilities.
>> ANASTASIA VLADIMIROVA: Thank you so much, Neeti.
>> MALLORY KNODEL: I can comment as well on this one. There are some interesting trends, having watched the policy debates for well over a decade now, in how it is different. I think one of the changes in some of the legislation we are seeing, particularly in western countries, is that it is less prescriptive, like the traceability requirements, and more just figure it out: we are not going to prescribe how it's done, but we trust the platforms to do it, so just do it. That is one big trend.
The other trend is, like Neeti pointed out, they are not just looking for content. It's not about can you take this encrypted message and decrypt it so we can look at it. It is way more than a backdoor now, because countries are looking for the ability to ask for metadata and traceability. I like to think of traceability as enhanced metadata: can you build a platform that will track where messages came from, how many people have seen them, who sent them, et cetera. That is a big ask of the way a lot of these systems are designed.
Another thing requested is essentially a pen register. Not just can you decrypt this one message, but can you consistently send us all of the messages from this user account in the future, or, if you saved them, could we also get access to those from the past; or the social network of the person in question: we would like to know who their contacts are, who they have been talking to, how often, and that sort of thing.
So there is a lot, actually, in some of these requests. It's not just about whether you can decrypt this one piece of content. To design platforms that do all of that is essentially a bunch of feature requests.
I also wanted to point out another trend that I think is becoming very common not just in the encryption debate but in a lot of other spaces where the technology is both the problem, and it is the solution.
So if encryption is the problem, because it has now hidden drug dealers and other kinds of criminals from law enforcement, it now also has to be the solution, in that it will help us catch them and help us build evidence that we can submit to court. So it is a bit of a weird paradox.
And I think the other thing, too, is, at least in the case of the laws that focus on content, they are specifically focused on types of content that should be forbidden. Not all encryption laws do this, but some of them do.
They are built on top of a shaky assumption that machine learning and computer vision are any good, and they are not really. Because we know, and this goes back to the point Namrata was making before, that a lot of this objectionable, unlawful content, or lawful but awful content, is already on social media platforms that are not encrypted. It is everywhere. It's really hard to track and trace and take down.
And so some of the technical proposals, for example, are to, instead of decrypting messages, do some very fancy homomorphic encryption. I'm not going to go into the details, but if you think that detecting this content with AI and computer vision is difficult now, wait until you throw lots of layers of strong encryption on top; it will not get easier.
And then lastly -- and I think this is the other trend -- on the client side scanning proposal: that is not just a proposal related to messages or conversations. It is actually something that could affect our stored content, so it could affect the content we have on our phones that we are not planning to send.
It could affect content on our phones that is uploaded to a cloud, which on modern mobile operating systems often happens automatically without a lot of thought by users. So this also goes beyond intent, the intent to share and distribute illegal content; it could just be sitting on your phone. So those are some things that I think hadn't been said before that I wanted to make sure we included in this discussion.
>> ANASTASIA VLADIMIROVA: Thank you very much, Mallory, for jumping in. And actually I did want to shift the conversation a little bit from the rights framework and the threats towards a more technical angle, which you have already started doing.
My question for you, Mallory, was if you could speak about how encryption has shaped the architecture of the modern internet. And then, after that, how should the implementation and use of encryption continue to evolve?
>> MALLORY KNODEL: It's much more fun to talk about the positives, the positive framing of the use of encryption.
So I think that we have come a long way. There is a lot more encryption pretty much everywhere, and I'm not just talking about messaging apps, but starting with transport encryption for accessing websites. The web now widely uses encryption that terminates at a central service, so it is not end-to-end, but it has gone a long way, and it has been really important for the ecosystem and for human rights because it protects what users are looking for. It protects privacy, and it is also a cybersecurity measure. There are lots of benefits to that, and that has been ongoing.
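How routine transport encryption has become can be seen in how mainstream tooling now ships secure settings by default. A small Python standard library sketch, offered as an editorial illustration:

```python
import ssl

# The standard library's default client context reflects how ordinary
# transport encryption now is: certificate verification and hostname
# checking are enabled out of the box, with no extra configuration.
ctx = ssl.create_default_context()

assert ctx.verify_mode == ssl.CERT_REQUIRED  # peers must present a valid cert
assert ctx.check_hostname is True            # the cert must match the host
```

Wrapping a socket with this context, for example via `ctx.wrap_socket(sock, server_hostname=host)`, gives the transport-level privacy described above between the client and the server, though, as noted, not end to end beyond the server.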
There have also been efforts to encrypt DNS lookups: by leveraging transport encryption for the web, you can do DNS lookups that way as well. It doesn't just keep users, and data about what they are looking at, private; it is also a nice circumvention technique for blocks that rely on domain name blocking.
So if I had to name one good trend, it is that we have been able to connect the dots between censorship, circumvention and privacy through these changes to the modern internet, to your point.
And then, of course, we have had this trend now of more end-to-end encryption, more encryption of content that we intentionally put in the cloud. We use encryption now on our devices, so the devices themselves, when they are off or at rest, are encrypted.
So it is becoming more ubiquitous, and I think that has implications: any measures to moderate traffic or to do other things on the network are now really pushed to the edges, to the ends where user devices and applications exist. And that has a positive impact on users' ability to understand and control their data and to understand where it is, because they can be more assured that while it's out of their hands and traveling to its destination, no one else can see it but them. So that's the idea. And I think those are some important things, too.
And I think to your second question how should it continue to evolve. I think that it should just be more ubiquitous. We would like to see more instant messengers adopt end-to-end encryption. There has been for many years a push to get Twitter direct messages encrypted. We'll see if that happens or not, or if it matters.
I think there are more apps out there that could use it. We know that Facebook, or Meta, did a human rights impact assessment on whether, if it used end-to-end encryption on all of its messaging apps, besides WhatsApp, which is already end-to-end encrypted, it would have an impact on, for example, parental settings in Messenger and things like that.
So those are actually really positive, and we want to see more of those. And so we are hoping for more ubiquitous adoption at all of the different ways that I have just mentioned where encryption is now becoming more ubiquitous.
>> ANASTASIA VLADIMIROVA: Thank you very much, Mallory. I want to be mindful of our time since we don't have that much left, and I do want to make sure that we focus on arguably the main question of this session.
And I wanted to address this question to all of you, the speakers, whoever wants to come in with an answer or some reflections.
Based on everything that we have heard over the past 20 minutes, do you think that encryption is an emerging first generation right especially because of its enabling and protective functions in relation to some of the core rights such as freedom of expression and privacy, arguably many more fundamental human rights? That's a question for all of the three speakers. Namrata, would you mind starting?
>> NAMRATA MAHESHWARI: That's a great question. I think there are actually two prongs to it. Do we want the right to encryption per se, or do we want access to encrypted channels to be recognized as an integral part of our existing human rights, including the right to privacy, the right to freedom of expression, or even the right to freedom of assembly and so on, the whole gamut?
I think encryption is a critical enabler of a lot of the human rights that we already have. It allows us to exercise the right to privacy and all of these other human rights that we have been talking about.
The other reason I encourage a greater focus on the existing rights framework, and not a tech-focused conceptualization of the right, is that tech has to and will keep evolving. We will in due course have even stronger methods of ensuring online security and safety. Today it is end-to-end encryption, and I'm sure there are other methods that Mallory could speak to.
But I think if we go through, let's say, years of effort to get the right to encryption recognized as a standalone right, we risk having to do it all over again the moment we have stronger tech. So I do think that access to encryption and similar secure channels has to be recognized as a critical, inalienable component of our existing human rights.
And the question is just how we should do it in a way that builds on the work that technologists, privacy advocates and so many others have done over the last many decades, and how it will build a foundation that is sustainable going forward and allows us to propel innovation and build this jurisprudence of online privacy and the kind of open, free and secure internet that we all imagined.
>> NEETI BIYANI: If I could add to that, Namrata. I think, you know, tech for tech's sake in my opinion means little. What that tech serves is the question we need to ask ourselves. And while, you know, I know that internationally even access to the internet --
>> MALLORY KNODEL: Neeti, we lost your audio.
>> NEETI BIYANI: Can you hear me now?
>> ANASTASIA VLADIMIROVA: We can hear her online, yes.
>> MALLORY KNODEL: We can hear you now, go ahead.
>> NEETI BIYANI: I will recap. I was adding to Namrata's point that tech for tech's sake means little; what that technology serves is what we need to ask ourselves.
And we have seen the first, second and third generations of human rights, and we are at the juncture where we are asking ourselves what should and should not be recognized in the fourth generation of human rights.
I feel that even access to the internet is not yet recognized as a human right, even though since 2012 there has been widespread international agreement that the realization of the first, second and third generations of human rights depends very much on the ability to access the internet, if one wants to and does have access. Because I do recognize that we have a long way ahead of us to connect the unconnected.
But having said that, those of us who are connected do absolutely rely, for the realization of our civil, political, economic, cultural, social and collective rights, on the internet and all of the technology there is to make sure that we have access to safe communication, secure communication and private communication. And, in my opinion, on the ability to have that choice.
It is the wresting of that choice away from citizens of democratic countries, citizens of many, many different countries, that is the question here today. It's not so much encryption per se that we are fighting for; I think we are fighting in a broader sense for the ability to make that choice.
So I will hand it over to Mallory.
>> MALLORY KNODEL: Yeah, I really agree with your point about how access to the internet, and meaningful access, intertwines with encryption.
And I also agree with Namrata that it is enabling. This is not about whether encryption itself is a right; it's an enabler.
I think the one interesting thing, to center the technology for a moment, is that it has created an awareness that freedom of expression and privacy are, from a technical perspective, very intertwined. And I don't know that that awareness existed previously. Maybe there is something about anonymity and expressing your opinion, but this is something deeper than that. Because if you are unable to see where someone is connecting from or what they are looking at, it is really hard to intercept that and to stop it. So that is one way in which they converge.
And so encryption doesn't just keep us private. In some places the primary concern is not privacy but access to information, because there is a lot of blocking and a lot of information control, and encryption is one way that people share content with one another.
And, of course, in a lot of other places encryption is simply something that protects people in their banking transactions and their private conversations and so on; it is quite ubiquitous. And then another point I wanted to make, too, is that this isn't only about being online.
We think of encryption as something that you need when you are vulnerable because you have just gone on the internet. But it is actually something that has a really important impact on civic space in the offline world.
In twenty -- was it 2014 that WhatsApp announced it would be encrypted? Or was that 2016? Sorry, it was one of those. I think it was 2016 that WhatsApp became end-to-end encrypted. And that was also the same year that there were massive protests in Zimbabwe.
And so it was a weird moment, because protesters were using WhatsApp -- well, everybody used WhatsApp for everything -- but they were in particular using it to coordinate protests. And the repression was around that: first the authorities tried to say, we can still read your messages, and if you are organizing protests we're going to throw you in jail.
And people were really confused; they were asking, can they actually read our messages, we thought they couldn't. The second thing that happened is, of course, they couldn't, so then they just blocked WhatsApp. But in Zimbabwe WhatsApp had a huge market, everyone was using it, so the block had a huge impact on everyone, not just the protesters. And that's evidence that end-to-end encryption is also important for our offline activities, not just when we're online.
>> ANASTASIA VLADIMIROVA: Thank you, Namrata, Neeti, and Mallory for your wonderful comments. They are all very to the point and I couldn't agree more. We have 15 minutes left for the session and I would like to make sure both guests onsite and online have the time and opportunity to ask their questions.
We would be happy to see questions in the online chat if you have joined on Zoom. But those who want to ask questions onsite, please raise your hands. And Mallory could you please facilitate if there are any questions.
>> MALLORY KNODEL: There are a lot, which is fantastic.
>> ANASTASIA VLADIMIROVA: I don't see anything yet in the Zoom chat, so maybe we can start with several onsite questions.
>> MALLORY KNODEL: I think we'll just kind of do it tour de table, we'll go in this direction if that works for everyone.
I guess with the time left as a time check, right, we have about 30 minutes -- 30 minutes left. Is that right?
>> ANASTASIA VLADIMIROVA: If we -- we have an hour, that would be 15 minutes left. That's why I'm --
>> MALLORY KNODEL: We started -- we had to get some people out of the room. I think we can go to quarter past on the schedule. So we have 30 minutes, but just a reminder because we have many in the queue so keep it as brief as you can if you want a response from us as well.
>> AUDIENCE: I think that what we have learned from today's session adds some detail to the work of the National Human Rights Institutions.
We have been discussing the duties of the National Human Rights Institutions in the digital era. And I think it is the right time to review the Paris Principles that established national human rights institutions all over the world.
And I think it might be appropriate to talk about a declaration dedicated only to explaining exactly the duties of the National Human Rights Institutions in protecting the right to technology, the right to digitalization, and the right to access the internet and information. Thank you.
>> KIAN VESTEINSSON: Thank you, panelists, for the wonderful presentations.
My name is Kian Vesteinsson I work for Freedom House's Technology and Democracy Portfolio. I very much appreciated all of your insights.
I know you spoke tangentially about homomorphic encryption and client-side scanning, and I will see if I can bring that boogeyman into the room. Of course, policy makers in the U.S. and I think western Europe as well have kind of framed encryption as a threat and an obstacle to the fight for child protection.
And so I wonder if the panelists could address first to what extent you have seen that trend outside of the global north. So whether this is -- this sort of logic is being picked up in other countries and regions around the world.
And second, how you would frame this argument or the response to that argument in terms of human rights as we have been speaking and particularly the rights of children. Thank you.
>> AUDIENCE: Hi, my name is (?). I work for the Dutch government on foreign affairs.
First of all, the Dutch have a cabinet position from 2016 in which we clearly stated we would promote end-to-end encryption and bring it up in discussions worldwide. And that is something that we keep doing.
One of the things that we -- sorry, I've got a comment and a question.
The comment is that the arguments that we have for encryption, they are very divided, right, because we've got people that are trying to combat illegal stuff online and we've got people that are saying that these are human rights, basic human rights.
And I think there's a third argument where you can find each other, which is more in the security space or hybrid warfare space, because a lot of this encryption also protects national security interests.
I would really argue to maybe address that more and more often because I think that would resonate better with other governmental speakers or people that you have the discussion with.
The question that I have for the speakers is in this room previously we had the discussion regulating platforms, talking a lot about these walled gardens. I was wondering, I think that encryption has an effect on that because everyone is locked in partly due to the encryption in systems that are hopefully encrypted. You can't use Signal to send a message to WhatsApp. So should we do something about that as well? Thank you.
>> AUDIENCE: Hello. I'm (?). I've been involved with anonymizing communication technologies like I2P, Tor, and Freenet.
And maybe it is just me, or I might be confused, but I felt that end-to-end encryption and anonymizing communication technologies -- metadata-hiding technologies like Tor -- are being conflated here.
End-to-end encryption is an essential technology for anonymization, but they are not the same. So I mean, I think some applications of end-to-end encryption are easier targets for regulation. And of course both can be regulated separately.
So I fear that -- I think anonymization is more important when it comes to human rights. And if some applications of end-to-end encryption are regulated, I fear that end-to-end encryption itself will be witch-hunted. Thank you very much.
>> LARS EGGERT: My name is Lars Eggert and I chair the Internet Engineering Task Force.
So I work in the plumbing of the internet, several layers below where the discussion was today. I wanted to bring up the point that in the internet there is a lot of old equipment that was deployed sometimes decades ago and that has inbuilt assumptions about what traffic looks like and what it can do.
And those of us who work on the plumbing, right, we are increasingly finding that we need to use end-to-end encryption as a tool to evolve the internet infrastructure forward, because it lets us punch through these old devices.
They will never get taken away unless they burn out or something like that. So, you know, for us, end-to-end encryption is useful for the things discussed today -- and thanks for the great discussion -- but for us it is also a means to evolve the core of the internet forward. And when discussions happen about whether it should be restricted or what should be done with it, anything that would limit the use of end-to-end encryption would take away a vital mechanism for us to keep evolving the connectivity layer into the future. I wanted to raise that point. Thank you.
>> AUDIENCE: Thanks for such a -- sorry, go ahead.
>> AUDIENCE: Okay. (?), Electronic Frontier Finland. I'm sure people have mentioned already client-side scanning, which I think is the most horrible privacy invention since forever.
That raises the point that the right to encryption is all but useless if you can't trust the device you are doing it on. So you have to bundle it with the right to control your own device. And I think I'll leave it at that -- and ask the question: do you agree?
>> AUDIENCE: Very sorry about that. My name is Alex, I work in the UK government, in the Department for Digital. And so like the Dutch government, the UK government is a very strong supporter of encryption.
However, I think it is really important that we are clear eyed about some of the issues that are here. And I just want to illustrate one particular case that we had in the UK before I come to the question.
So in 2021 a British man was sentenced to a very long prison sentence because he had been conducting horrific child abuse of hundreds of children online. Data and messages which Meta were able to provide to the investigating officers were vital in securing that conviction.
Had Meta implemented end-to-end encryption into its channels before that point, that man would not have been convicted and would still be abusing children. So I think it is really important that we can have that child safety point in our minds and that, you know, it is a very real issue.
However, I think we can all agree in this room that end-to-end encryption is a good thing and child abuse is a very bad thing.
So my question for the panel is: are they sure that there has been enough innovation in this space, particularly around the very specific question of tackling child abuse imagery in encrypted channels? Are we really sure that we have done enough to explore that issue? Obviously we don't want to open backdoors and break encryption, but on that specific issue, are we absolutely sure that we have bottomed that out?
In the UK we are launching a safety tech challenge fund which is looking to explore this issue. And I think it would be great if more countries and bodies could explore this as well. I'm really interested to hear from the expertise that we've got on the panel. Thanks again for such an interesting discussion.
>> MALLORY KNODEL: Such good comments.
>> ANASTASIA VLADIMIROVA: Any left in the room or? Can you hear me?
>> MALLORY KNODEL: Yes.
>> ANASTASIA VLADIMIROVA: I just wanted to make sure, are there any comments or questions left in the room?
>> MALLORY KNODEL: I think that is it for now, yeah.
>> ANASTASIA VLADIMIROVA: Okay. I don't think we have anything online, but we do have quite a lot of wonderful comments and interesting questions from the audience onsite.
So yeah, Namrata and Neeti and Mallory, if you feel like you can comment, address some of them, please go ahead.
>> MALLORY KNODEL: I'm keen to have Namrata or Neeti come back in. I think we can just answer sort of whichever questions we took notes on and stuck out for us.
>> NAMRATA MAHESHWARI: I could -- sorry, Neeti, go ahead, did you want to say something?
>> NEETI BIYANI: I was actually just saying could we start with Mallory and then I could jump in and, Namrata, you could as well.
>> NAMRATA MAHESHWARI: Sounds good.
>> MALLORY KNODEL: Sure. They are all really good comments and questions, and it is hard to not want to respond to all of them. But I will -- yeah, I will try to keep it brief to some of them.
So, one, there was a question about the risk of regulating the entire space, when there are so many different examples of encryption. I think you brought up a really good point: are we risking some really solid, important applications because other kinds of encryption are being regulated?
I don't know that there is that much of a risk in what I have seen from the policy trends. There are cases where things are a little muddy, and at that point as advocates we can sometimes come in and try to introduce some nuance. But I also think it raises the point that, for end-to-end encryption that has been slightly backdoored, there is something about the national security element where it is almost like no encryption would be better than weak encryption, because it becomes a bit of a honey pot in a way.
Right? If you've got an app that says it is end-to-end encrypted, but for whatever jurisdictional reason or other thing it is a little bit weakened and has some kind of backdoor, that is almost the worst scenario, right?
And then the other one maybe is fully end-to-end encrypted. So I think you were sort of talking to the -- I think so, yeah. That was the note --
>> AUDIENCE: I was just -- I didn't articulate well enough. Apologies for breaking in.
>> MALLORY KNODEL: No, that's okay.
>> AUDIENCE: My question, my point was actually that, in the Netherlands, for example, we have a big tech company called ASML which makes chip-making technology, la, la, la.
And so we see that there is more interest into the IP that we have, about process that we have, and we considered that as a point of national security, and we want to protect that.
So if we would have some form of client-side scanning, there is easy abuse to be made by nation state actors hacking those services. Because if the European Commission is going to provide a program that we have to implement, well, let's hope that it would be very safe.
But if it is one program, it would be a clear target for any nation state actor, right? And I think you could use that aspect more often: if we have weakened encryption or a backdoor, that could also easily be misused by other nation state actors.
>> MALLORY KNODEL: I will try again with some other comments and feedback.
To the point about client-side scanning, I also wanted to say -- and I often forget to say this in these discussions -- there is a raging debate also in the U.S. around compelled decryption. So this is around, you know, taking an iPhone, and you say no, I'm not going to decrypt it, and I won't give you my face or thumbprint or whatever. Is it legal, actually -- can you compel somebody?
And that is another civil liberties issue, similar to client-side scanning, where you are turning one's device or user agent against them, against their best interests. It is good that it is controversial, I think. But I think we could come out much stronger against things like client-side scanning and compelled decryption.
And then to the safety tech challenge, which is something that my organization was aware of, followed up on, and submitted comments to. One of the issues I had with it was that it assumed tech solutions started with "we have an encrypted system, how do we get around that," rather than taking a larger frame, which is: what are some approaches that use digital technology to solve this problem, which is wider?
So I felt that artificially narrowing it in that way limited the results. Because I also really think that there are things we can do. And if we accept that end-to-end encryption is ubiquitous and strong and people will use it, we can, I think, build some interesting things on top of it.
CDT did a report on how you actually do content moderation in end-to-end encrypted systems, and we didn't say you can't do it. We said you can do it, and here are a few ways to do it. There are possibilities, but I think that assuming you have to break it and then asking people for their innovations around how it gets broken was a little bit artificially narrowed. That's it. Back to you both online.
>> NAMRATA MAHESHWARI: Thanks, Mallory. There was a question about the approach in the global south, how encryption is being characterized and how that characterization could be responded to. To that point, yes, there very much is that characterization of encryption posing a threat, of it creating this kind of dark space where law enforcement is not able to do their job.
But I think a lot of the advocacy and responses have to also be tailored to the region even though there might be commonality in the challenges, right.
So I think in terms of responses, some of the things that I think would work, one is to break the kind of polarization we're seeing touched upon a bit in the comments, the privacy versus safety thing. All of us agree that we need to have private spaces, all of us agree that there are terrible things happening both offline and online and they need to be addressed.
So I think to break that polarization, to see that the goal is at end of the day the same and how to achieve it without undermining human rights.
So I think first we have to kind of agree that the goal is the same, and we need to have that conversation. The second is, I think we need to challenge the disproportionate focus on the negative use case of this technology. Everything has a negative use case. And if we were to look at anecdotal evidence, there is also a lot of evidence of how, because of the lack of secure channels for communication, people in a lot of regions have been tracked down; people have lost lives.
So I think just focusing disproportionately on one side will not lead us to long-term, sustainable solutions in that respect, right; it will be more of a knee-jerk reaction to something that we're seeing evolving around us.
And as Mallory pointed out, we are all struggling to find that perfect solution. So right now policies are just counting on tech companies to find a way to do it. So we need to come to that solution together. And admittedly that is going to take some time. So that focus on just one side of the negative use case has to be changed.
The third: in how policies are framed, one should not have to count on the good intentions of law enforcement and government. The way it's currently done, we often hear the response that this will only be invoked in the rarest of rare cases. But the underlying problem is that ultimately a citizen has to count on the government's good intentions. And even if they exist, the very point of policy making is to create principle-based solutions that can prevent any kind of arbitrary application.
So I think that kind of banking on good intentions and limitations that the government will place on itself is just not a good idea for the long-term. And finally, I think it just again in terms of focusing on solutions there are alternatives, there are alternative means of investigating be it capacity building or the kind of resources that Mallory just mentioned. CDT has done this report, Stanford had done a report. And there is already progress happening in that direction.
And this is an opportunity for all of these various stakeholder groups to work together. How do we strengthen alternative means of investigation without undermining this important tool.
I can take a couple of others, but I will just hand it over to Neeti because I know we are also short on time.
>> NEETI BIYANI: I just have two very quick thoughts to add to what has already been discussed.
I think we've spent a lot of time collectively, you know, this panel talking about what encryption adds to our life, right, in terms of safety and security and not just individual security. I think individual security is very much tied in with national security.
We have also spent a fair bit of time talking about privacy and how encryption due to its enabling nature helps us all realize other human rights. So, you know, it is not just some random technology that exists in a vacuum today.
So if I were to take all of those conversations we have had and ask people in this room, you know, to apply the same logic to also children, it is not that encryption serves adults and then mis-serves children. I think children are definitely safer because of encryption. I mean children who have access to the internet and have the ability to use the internet to learn online or to interact with their grandparents or to play games I think are safer because of encryption.
And there have also been, you know, accounts from certain parents who said: please don't make me raise my kids in a world without encryption, because my kids are safer with encryption.
Even when the whole Apple client-side scanning proposal had come in, you know, one of the key questions that civil society asked Apple was: how are you so sure that the guardian or the parent you are relying on to direct a kid is the right person to do that? Can you as a technology company give that guarantee? And, you know, very clearly Apple could not, right. The proposal was withdrawn.
And just the last thing I will say is that these problems are societal. They existed before the internet existed and they will continue to exist in the future. But I think societal problems cannot have technological solutions alone. I think we need to find societal solutions for societal problems. And tech has a role to play for sure, but tech cannot be made the scapegoat or the ultimate sort of provider of unilateral solutions in case of societal problems.
I'm not brushing off the enormity of the problem, we're on the same team here, we're on the same side. But I'm saying we need larger broader multi-stakeholder dialogues to tackle what is essentially a huge, huge, societal problem.
>> ANASTASIA VLADIMIROVA: Thank you so much, Neeti. I want to ask, Namrata, if you have any other remarks since you mentioned that earlier or maybe we can jump into closing remarks also if there are no other questions.
>> NAMRATA MAHESHWARI: Just very quickly what somebody said about encryption also being very crucial for national security and the need to push that argument more.
I do completely agree. I think as Mallory pointed out a lot of it gets lost in just communications data and safety for particular types of individuals. And Neeti also mentioned how it serves the wider economy, serves your government security and national security.
So I think there are several prongs to it, right. And I completely agree that as these conversations evolve, depending on the tables we're at, to achieve that final goal sometimes different arguments play different roles. So plus one to whoever raised that point. Thanks.
>> ANASTASIA VLADIMIROVA: Thank you, Namrata. So if there are no other questions -- are there any last-minute remarks or questions left in the room? And I do hope that we managed to address all of them that were mentioned before.
>> MALLORY KNODEL: I think somebody asked about interoperability, or they made the point that these are all at the moment walled gardens, which is a very good point.
But I think there is an effort in general around doing more federated and interoperable messaging, based on the Digital Markets Act, which will only apply to gatekeepers. But in general I think there is definitely an interest from human rights advocates and organizations like CDT and ISOC and so on to actually figure out ways to do interoperable end-to-end encrypted messaging, or at least federated messaging. So that hopefully will change. And the IETF is looking at that as well; there are folks interested in that problem space. So that's good news.
>> ANASTASIA VLADIMIROVA: Thank you. So if there are no more questions we may take a few minutes for closing remarks, just a few remarks from each of the speakers would be great on today's session and on the takeaways.
>> NAMRATA MAHESHWARI: There is one question in the chat in case you are wondering.
>> ANASTASIA VLADIMIROVA: I did miss that while I was focusing on keeping track. Okay. Yes, there is one question, indeed. Thank you, Namrata.
Actually, it is very similar, if not almost the same, as the one we stated in the description earlier in the session. It is: should encryption have special protection under international human rights law? That's an interesting one. Maybe, Namrata, you could address that really quickly, and Neeti as well? Just a few comments, because we don't have that much time left.
>> NAMRATA MAHESHWARI: Sure. So I think just to recap a little bit of what Neeti, Mallory, and I said when the question came up.
We think it is a crucial enabler for a lot of the rights that we have within the international human rights framework so to that extent, yes, it should be recognized as playing a crucial role in folks being able to exercise that right.
Second, we spoke about how it is very intertwined with access to the internet and that in itself is a challenge globally and perhaps more so in specific regions. And without that being recognized as being crucial for fundamental rights, I think that would have a very direct impact on how we think of the right to encryption.
And as Neeti put it, tech for tech sake does little. We have to look at the kind of goals it serves. So I think more than just right to encryption it is the right to private and secure communication channels or just protection of your data in ways that cannot be tampered with.
So I think just within that broader framework recognizing it as an important cog in the machine essentially for people to realize their human rights.
>> MALLORY KNODEL: And I wanted to jump in to point people towards the report that the Office of the High Commissioner for Human Rights put out roughly three or four weeks ago about the importance of encryption for human rights. It is an excellent report.
And so folks who want to understand how those connections are already being made within the human rights framework should definitely take a look and read it. It came out very strongly on the side of the ubiquitous use of strong encryption.
>> ANASTASIA VLADIMIROVA: Thank you. So any closing remarks?
>> NEETI BIYANI: None from me except save encryption.
>> MALLORY KNODEL: I would just say in my closing that there will be another session about encryption tomorrow.
I don't have the schedule in front of me, but I believe it is in the first afternoon session. It is an open forum, or it is a town hall of the Global Encryption Coalition. It is multi-stakeholder but mostly civil society. There are some small companies and then academics and technical advisors that comprise this global group that focus on policy threats. And also technical ones as well. So it will be a really robust discussion so please join us there if you want to continue the discussion.
>> NAMRATA MAHESHWARI: Thank you so much both to the organizers and moderators and panelists and everybody here. Thank you for being engaged, for the fabulous questions and interventions. I wanted to check if there is a way for us to get the resources we shared in the chat to the people attending onsite?
>> ANASTASIA VLADIMIROVA: I just realized that I can actually edit the description of the session and post the links to the videos -- including the one that we wanted to show at the beginning -- that were produced by the Global Encryption Coalition to celebrate Global Encryption Day.
I will edit the description, so you can go back to the description of the session and see all of the resources and links that were shared in the chat.
And yes, I just wanted to say thank you all for attending. It is great to see so many people in the room and also onsite and online. Thank you very much, Mallory, Neeti, and Namrata for joining me here today for this really important conversation.
Most importantly, for your wonderful and brilliant insights. And I hope that we can continue having this important conversation, not only on the IGF platform but also in our communities and multi-stakeholder settings, to keep protecting encryption. And yeah, thank you very much.