The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> Thank you for being here at 8:30 Japan time, and all other times you may be logging in. We appreciate it. My name is Al Smith, I work at the Tor Project, and I'll be the room facilitator today. This is a hybrid session, so there are going to be some panelists participating online, and in the room they're going to be these two right here with me.

I'm pretty excited about our panel today, because we've brought together some experts from the tech, policy, human rights, and advocacy spaces to talk about a handful of policy issues and uncover some key elements for a human rights forward governance framework for encryption.

Online today, I'm not seeing them show up on the screen, so I don't know if I should introduce them now. I'll talk about who we have here in person and then hopefully our online people will appear on the screen for whoever's doing that.

In person we have Roger Dingledine, who is the President and Cofounder of the Tor Project, and Sharon Polsky, President of the Privacy and Access Council of Canada. I don't see the people online, but I'll introduce them anyway. We also have Rand Hammoud, a surveillance campaigner at Access Now, and Tate Ryan‑Mosley from MIT Technology Review. Tate is going to be our online moderator today.

So, I was thinking I could toss it to the folks in the room to say a sentence about what you do and where you're based, do a quick introduction, and then we'll get into the meat of the conversation. Actually, I want to quickly go over the structure of the conversation before we do that. We're going to have 30‑35 minutes of structured, moderated conversation, and then we'll take questions both from the online space and here in the room.

And we will end at 9:30. So, I'm going to hand it to Roger, then we'll go to Sharon, then Rand and Tate, and then kick off the conversation. Roger, could you say a few words about who you are?

>> ROGER DINGLEDINE: Hi, everybody. Is the microphone working? I'm Roger Dingledine from the Tor Project. Tor is a privacy nonprofit in the U.S. We write software to keep people safer on the internet. We care about civil rights, human rights, and surveillance resistance, and we also care about the censorship resistance side of things. I originally wrote Tor, and now I wear all sorts of hats, including talking to policy people, law enforcement, governments, hacker conferences, and so on.

>> SHARON POLSKY: Good morning. I'm from the Privacy and Access Council of Canada, a nonprofit organization that represents data protection professionals in Canada and beyond, and offers a professional certification program. We have also written a variety of standards, collaborated on others, and written the standard of competency for data protection professionals. A big part of what we do is education: of our members, of legislators, and of the general public, because it all ties together with the matters we will be discussing today and that have been discussed throughout this forum.

>> AL SMITH: Let's go to Rand next.

>> RAND HAMMOUD: Hi, I'm Rand Hammoud from Lebanon. I'm a campaigner at Access Now, a digital rights organization that works to defend and extend the digital rights of people and communities at risk. I lead our global campaigns on spyware and surveillance technologies, and I'm very excited to be talking to you on this panel today.

>> AL SMITH: And finally, Tate.

>> TATE RYAN‑MOSLEY: Hello, good morning, everyone. My name is Tate Ryan‑Mosley, I am the Senior Tech Policy Reporter at MIT Technology Review. I cover the emerging technologies that are changing government policy and politics and how we participate as citizens in those activities. I'm also really excited to moderate the panel today.

>> AL SMITH: Thank you, Tate. Do you want to kick us off?

>> TATE RYAN‑MOSLEY: Yeah, that sounds great. So, I'm going to very briefly set the scene for our chat and give some basics, because we are talking about some fairly sophisticated technology, and I like these sorts of primers. Encryption is a technology that allows people to keep their information and activities private online by scrambling messages using mathematical cryptography and algorithms.

Often, when we talk about encryption today we're talking about end‑to‑end encryption. That means that when a sender sends a message, it gets encrypted and sent as ciphertext, and the receiver has to decrypt it to read the message in plain text. With end‑to‑end encryption, even the tech companies that make encrypted apps do not have the keys, as they would call them, to break the ciphertext.
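To make that idea concrete, here is a minimal sketch of the end‑to‑end concept in Python using the PyNaCl library; the names and message are illustrative only, and real messaging apps layer much more on top (key exchange, forward secrecy, and so on):

```python
# Minimal sketch of end-to-end encryption with PyNaCl (pip install pynacl).
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private keys never leave their devices.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# The sender encrypts using their private key and the receiver's public key.
ciphertext = Box(sender_key, receiver_key.public_key).encrypt(b"meet at noon")

# Only the receiver's private key (with the sender's public key) can decrypt.
plaintext = Box(receiver_key, sender_key.public_key).decrypt(ciphertext)
assert plaintext == b"meet at noon"

# A server relaying `ciphertext` holds neither private key, so it sees only
# scrambled bytes: this is why the app maker cannot read the message.
```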

But more on that later. Most commonly, when we talk about end‑to‑end encryption for the average internet user, we're talking about messaging apps like Signal, Telegram, and WhatsApp. HTTPS protects websites and web activity, and some devices, like an iPhone, are encrypted with passwords and passcodes.

And encryption has been debated from a policy perspective for 20 or 30 years, as authorities have sought access to encrypted messages and devices. This access is commonly called a backdoor, and the authorities or law enforcement agencies that have advocated for backdoor access often say: we just want access to some messages on a case‑by‑case, restricted, small‑scale, targeted basis.

Tech companies have argued that doing so would pose substantial risks to encryption as a whole, because the creation of a master key, which doesn't exist today, would be really hard to keep away from bad actors and inappropriate government uses, and would generally weaken encryption. Opponents of backdoor access say that, of course, law enforcement can't really be trusted with this type of access, plus it's not really how the technology works.

Additionally, strong encryption is necessary for human rights advocates, journalists, and free speech more generally. Historically, the UN has sided with the opponents of backdoors, saying backdoors are contrary to the freedom of expression. In the past, we've seen the encryption debate pop up during times of crisis, when law enforcement agencies are looking for a particular piece of intelligence in a high‑profile case, like the shootings in the U.S. or the Paris bombings, both in 2015.

But now we're seeing this debate crop up in the form of online safety and content moderation. There have been a handful of bills, in the U.S. at the state level but also in Australia, the UK, and Canada, that we'll talk a little bit about today, that are threatening encryption. So, we're going to talk about all of this today in light of the growing use of surveillance technologies by governments around the world, and what we might do to strengthen encryption protections.

We will have some time for questions at the end, so please do think of them throughout our chat so that you're ready to put them to our panelists. Now that we're all on the same page about what we're talking about, I want to pass the first question to Roger. Roger, why do governments, law enforcement agencies, anybody, want backdoor access? What are they getting at?

>> ROGER DINGLEDINE: Yeah. So that's a broad question. The fundamental conflict here is between keeping society safe and national intelligence, law enforcement, and governments wanting control. So the way that I look at this, the question is about privacy. And by privacy I mean control, or choice, about your information.

So if you successfully have privacy, and one of the ways to get it is through this encryption we're talking about, then you get to choose who learns things about you. That's my definition of privacy. And one of its interesting characteristics is that vulnerable populations find it more valuable. If you already have a lot of power, if you're a nation state or the Russian mafia or whatever large, powerful group, it's not so important for you to have an extra layer of privacy.

Whereas if you're a minority, LGBT, a journalist, a human rights activist, and so on, then this is one of the most important ways for you to retain control of your own safety.

>> TATE RYAN‑MOSLEY: Yeah. Roger, sticking with you on that point: when governments or law enforcement agencies, whatever party is in control, ask for backdoor access to encryption, from a technical point of view, why is that a slippery slope? Why is that such a risky request?

>> ROGER DINGLEDINE: Yeah. So there are several problems here. One of the big ones is that math doesn't know what country it's in. Technology doesn't know what country it's in. Let's say you have a country with perfect rule of law. I don't know where you'd find one of those, but let's say you have one. In that situation, the judicial process gets to decide who can break the encryption and whose messages will be looked at.

That same tool is going to be used elsewhere in the world, and there are other countries that are going to try to reuse the same mechanism for breaking the encryption. So even if in the U.S. we had a perfect judicial system, which we don't, what do the tech companies do, what do the tools do, when the judge in Saudi Arabia asks for that same access?

So the fact that there are different countries in the world is one of the main challenges to having this whole backdoor concept make any sense at all. The other way of saying that is this notion of a backdoor that law enforcement keeps asking for weakens society as a whole. It makes everybody less safe. And that's not a worthwhile tradeoff.

>> TATE RYAN‑MOSLEY: Mmm. Rand, I want to pass it to you, because you work with protecting free expression and people on the ground who are doing human rights work. How have you seen encryption being used to protect activists or citizens who are just expressing their voices?

>> RAND HAMMOUD: Thanks, Tate. I think one of the main things that comes up when it comes to encryption and protecting or safeguarding or enabling fundamental rights is the fact that it is one of the biggest technologies today that is the foundation of security and safety and trust online. And so it's enabled activists, lawyers, human rights defenders, dissidents to securely communicate, organize, and protect their freedom of expression.

So if we go ahead and undermine encryption, we are undermining their ability to do so. And we need to place this conversation within the context of a pervasive surveillance industry, where even with strong encryption, even when we do have data that is encrypted and safe, there is a large, billion‑dollar industry working day in and day out to find vulnerabilities to exploit and surveil these individuals, putting them in harm's way and even causing and enabling human rights abuses such as disappearances and killings.

The conversation around safeguarding encryption needs to also be aware of the already‑existing surveillance capabilities of governments and malicious actors.

>> TATE RYAN‑MOSLEY: Yeah. I think that's such a good point. And one thing, Sharon, I want to ask you about is, even from an economic perspective, encryption is essential to data protection activities at normal businesses. So yes, as Roger spoke about, you have these grave power imbalances between activists and states, but also you have people at their jobs who are protecting sensitive information who rely on encryption. Can you talk about that use case as well a little bit?

>> SHARON POLSKY: Absolutely. And you're right. It's not just the human rights people and the advocates; it is everyday people in business. And one group that is seldom mentioned is the lawmakers themselves. Whether you are a lawyer who has to maintain client confidentiality, or a doctor who has to maintain the confidentiality of your patient information, or a lawmaker, strategist, or policy analyst in discussion with your colleagues, you don't want somebody else being able to infiltrate and figure out what you are strategizing.

So, everybody has privacy issues, whether for personal, business and economic, or national security reasons. Maintaining encryption is absolutely fundamental.

>> TATE RYAN‑MOSLEY: Yeah. And I think Rand, I want to pass it back to you. We're in a room with some policy‑makers at a policy‑making conference. How do you think we should respond to governments who want backdoor access to encrypted technologies and who stands to gain and who stands to lose? Do you trust them?

>> RAND HAMMOUD: To piggyback on what my fellow panelist just said, undermining encryption is also a national security issue. When you look at it that way, no one stands to gain. It will place national governments at risk; the same governments that are advocating for undermining encryption will themselves be at risk. And democratic processes are included within those risks, because when you think about journalists, activists, the essential people who uphold democratic processes, being at risk or having to self‑censor because they could be surveilled at the mass scale that undermining encryption enables, then that whole process is lost.

And so, from my point of view, no one is left to gain except individuals or malicious actors who want to surveil those people, who want to gain access, and who want to exercise population control, because essentially that is what undermining encryption will do. It will make surveillance so much cheaper. It will take us back to pre‑Snowden‑revelation days, when there was mass surveillance by governments and companies.

And so there is no one who stands to gain. And we shouldn't be trusting backdoor access or any sort of pretext that is not technologically sound.

>> TATE RYAN‑MOSLEY: Yeah. And I feel like you're picking up on one key tension that has been in this narrative for a long time, which is: are security and privacy opposing things? Can we have both? How do you achieve both?

And Roger, I wanted your perspective on that. To what extent is this binary of security and privacy real?

>> ROGER DINGLEDINE: Yeah. Security and privacy are the same thing in a lot of ways. Imagine you give out your financial data and then somebody commits identity theft against you. Going back to your example of encryption being a national security thing: I was at an FBI conference years ago and I talked to a bunch of FBI people, and some of them use Tor and some of them fear Tor. One guy was saying, surely you have some sort of backdoor, right? Surely you have some way to learn what people are doing on the Tor network.

I pointed to his colleagues and explained: these people just told me today that they use Tor and rely on Tor every day for their job. Do you want me to have a way to learn what they're doing on the internet? So from that perspective, national security, security, privacy: they're all sides of the same coin.

>> TATE RYAN‑MOSLEY: Yeah. Sharon, do you want to expand on that?

>> SHARON POLSKY: I have to agree with Roger. It is all connected. And all too often people will talk about one aspect or another without connecting the dots, and you absolutely have to. But the problem I've found through my career, which has been spent dealing with governments and policy people and corporations, is that there's been very little education about these things.

We use the internet. We use computers. But unless you live it, unless you're a Roger and you design these protective mechanisms, most people just use them. And that's a problem, because they know how to use the technology only to a very small degree. They often don't understand the implications of what they're doing, and that carries over to the lawmakers and the people who prepare the research and briefing notes for the lawmakers.

If they don't understand what the technology is about, what the risks really are, and the unintended consequences of the legislation they draft, then they are building something that is going to create a world of problems. And for that I look to things like various pieces of legislation in Canada, some just passed, some still being debated, and the so‑called Online Safety Act in Britain.

They're all being promulgated as necessary to protect children, and doesn't everybody want to protect children? That's the argument. Of course we want to protect children. They are among the most vulnerable. But if you undermine encryption to ostensibly protect children, other people will also be able to get through that backdoor and endanger not only the children, but everybody else.

And it is the very children who will be endangered, because of the way the laws are being written: content will have to be scoured automatically and proactively, and automatically reported to police if it is suspected of potentially, maybe, possibly being child sexual abuse material. So what happens when a child has been abused and wants to report it?

Their content gets stopped and reported. And they are the ones who become the suspects. In Canada, a child is chargeable as of 12 years old. Imagine the possibilities and the unintended consequences of breaking encryption.

>> TATE RYAN‑MOSLEY: Yeah. And I'm glad you brought that up. I want to get further into the specifics, because this is where we're hearing a lot of the encryption debate: if we have a lot of encrypted messaging, if we have a lot of really secure portals for communication, we can't moderate those spaces.

And we know that internet users are increasingly moving to private spaces in this current moment of social media. And lawmakers are saying: hey, what can we do about all of this abusive, harmful content being passed between people that tech companies themselves and governments have no visibility into?

You brought up the UK Online Safety Bill. That was a big one. Australia, India, the U.S.: we've seen discussions about providing either technical or real backdoor access to encrypted messages. I'd love to know, Sharon, can you tell me something specific about some of the bills in Canada where you see an unintended consequence, or a misunderstanding by lawmakers of the technology or the ramifications?

>> SHARON POLSKY: Absolutely. And really, I don't have the imagination to make up the examples I will cite. I had a conversation with one of our current members of parliament a year ago. We were talking about this because the legislation in Canada was just being formulated, and I said, but if you break encryption for some, so that all the content can be monitored ‑‑ she stopped me and went, I don't think that's how it works.

And changed the subject. She, like many of our current members of parliament, comes from journalism. They're educated, they're worldly, that's great. But they don't get it. You might have heard of Bill C‑18, which just became law to update the broadcasting legislation, which sounds wonderful, except it covers radio, television, and governing internet content globally. Canada has declared it will govern the content.

Combine that with another piece of legislation, Bill C‑26. We refer to them by their numbers because, unlike the United States, Canada has a history of creating legislation with very lengthy, hard‑to‑say names rather than nice, concise, easily said acronyms. Bill C‑26 would amend the Telecommunications Act and create the Critical Cyber Systems Protection Act.

And like the others, it will infringe on privacy and freedom. All of these bills are framed as narrowing identified gaps. They do, if you look at it from a certain perspective, have a legitimate application: protecting children, preventing terrorism, preventing all the ills and harms that we see so often; the very same things that were going on long before the internet became a thing.

But the problem is that everything is going to be surveilled, as Rand said. That is a problem, particularly because when everything is surveilled, the various pieces of legislation say that separate agencies will deem some content misinformation, disinformation, unwanted content, and the government will not be the one doing the censoring.

The law will have the platforms do the automatic, routine, mandatory proactive screening. Those platforms are outside of Canada, beyond the reach of Canadian law, of course. So it's actually a very interesting way they've created it, because similar to the Americans, who have constitutional rights to freedom of speech, we have the Charter protecting the right to freedom of expression, which protects Canadians against overreach by government.

So it's not going to be the government committing the overreach; it's going to be the companies that the Charter doesn't cover. The companies will just do as the law requires. And that affects everybody, from children to the elderly, in every walk of life, including the politicians themselves.

>> TATE RYAN‑MOSLEY: And on that point, luckily for us we have someone on this panel who runs a tech company. Roger, how do you think about balancing privacy with content moderation? This is not the Tor Project's bread and butter, but we do know that there has been a proliferation of child sexual abuse material on some private messaging apps.

So is there an approach that balances these two things? Can you achieve some level of moderation and encrypted privacy?

>> ROGER DINGLEDINE: Yeah. So fortunately, Tor is a communications tool, not one of these platforms, so we don't have content to moderate in the way that Facebook and so on have. But everything Sharon said is right, and it's worse than that, because she was talking about how even if the technology behaves in a perfect way, it's still bad for society.

But the reality, for example in the UK Online Safety Bill, is they're imagining there will be magic AI machines that perfectly look at pictures and perfectly decide whether they're bad pictures or not. And the reality is, AI doesn't work that way. It's not perfect. You're going to have some false positives. Let's say 2% of the time it says that's a bad picture when it shouldn't.

And there are 10 billion pictures being sent each day. Then 2% of the users are going to get reported each day for being criminals. And maybe they can drive the false positive rate from 2% down to 1%. So now it's only tens of thousands of people being misreported and having their lives ruined because the math screwed up a little bit for them.
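As a back‑of‑envelope illustration of why even small error rates matter at this scale, here is the arithmetic using Roger's spoken figures; the numbers are illustrative, not measurements, and how many distinct users get misreported depends on how many images each person sends:

```python
# Base-rate arithmetic using the spoken figures above (illustrative only).
images_per_day = 10_000_000_000  # ~10 billion pictures sent daily

for false_positive_rate in (0.02, 0.01):
    false_flags = images_per_day * false_positive_rate
    print(f"{false_positive_rate:.0%} error rate -> "
          f"{false_flags:,.0f} innocent images misreported per day")

# 2% error rate -> 200,000,000 innocent images misreported per day
# 1% error rate -> 100,000,000 innocent images misreported per day
```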

So there's definitely a challenge here, because the politicians want this reality to be possible, and it isn't, but they want it to be. And there are all sorts of for‑profit tech shark scam companies that say: oh yes, yes, give me millions of dollars and I'll build a magic thing for you, and it will be magic. And the reality is, it's not going to work. It's not going to do what people want.

But the politicians really want it. They would love a technology solution that gives people privacy while also surveilling all of them. But the reality is that the tech does not support what they want.

>> TATE RYAN‑MOSLEY: And some context, if people aren't familiar: I'm sure you're referring to a handful of technologies, message scanning, client‑side scanning, server‑side scanning. The idea behind these technologies, and they are different, I'm sorry for painting with a broad brush, is to allow a machine to evaluate the content underneath the encryption, so there's not necessarily a person reviewing the content of encrypted messages, but there is a machine checking and saying: this might be child sexual abuse material, for example.
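For readers who want to see the shape of the idea, here is a deliberately oversimplified sketch of client‑side scanning. Real proposals match perceptual hashes, which tolerate resizing and recompression and can therefore misfire, rather than the exact cryptographic hash used here, and every name below is hypothetical rather than any vendor's actual API:

```python
# Oversimplified sketch of client-side scanning: the client checks content
# against a blocklist *before* the message is ever encrypted.
# Hypothetical names throughout; real systems use perceptual hashing.
import hashlib

# Digests of known-bad images, distributed to every client device.
# The entry below is a placeholder, not a real digest.
BLOCKLIST = {"<digest-of-a-known-bad-image>"}

def scan_before_encrypting(image_bytes: bytes) -> bool:
    """Return True if the image matches the blocklist and would be reported."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKLIST

photo = b"...image bytes..."
if scan_before_encrypting(photo):
    print("flagged and reported before encryption ever happens")
else:
    print("not flagged; the message proceeds to end-to-end encryption")
```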

And the UK Online Safety Bill stipulated the use of those types of technologies where "technically feasible," which was the terminology they used. That stipulation was repealed a month ago because those technologies do not exist; that part of the bill was changed. I like how you said, let's talk about reality today. Rand, I want to pass this back to you. Talking about reality today, what sort of protections do human rights advocates and journalists need right now when it comes to protecting their own privacy and protecting themselves against government surveillance?

>> RAND HAMMOUD: So I think there are two main parts to this answer. When it comes to protecting themselves from government surveillance, it mainly takes us to the point that, even before we get into undermining encryption, we are already in a space where spyware is widely used against human rights advocates, dissidents, and so on.

With the most recent report that Amnesty put out, it's become even cheaper: a Predator infection costs 9,000 euros, when years ago it was more expensive. The technology is proliferating, and it is off the shelf. It is unregulated, unchecked. And governments, and who knows what other actors, are using it against human rights activists, lawyers, journalists. So the first thing that governments need to tackle is to ban the spyware vendors and technologies that have already been used to enable human rights abuses.

And then we can talk about establishing the safeguards that are needed for a more human‑rights‑respecting framework for using certain digital surveillance technologies in a way that does not infringe on human rights, if such a framework can exist. But we first need multiple safeguards ensuring that even if these technologies are used, there is a mechanism to access remedy, a mechanism for investigations, and so on, which, even in places where it exists today, is largely not respected.

And we see multiple democracies with legal frameworks that deem this surveillance illegitimate, where it is still happening. The conversation around protections should look into why the technology is proliferating in such a way, and the pretext behind why it exists, or the claimed need behind why it exists.

And the pretext that law enforcement needs this kind of technology today to ensure that everyone is safe is completely false. We have not seen any evidence that this technology has helped in any way to maintain national security or make anyone safer. But we have plenty of evidence of it making people less and less safe and infringing on their rights.

>> TATE RYAN‑MOSLEY: Yeah, absolutely. That's a really interesting point. And Roger, I want to pass it back to you. What can tech companies do, and how are tech companies responding to, I would say, both increased surveillance and increased demand for access to citizen data, and also to this kind of policy moment? Tech companies are beholden to the laws that govern them. What are you seeing from the tech side?

>> ROGER DINGLEDINE: So tech companies are not a monolith. There are a bunch of different sides to the technology world. In terms of the huge companies like Apple, it's interesting to notice that Apple is mostly on our side, on society's side, in this, because their users want safety and Apple wants to give them safety. And it's actually in Apple's interest to give them safety, because if Apple had the ability to watch everything their users are saying over messaging, then Apple becomes a target for people trying to break in and harm those users.

So in this sense, we are aligned with groups like Apple. On the other hand, we haven't said the words "crypto wars" yet, but we have to look at history and the fact that governments have been asking to weaken security over and over for years. For example, in the West, internet routers, the backbone pieces of the internet, each have a port called a lawful intercept port.

And the idea is you go to a judge and you say I want to be able to watch all the internet traffic going along this part of the internet because there's a bad guy and I want to be able to watch him. And the judge thinks about it and says okay, sounds good. And then you plug into the lawful intercept port and you get to listen to all of the internet traffic there.
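To make the capability concrete: a monitoring tap boils down to a listener that receives a copy of every frame passing by. Here is a minimal, hypothetical sketch of that idea in Python, assuming a Linux machine whose network interface is fed mirrored traffic; real lawful intercept ports are features of router hardware and firmware, not a script like this:

```python
# Hypothetical sketch: reading every frame arriving on a mirrored interface.
# Requires Linux and root privileges; "eth0" is an assumed interface name.
import socket

ETH_P_ALL = 0x0003  # capture all protocols, not just IP

sniffer = socket.socket(socket.AF_PACKET, socket.SOCK_RAW,
                        socket.htons(ETH_P_ALL))
sniffer.bind(("eth0", 0))

while True:
    frame, _ = sniffer.recvfrom(65535)
    # Whoever plugs into the tap sees everything that isn't encrypted.
    print(f"captured {len(frame)} bytes")
```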

Years ago I was talking to a group at the German Foreign Ministry, trying to figure out: should we regulate, as Rand was talking about, these spyware tools? How do we decide what counts and what doesn't count? There was an engineer from a Dubai telecom who said: you put the lawful intercept port in. When my prince in Dubai asks, what's that port, and I say, that's the lawful intercept port, he says, plug it in. The jurisdiction is wildly different, but the tool works the same in Dubai versus the U.S. versus Europe.

So to bring it back to Tate's question, part of what the tech companies need to think about here is that this is a recurring theme: governments keep asking for more and more access, more and more weakening, and there are side effects, such as having lawful intercept ports on backbone internet routers, which can be used well and wisely, and often are not. Every time we think about weakening safety for society, we need to think through where that's going to go in the future.

>> TATE RYAN‑MOSLEY: Yeah. And we're just about ready to take some audience questions, but Roger, I wanted to ask you to expand on that last point. When it comes to thinking about this globally, as you said, technology doesn't know boundaries, and there is a competitive market for both spyware and privacy technology. How might we foster a global, encryption‑protecting framework for governance? Again, a big question for you.

>> ROGER DINGLEDINE: Yeah. So the answer isn't to make all of society less safe. That cannot be the answer. And it is frustrating that the U.S. and the UK and Europe are so excited to do that, and it's especially frustrating at the same time as each of these countries is joining the Freedom Online Coalition, the Declaration for the Future of the Internet, the Global Compact, all these acronyms we're hearing about at IGF this week.

We've got countries saying that they value safety for society, yet here they are trying to pass these laws each year. So I guess it can't be mass weakening. A lot of countries then look at the targeted attacks, the ones Rand was talking about, where they go to some Israeli company and buy the ability to break into their specific target's phone and bypass encryption and other mechanisms.

And in a sense, that's better. At least it's not a mass attack. At least it's not harming everybody. But the reality there is that we keep seeing these targeted attacks being used not just against journalists and bloggers and activists, but against French politicians, members of parliament in Germany, and so on. So I'd like to live in a world where the targeted attacks are the better answer, but that seems like a pretty bad answer also.

I guess as a technology person, I'm good at explaining why things won't work. But the best solution I have is that we need to maintain strong security for all of society, meaning we need encryption to work well. And as Rand was saying, we need to start regulating and deciding what these small arms dealers of the software vulnerability and exploit space are allowed to do.

We could go on and on about this, but I'll pause for other people to jump in.

>> SHARON POLSKY: And I'm going to do just that, because I think for the people who are going to create the regulations, if they don't have a proper, correct understanding of what it is they're regulating, and of the impacts of not regulating or of regulating in a certain way, if they don't get it, then regulating is going to be a Band‑Aid approach.

The long‑term answer, which should have started many, many years ago, is education from the youngest grades: not just in how to use a computer and these wonderful devices that provide convenience for the good among us and the opportunists among us alike, but education about everything from how laws are made, that is, democracy, to the different types of political structures.

Give people the education so they can make critical decisions and grow up to build systems that don't reproduce the very same problems we're tackling and struggling with now.

>> ROGER DINGLEDINE: We need to normalize what encryption is. One great success story is HTTPS. Governments and law enforcement used to say: but if everybody has encryption when they go to websites, society will collapse. Think of the children. What would happen if we weren't able to watch what you do? Now, when you do online banking or log into the IGF website or any website, you use HTTPS. It's normal.

They fought that fight. We won. Let's look to that as an example where we need to somehow figure out how to make society safer for the next round also.

>> TATE RYAN‑MOSLEY: Yeah, and I want to pass it over to Rand to get your perspective. What can we do to take a positive step forward globally?

>> RAND HAMMOUD: The answer is quite a bit simpler than many policy‑makers would like to hear, because they would prefer it to be a complicated matter and use that as a reason not to pass progressive laws. But really, the international standards we already have are quite strong. We already have many rights‑respecting laws. When we look at international standards for due process, fair trials, freedom of expression, and so on, they already render these surveillance capabilities illegitimate.

The kind of surveillance that would be enabled by undermining encryption assumes that everyone is guilty until proven innocent, which is the opposite of what should happen. It brings to the attention of the state people who are not guilty of anything. So it already is, in a sense, an unlawful kind of attack.

So really, what we need to do is enshrine in an international framework what surveillance and encryption mean, inspired by the spirit of what we already have, which is strong international protections for our rights as they stand.

>> TATE RYAN‑MOSLEY: Mhmm. Yeah. And I don't mean to put words in your mouth, but it feels like the approach you're advocating, treating this at the level of infrastructure, is similar to what has also been applied to areas like anticensorship technologies and that space as well.

So I want to pause and see: are there any questions, online or in the room? If you're online, you can just add them to the chat, and in the room, please make yourself known and Al will take care of you.

>> ROGER DINGLEDINE: We've got some hands in the room, so go to the microphone.

>> AL SMITH: Go to the mic and get in line, please and thank you. Please introduce yourself before your question. That would be great.

>> AUDIENCE: Hi. I'm Honda. I worked at Google on the safety function and policy implementation for a while, and then I founded the Internet Observatory. I have questions for the three of you. For Roger firstly: when we talk about protecting human rights activists, I feel like the conversation sometimes assumes a functioning democracy and a functioning government.

One that is really willing to protect its citizens. That really doesn't apply outside of western Europe and the U.S. What we see and hear, for example in Turkey, is that when any encrypted app is found on someone's phone or computer, that can be used as evidence to support a case that they are doing something illegal.

So if I'm using Tor on my computer, that can endanger me. It might be safer in terms of surveillance, but it's not safe when we talk about the tools of oppressive governments, for example. So I was just wondering, when you are talking about human rights activism and protecting democracies, is there any context or any information that you draw on about how autocracies work?

These countries learn from each other, so any law that pops up in one country is likely to be transferred. For example, as people who work in technology and human rights and democracy, we sometimes do not suggest apps like Telegram or Signal. They're encrypted, they're open source, but they might put you in more danger because of this. That's my question to Roger.

I also have a question for Tate, actually. I follow MIT Tech Review, and we have a lot going on in the Middle East and other countries, but these things are not reported often. There might be an issue in the U.S. and it will get a lot of news presence, but when the Turkish government, or any government, makes a big request of a tech company and the company complies, or some other such thing happens, I feel like these things would get a lot more coverage if they were happening in Western countries.

But like I said, when a problem happens in one country, it is not just for that country; it's probably going to be replicated. If a law against encryption pops up in a certain country, it is very likely to be replicated in a similar geography. So I was wondering if you have any insight on improving the coverage, on going beyond the Western lens on how human rights issues and activists could be protected.

And for Sharon, sorry, yes, I'm bad at remembering sometimes, but my question: you mentioned you talk a lot with government bodies and about your interactions with them. What percentage of your work actually focuses on holding big tech companies accountable? And is that part of your perspective? Because big tech compliance with autocratic governments is growing a lot.

And these companies really want to earn a lot of money, and they are willing to give up every single human right. For example, Messenger is encrypted, but we have learned from Facebook officials that they do actually hand over information, chat information, once it's requested. And these are not requests based on security reasons; it's not a request to identify someone who has been missing for a while.

They are politically motivated. These are my three questions to you. Thank you.

>> ROGER DINGLEDINE: Should we try to answer them now, or should we take more questions? What's the right way to do this? Okay. So, you're absolutely right that there are not as many functioning democracies in the world as we would like. In fact, if you know of a good functioning democracy, please let me know.

In terms of the safety of having tools like Tor installed in dangerous places, there's actually a really interesting synergy, because Tor is not just for resisting surveillance, it's also for resisting censorship. In a lot of countries like Iran and now Russia and Turkey there's a lot of censorship.

So the average Tor user in Iran is using it to get to Facebook because they blocked Facebook. That means the average Tor user in Iran is an ordinary Facebook user just using it to get around the censorship. Yes, there are some political dissidents, but the average user is an ordinary citizen, which is an important security property for having these tools.

And similarly, as the whole world moves to not just Telegram or Signal but WhatsApp and iMessage, and more ordinary tools get real encryption, it becomes a normal thing that everybody has, not a sign that you're a political dissident. So you're absolutely right: the tools need to become pervasive and ordinary in order to be safe.

>> TATE RYAN‑MOSLEY: I can briefly answer the question. And a reminder, so that we can get to all the questions: let's all try to be brief in our responses. Thank you so much for that question. It's a very important one. I don't know if I can give you a very satisfying answer other than: it shouldn't be that way. And I as an individual reporter, and we at MIT Technology Review, are trying to be better about this.

I think, frankly, you get into all of these issues with journalism, local journalism, and journalism business models right now, and racism, and where people pay attention and who people pay attention to. Those are all parts of the answer to your question. But certainly the press can and should do better at covering countries outside of the West.

And so thank you for encouraging me to do so. And feel free to send me tips at any point as well and I will do my best to cover international stories more.

>> SHARON POLSKY: I appreciate your question. Do we deal directly with tech companies? No, we tend not to. We deal with putting on the record what is going on. So, when we spoke to the Canadian parliament about facial recognition or about spyware, we put on the record the billions of dollars, the statistics from industry, as to what those industries, cybercrime, spyware, contribute to an economy.

And often it's larger than some nations' economies. And we put on the record what the impact is. And, of course, it's very simple: as you said, companies are not interested in your privacy or mine. They are interested in providing the greatest return possible for their shareholders. That is their reason for being. And this isn't singling out one company or another, but for them to say, we take your privacy seriously, we will protect it, I think that's a promise that nobody should try to make, because it's inevitably going to fail.

We need to see governments recognizing what the problems are, realizing that the tech companies ‑‑ yes, they certainly do provide employment and innovation for perfectly legitimate and wonderful purposes. Using AI for medical advancement, that's great. Using AI so I can pay whatever the fee is today to spit into a vial and have my DNA analyzed by a company in the United States that says in their so‑called privacy policy online, we will protect your privacy, and then they are breached ‑‑ this just happened.

And millions of people's genetic identities have been spirited away. You can't change your genetics; you can change your password. Do governments understand? Do the bureaucrats, lawmakers, and policy‑makers understand? No. When it happens to them, that, I find, is when things might start to change. So we do a lot to increase their awareness of these risks.

>> AL SMITH: Thank you for these questions and answers. A question from this line.

>> AUDIENCE: Good morning, I'm from Japan. This may be a bit extreme, but it's related to the previous question. Do we have a plan of action for when an encryption backdoor is somehow mandated? We finally avoided the worst with the UK bill, but I think the fight will continue, especially in Japan, or anywhere. Thank you.

>> ROGER DINGLEDINE: Yeah. So, do we have a plan of action for when backdoors are really, truly required? Is that the question? We will never put a backdoor in Tor. We will never undermine Tor's security. I don't care what the laws say. So we're going to have to wrestle with whatever the political and policy implications of that are.

We've got EFF, ACLU, a bunch of legal organizations in Europe and the U.S. and around the world who want to fight these things. And I hope they succeed. We will never weaken Tor's security.

>> SHARON POLSKY: If I can add to that, I think the most important part is that people are now becoming aware ‑‑ and I don't mean just people in technology, or in the privacy realm or certain policy‑makers. I mean the general public has gotten fed up with seeing their personal information monetized.

They are starting to ask questions. I'm working with some people who are developing systems that will completely change the dynamic. No longer will you have to submit to whatever the so‑called privacy policy is on a website. You will have control over whether, when, how much, and to whom your personal information goes. You will be in control.

That will flip things around. Companies aren't going to like it, but when the people who are their bread and butter say, we've had enough, they will have to change how they do things. And that is going to be a plan of action en masse.

>> AL SMITH: We have four more minutes. I want to try to get both of these questions in. If the answers could be brief, that would be great. This line next.

>> AUDIENCE: Thank you. Andrew, I'm a consultant on internet standards and a trustee of the Internet Watch Foundation. A couple of quick comments. The title of this discussion is about human rights, but it's mainly been about privacy. We've largely ignored surveillance capitalism. We've focused on evil governments, which deflects attention from what the tech sector does itself.

We've ignored the rights of the victims of CSAM to focus on the rights of others at their expense. We need to acknowledge and talk about that. On treating privacy as a right: in Europe it's a conditional right, while other human rights are absolute rights. Often, now, we're protecting the conditional right to privacy at the expense of the absolute rights of people whose other rights are being infringed, such as the CSAM victims.

We need to acknowledge that a blind use of encryption can weaken privacy. When you apply encryption to internet protocols, that can actually weaken cybersecurity, and if you don't have good cybersecurity, you have no privacy, even when you think you do. That's a significant problem.

We need to acknowledge that most of the tech companies ‑‑ not the ones here, probably ‑‑ are not defending my human rights. They're defending their revenues, because they are encrypting the data that they extract from my endpoint when they surveil me, and they don't want their competitors to access that data. That's why they want the encryption, not to protect my rights; that's just a convenient byproduct used to justify the encryption.

And finally, acknowledging the comment you just gave on Tor's position on backdoors: almost all of the big tech companies absolutely compromise their approach to privacy in order to have market access in some of those very problematic states. So you don't have Private Relay in China, because it's illegal. But they will cheerfully ignore the laws in democracies while complying with the laws in more autocratic states.

And I think that's pretty problematic as well. Thank you.

>> ROGER DINGLEDINE: We could definitely have a whole session on surveillance capitalism and the evils of large tech companies and how they primarily attempt to maximize their profit rather than actually caring about their users. One of the points we tried to make here is that there are some synergies, some overlaps, where at least in this case Apple is interested in privacy, first of all because it's good for marketing.

People are asking for it these days. But also because it gives them less surface area for attack, so they don't have as much to worry about from people trying to attack their users. But you're right, that doesn't make Apple great. It's an excellent point that many tech companies choose to design their approaches with China, Russia, Saudi Arabia, India, all the interesting big markets around the world, in mind.

And that causes them to do bizarre and dangerous things for their users.

>> AL SMITH: I think we have like one minute left, so if you could ask your question, hopefully we can fit in an answer.

>> AUDIENCE: I just want to build on the first question, really, and ask about the mechanics of advocacy in different countries and parts of the world. Tate, one of the examples you mentioned was India, and I'm wondering whether there's a sense in which you need to adapt the messaging and the arguments around this to different parts of the world.

>> TATE RYAN‑MOSLEY: Rand, do you want to take that? I feel like you have a good perspective, better than I would, certainly.

>> RAND HAMMOUD: Yeah, sure. That's a very good point. Using the same narratives in different contexts isn't really fruitful; it's not as productive as you would hope. When we are trying to do advocacy within autocratic states that have no regard for human rights, we cannot use a human‑rights‑based argument. That is when you talk about national security and how encryption is also in the interest of the state, or use economic arguments: there is business espionage; how do you protect the economic advantage, the competitive advantage, of certain companies?

And that's when other companies come on board and, as Roger was saying, try to become allies in this space. So it is definitely incredibly important to make sure that we are using the appropriate narrative within the advocacy spaces that we are in, but also to be very mindful that the advocacy avenues in some contexts are just not there.

It is really difficult to talk about a rights‑respecting framework for the use of surveillance technologies in autocratic states, or even in democracies these days, which is why we need to look at it as a more global or international framework, because you cannot depend on the jurisdiction where this technology is utilized. The technology, the infrastructure, is there, so we cannot control how well or how badly it is utilized. That's why we need an international framework for its use.

>> TATE RYAN‑MOSLEY: Thank you so much to everybody for all of your questions and comments, and to all the panelists and Al for participating in today's panel. I hope you all learned something. I certainly did. And I hope you have a great time at the rest of the day's events.

(Applause)