The following are the outputs of the real-time captioning taken during the Tenth Annual Meeting of the Internet Governance Forum (IGF) in João Pessoa, Brazil, from 10 to 13 November 2015. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> NICOLAS SEIDLER: Hi, everyone. We are going to start in about two minutes if you could please put your headphones on, thank you.
So hi, hi, everyone. Again a reminder for those that don't have their headphones, to put them on. It's a challenging room.
>> You can't hear without headphones.
>> NICOLAS SEIDLER: I think people are putting them on. Let's do a test. If you can hear me, please raise your hands now. That's pretty good, isn't it? I know you are all starting to get hungry. It will be a short and punchy discussion. We will discuss one of the hottest issues in the relationship between security, privacy and freedom of expression, and that is the tension between law enforcement objectives, for example using data to investigate crimes or prosecute criminals, and, on the other hand, encryption, which is a way of encoding information so that only authorized parties can see it.
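[Editorial illustration, not part of the session transcript: a minimal sketch of the kind of encryption Nicolas describes, where only a holder of the key can read the message. It assumes Python's "cryptography" package and its Fernet symmetric-encryption recipe; the message text is invented.]

from cryptography.fernet import Fernet

key = Fernet.generate_key()        # shared only with the authorized parties
cipher = Fernet(key)

token = cipher.encrypt(b"meet at the cafe at 10")  # what an eavesdropper sees
print(token)                                       # opaque ciphertext
print(cipher.decrypt(token))                       # readable only with the key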
So on the panel, we have Frank Pace, Sergeant at the Phoenix Police Department, involved with cybercrime, also involved with the FBI's cyber taskforce.
We have next to him Mike Nelson. Mike is working on Internet-related issues at CloudFlare, so he is going to bring us a broader perspective on those issues.
Next to Michael, Ted Hardie, Director of the Internet Architecture Board, which is a group of engineers developing key technologies and protocols for the Internet. They have focused a lot on security, so encryption is really relevant here.
Next to Ted, Xianhong Hu from the Freedom of Expression unit at UNESCO. UNESCO is about to launch a report on encryption‑related issues.
And finally at the top end, Carly Nyst, who has been involved in anonymity issues.
So after Snowden, there was a rise in demand for keeping communications confidential. On the other hand, some argue that the prospect of encryption everywhere could actually undermine the capacity of governments and law enforcement agencies to investigate crimes and achieve public safety objectives. And you might have heard that some countries have even proposed legislation to limit the use of encryption or to ask for special access for law enforcement authorities.
So how do we reconcile those views? The session that we are in today will actually have a twist. We are going, as a starting point, to imagine a world of pervasive encryption. We are going to imagine a world where all communication is encrypted and start from that and think about the implications of that for users, law enforcement and other actors. We are not there yet, for sure. But there are some trends in this direction. Approximately 45 percent of traffic on mobile networks is encrypted. As of October this year, 30 percent of the Internet's most popular websites have implemented HTTPS, which is a secure way to visit websites, as well. Just in the past few months, several big ICT companies have even enabled encrypted communication for their services, such as Apple or WhatsApp.
Finally both among the Internet community and civil society there have been pretty vocal calls to have full encryption to restore user trust.
So, let's try to play this game, imagine this world of pervasive encryption. It is not in a galaxy far, far away; it's down on earth. And the first question I would like to ask the panelists to address is, well, how could we get there? First of all, when do you think it could be? Do you think it could be next year? In five years or ten years? And what do you think could be the steps that could lead to this scenario? User demands? Business conduct? Country leadership? Technical push? What could be the steps that could lead to this hypothetical scenario? The floor is open. Michael?
>> MIKE NELSON: I'll start. I'm here for a couple reasons. One is that CloudFlare protects about 4 million websites around the world, preventing DDoS attacks and also providing end‑to‑end encryption for free to any of our customers that want that.
But the other reason I'm here is that more than 20 years ago I worked in the Clinton White House on what were called the original Clipper wars, where we had this exact same debate: A new technology from AT&T was being released to the market to provide military‑strength encryption for $3,000, embedded in a telephone that businesses could use all around the world. And this was a huge concern to the FBI and to the NSA.
The privacy advocates said this was a great thing. The head of the FBI and the head of the National Security Agency were very concerned. And the end result was a proposal for encryption with a backdoor. It was called the Clipper chip. And my job was to talk to industry about why this was a great thing.
It did not work.
The interesting thing for me was that encryption did not spread worldwide within five or six years. I wrote a piece in 1998 for the Aspen Institute called "Sovereignty in the Networked World." And I predicted that by 2015, we would have the scenario that Nicolas just mentioned.
I think one reason we haven't is because encryption is like antibiotics. You really don't know whether it works. You have to trust that it works. And if it doesn't work, you could end up dead. But the fact that you can't test the product to know that it's actually working makes it harder for people to put their faith in it. And the Snowden revelations, which revealed that some of the encryption products that everyone trusted were not really secure, led to a lot of additional concerns.
I think what's going to drive this, though, is probably two things. One is going to be companies like Apple realising that promising absolute strong encryption to their customers is a competitive advantage, and the second, I would predict that in the next two or three years there will be some amazing scandal where some very important political leader or CEO is embarrassed because what they thought was a secure communication was not. And maybe billions of dollars will be lost. And suddenly there will be this huge shift in how we think about these issues. We haven't had that yet, but in Washington, at least, so much of policy and political perception and culture, actually, is driven by that one story, that one incident.
As an example, in the United States we have the world's strongest protections regarding your videotape rental records because more than 20 years ago, a nominee for the Supreme Court, Robert Bork, was under consideration for the post, and somebody dug up his video rental records and made them public. And every member of Congress got very scared that that could happen to them. So it may just take one good story, and the market and the politicians will kind of shift to realising that end‑to‑end strong encryption is really a good thing, that it's about preventing crimes using encryption, not just about protecting criminals who use encryption.
>> NICOLAS SEIDLER: Thanks a lot, Mike. Yeah, there's no such thing as a good crisis or scandal, I guess.
I forgot to mention Sanja is also with us from Freedom House. I guess my vision was tilted towards the right side.
Ted, please let us know your thoughts on this.
>> TED HARDIE: So I think you asked two different questions. I want to answer them in turn. One is how long will it take? And the second was how will we get there?
I think it will take between 5 and 10 years to get the bulk of it done. I think there's a long tail here of things that will take much longer, but we can get to a world where there is pervasive encryption at least available, if not the default, fairly quickly, and our aim is to get it to be the default.
And I think the reason we want it to be a default kind of comes in three subreasons. One is if it's the default, you can't be targeted for using it because everyone is using it. And that's an important part of the defense we need for the people most at risk.
The second is if it's the default, the systems will be built to make it easy. At the moment, encryption can be quite hard to use in some cases, and making it the default is a way of making sure that everybody involved in deploying it makes it easy enough to use that they don't get tech support calls on it.
And the third is: I agree with Mike that we will probably see calls for it to counter the calls we're seeing now for increased transparency because of signals intelligence desires.
And I was struck earlier this week reading The Guardian and its analysis of the proposals from Prime Minister Cameron that said "all of these proposals make the citizen transparent to the state. And there could be no better tool for tyranny."
And I think all of us look at this and say, look, we're not just wanting to be citizens of a particular state and good citizens of it, we also have relationships with friends, with colleagues; and each one of those has its requirements for confidentiality. I don't need to share with you when I tell my husband that I love him. I don't need my company to share with you when it's considering a business deal with another company. Each one of those requires the ability to set up confidentiality. And I think when we talk about encryption, what we're actually talking about is an enabler of a communication that is confidential to the people who are having the communication. And when we look at the Human Rights goal here, that's key to Freedom of Expression and it's key to freedom of association.
And I really commend some of the people on the panel like Carly and Sanja for raising this up in challenging environments. But honestly, Mike may be right, we may need a good scandal to take advantage of to make sure that everybody realises it could hit them. And, frankly, I look at this and I realise that people are self‑censoring in the world now because they know about pervasive surveillance and we want to restore the balance so that people can have the conversations online that they could have down the pub, that they could have in their bedrooms, that they could have in a boardroom. Thanks.
>> NICOLAS SEIDLER: Thanks, Ted. So noted 5 to 10 years. I'll take your word on that.
I have Carly, Xianhong and then we'll continue. Sanja and Ted.
>> CARLY NYST: Interestingly when I was making notes in preparation, I also wrote scandal at the top of my list. I do think that it's a very important part of policy change.
I thought about this from the perspective of a lawyer and a person who works in Human Rights. And in addition to the necessary technical and political developments, I see this being achievable in the next 5 to 10 years, as Ted said.
And when I thought about it from the present where I stand today, thinking about all of the incredible things that are happening in the world of Human Rights law and surveillance law, I thought, in fact we could potentially look back in 10 years' time and say "the thing that made this happen, the trigger point, was the fact that the UK government advanced legislation in 2015 which had a provision which might allow the surveillance authorities to require the removal of encryption." That exists. That law was published last week in draft form.
And we may in 10 years' time look back at that as the turning point that really gave the impetus to lawmakers and politicians the world over to say ‑‑ and companies, mind you ‑‑ say actually, no, this is not an appropriate activity of a democratic country.
So looking back from that 10 years' perspective, standing in those shoes, I think we might say what happened was the UK government enacted this legislation that had a broad provision to allow for the removal of encryption for the purposes of surveillance. Companies became fearful that that provision might be used against them. And when it was used against them, they stood up in court and they said no. And companies speaking out in this space is a hugely important thing. They challenged that provision in court. The case went through the courts up to the very highest courts in Europe. It went to the Court of Justice, to the European Court of Human Rights. And that court said it is not a legitimate activity to require the removal of encryption. Encryption protects privacy and free expression, and we say this is fundamentally not allowed under international human rights law.
So looking at it from a legal perspective, I could see how a court could one day make that decision and that will be the last barrier to come down to prevent ‑‑ to stop the spread of pervasive encryption. So I think that's one way we might get to that point.
More generally, I think, buttressing that, the way I wrote it is: pressure sideways, upwards and downwards. Upwards pressure from the users, requiring better, more private and secure products. Sideways, governments putting pressure on each other. I can see how potentially a government like the United States, with its interesting mix of having many of these companies living within its jurisdiction, might actually become an advocate for more pervasive encryption and put pressure on governments such as the UK to protect encryption rather than undermine it.
And then I think we will see downwards pressure, and that would be in the form of human rights courts, human rights institutions and other mechanisms saying: actually, human rights law requires that encryption be protected.
>> NICOLAS SEIDLER: Thanks, a lot, Carly. So a lot of pressure on Ted, sorry, Frank, to respond afterwards.
Shortly, please, Xianhong, Sanja, Frank, and then I'd like to move to the next section. Thanks.
>> XIANHONG HU: Thank you. First, UNESCO, in line with the position of the UN Special Rapporteur, did a report on encryption and anonymity, and we perceive the challenges at the international level, because if we envisage ubiquitous encryption we need Member States, all the countries and governments, to have the political will to agree on supporting legislation and a regulatory framework on encryption.
And that's not in place yet. And I'm not sure if five years will be enough to get there. But the good news I can share is that yesterday, at UNESCO, we were having the 38th General Conference, which endorsed a new resolution on UNESCO's future action, including one point that recognizes, and the language is very important, the role of encryption and anonymity in supporting freedom of expression and privacy.
It's a good start, but we need time to translate it to the national and regional level, and that takes more time. I will say that, since this echoes the UN's post‑2015 development agenda, maybe in 15 years, when we are reaching the end of the 2030 Agenda, it can be a goal and we will get there.
And the last point, that I also agree with ‑‑ that encryption is not sufficient. It goes hand in hand with anonymity. This will add more challenges to it. Okay. I stop here.
>> NICOLAS SEIDLER: Thanks a lot, Xianhong. Sanja, based on what you found in the 2015 Freedom on the Net report, are you hopeful we will enter that encrypted world?
>> SANJA KELLY: That's an interesting question because I agree with some of the other panelists but only in part. I believe that encryption could become pervasive in many countries that are willing to enable that sort of environment. But what I'm concerned about is that in much of the world, based on our research, actually the authorities are trying to crack down on the use of encryption. They are trying to either limit its use through new legislation or they're trying to institute backdoors even if encryption officially exists within their legal frameworks.
So I think for me the fundamental question here is: if encryption is pervasive, will it be pervasive in all countries? And unfortunately, for the time being, I really don't see that in some of the more repressive countries, whether that be China or Iran or elsewhere; they really have a poor record on privacy and surveillance issues.
>> NICOLAS SEIDLER: Thanks, Sanja. Frank?
>> FRANK PACE: I'll bring up the rear here. In regard to the first question, I would concur with most of the panelists that we would probably expect within the next decade to have a more pervasive deployment of encryption globally. And quite frankly, I think from the perspective of law enforcement, in general that may be what is needed to ultimately get to the solution which I think we need in order to achieve our objectives.
And what I want to start with is to delineate the difference between what law enforcement does in the investigation of crimes and what the intelligence services do as a matter of bulk data collection and the interception of signals, whether encrypted or not; their objectives are very different. Although at times our paths may cross, the point is that when law enforcement needs access to encrypted information, that is because we are investigating a specific crime. And the evidence of that crime is often contained within either a device or in the cloud, and we need it in order to successfully perform an apprehension and to successfully prosecute.
And what I think is of most concern to us is that with pervasive encryption, you no longer have the argument that most criminals are not smart enough to use encryption, because they're going to be using it by default anyway. And when that occurs, that, in turn, makes our need to successfully prosecute crimes more pressing, because now we're going to see encryption more often than not.
And when it comes to the access of encrypted information for law enforcement, we're usually talking about data that's at rest. And that's of course meaning when we have a mobile device, we have a computer, we have a server, we have something that is containing data that has already been encrypted, so at that point now if we have access to that, we may or may not be successful in our investigation. The interception, when law enforcement does so for the purposes of a criminal investigation, usually is only a part of what we need to have a successful prosecution in that investigation.
A good example would be the interception of images related to child sexual exploitation. The interception of those hash values, obtaining the evidence that points to our probable cause to further investigate the suspects in those crimes, does not mean that that alone will be enough for us to successfully do so.
It means that it gives us the evidence that we need to then go to where we now will find more data at rest. So I want to make that distinction between the two.
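[Editorial illustration, not from the session: a minimal sketch of the hash-value matching Frank refers to, in which a file's cryptographic digest is compared against a list of digests of known material. The hash list, file paths and the use of SHA-256 are assumptions for illustration only.]

import hashlib

KNOWN_HASHES = {"hypothetical-hex-digest-of-known-material"}  # placeholder entries

def sha256_of(path: str) -> str:
    # Compute the file's digest without loading it all into memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_known_material(path: str) -> bool:
    # A match supplies probable cause; as Frank notes, it is not by itself the whole case.
    return sha256_of(path) in KNOWN_HASHES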
But more so, I think what needs to be part of the conversation ‑‑ and as Michael had pointed out, the Clipper chip idea did not work out. And now we're at a point, in that discussion of ubiquitous encryption, where I don't think the argument just needs to end with "it needs to be there and there's no access to it whatsoever," because I think even within law enforcement we all agree that we want to protect the right of freedom of expression, and of special concern would be developing nations and parts of the world where, yes, there are members of law enforcement and members of government that are going to use the same access that we would have in countries like the U.S. for methods of oppression of those people. So where do we go from here? My point would be to bring up the topic of TTPs, trusted third‑party key escrow platforms, where we don't have to ask for, nor do we support, the idea of coming up with some new backdoor. But at the end of the day, there is the need for public safety, there is the need to secure our societies. And when we have an absolute inability, as a norm, to have access to that information, then we are going to begin to fail on that. So I think that's where the discussion needs to go.
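[Editorial illustration, not a proposal from the panel: a simplified, hypothetical sketch of the trusted third-party key escrow idea Frank raises, again assuming Python's "cryptography" package. The content key is wrapped once for the user and once for an escrow agent, who would unwrap it only under lawful process; other panelists argue that any such extra copy of the key enlarges the attack surface.]

from cryptography.fernet import Fernet

content_key = Fernet.generate_key()
ciphertext = Fernet(content_key).encrypt(b"data at rest on a device")

user_key = Fernet.generate_key()     # held by the device owner
escrow_key = Fernet.generate_key()   # held by the hypothetical escrow agent

wrapped_for_user = Fernet(user_key).encrypt(content_key)
wrapped_for_escrow = Fernet(escrow_key).encrypt(content_key)

# Lawful-access path: the escrow agent unwraps the content key under a court order.
recovered_key = Fernet(escrow_key).decrypt(wrapped_for_escrow)
print(Fernet(recovered_key).decrypt(ciphertext))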
>> NICOLAS SEIDLER: Thanks, Frank. I have Ted who ‑‑ just a second. I think some of the points that Frank made are really interesting: law enforcement is looking for some type of access to encrypted data, so I'd like to hear from Ted. Do you think that's technically possible? And then Mike.
>> TED HARDIE: So actually I raised my hand to answer a slightly different question but let me answer yours first. Is it possible when somebody has done encryption correctly for law enforcement to have access to it? And if we're talking about encryption with forward secrecy, the answer is no. That's why it was good encryption. And if we're talking about things at rest, they need some credentials. They need some mechanism to get the credentials so it can be retrieved from the device you're talking about.
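[Editorial illustration of the forward secrecy Ted mentions, not part of his remarks: each session derives its key from ephemeral key pairs that are then discarded, so a later compromise of long-term keys cannot recover past traffic. The use of X25519 and HKDF from Python's "cryptography" package is an assumption for the sketch.]

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Ephemeral key pairs, generated per session and discarded afterwards.
alice_eph = X25519PrivateKey.generate()
bob_eph = X25519PrivateKey.generate()

# Both sides compute the same shared secret from the ephemeral keys.
shared = alice_eph.exchange(bob_eph.public_key())
assert shared == bob_eph.exchange(alice_eph.public_key())

# Derive a one-time session key; once the ephemeral keys are deleted,
# nothing retained later can reproduce it.
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"session").derive(shared)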
And I think that there are two points I want to make about that, one of which is the very important distinction that Frank made between pervasive surveillance and targeted surveillance. Pervasive surveillance is a mechanism by which everyone's communications are pulled into a big data store and people trawl the big data store looking for things of concern. That's obviously of concern to everyone because we are all in that big data store along with everybody else.
Targeted investigations, however, have a different character. And that character is, in part, that they can go to a judge, or within whatever legal construct they have available, and ask for permission to begin a targeted surveillance which can use fairly old-fashioned methods. I mean, if we think about what happened before the technology developed for the signals intelligence community was available, people actually, if you were investigating a gang, had to put somebody inside the gang and get them to tell you what was going on. You had to have actual people following people around. And I think one of the things that the legal community has had to face in this is: at what point does the change in technology cause a change in the actual legal framework? And I love the tiny constable theory. This was the decision that said, look, placing a GPS tracker on somebody's car is equivalent to giving a tiny constable constant access to this person. And in the old days you would have had to put in seven constables and four cars. And the sheer cost of maintaining that would have meant it would have been something that you only did when you had real probable cause.
You get it down to a $50 dongle that you can put on somebody's engine and walk away, and the cost differential has made this something where a targeted investigation is no longer required and you can cast a very broad net.
What we're asking for, as technologists and as members of civil society, is that when these targeted investigations occur, they face the same barrier to intrusion that was there before the Internet existed. It should be as difficult, frankly, for somebody now to go in and listen to me having a conversation in my home, or me having a conversation with my spouse, as it was before the Internet existed.
We have this amazing thing which has given us the ability to create communities and have conversations across great distances, but the social aspects of that should not change as a result of the technical aspects. And so I think we look at this and we say, yes, it is going to get harder for them to use technical means to intercept this communication or to get it at rest. But it, in fact, returns them to the status quo ante, where status quo ante means pre‑Internet, and that's not a bad thing.
>> NICOLAS SEIDLER: Thanks a lot, Ted. Mike, and then I'd like Frank to maybe address other means other than encrypting data.
>> MIKE NELSON: I just wanted to add a little bit because I'm very aware of the fact that Frank is sort of on one side of the issue and most of us are on the other.
To be fair, I thought I would put on my old hat which was when I was at the White House trying to make some of the same arguments that Frank made. And to say, look, this is really important. I was quoted a few times saying, this is so important that if we don't get it right, thousands of people will die. And the privacy advocates ridiculed me and thought I was taking hallucinogenic drugs, but three of them called me up on 9/12, the day after 9/11 and said maybe you knew something. There are real threats out there. This is a real problem.
And this is the other scenario I worry about that as we roll out strong encryption and we're two or three years into that, there is going to be something even more horrific than 9/11. And the wrong response, of course, would be: Let's roll back encryption. Let's ban encryption. Let's put in backdoors. The right response will be to look at all of the great tools that law enforcement could develop using the digital technologies that are pervasive that don't require cracking encryption.
I actually argue that in the scenario you described, we could see strong encryption enabling a lot of new technologies, from digital cash to sensors to connected cars. All of those things are going to generate a huge amount of data, which could be used to monitor criminal activity without breaking encryption. So there's a win/win here, I think. We have strong encryption everywhere to prevent cyber crimes, but we also generate a lot of data with which proper warrants could be used to track down the bad guys. This can be a good story. We have to worry, though, about the bad story, which is a catastrophic event where it turns out the criminals used encryption. And there's a political firestorm. And a knee-jerk reaction to somehow pull back on encryption.
>> NICOLAS SEIDLER: So it's interesting you mentioned strong encryption. I'd like to hear afterwards from the civil society folks here as well: what's your definition of strong encryption? Is it warrant‑proof encryption? Or is it encryption that would still have some holes for law enforcement agencies?
So I'd like Frank, if you could, please, to address and provide some guidance to the audience on the other means that you may use. And then I'd like to open the floor to questions from the audience.
>> FRANK PACE: Sure, law enforcement certainly has alternative means. And I think everyone that's involved in law enforcement, especially involved in investigations, you never lose the part of the human‑to‑human contact with speaking to the suspect of the crime, with speaking to their co‑defendants, speaking to their witnesses, speaking to the victims. And oftentimes when it comes to access to information that we may need, we often can get that through the other parties that are involved in those crimes. And I would agree with Michael that that's certainly always the first method by which we do attempt to get that information.
However, just as society has evolved, so has the way that criminals perform and try to hide the evidence of their crimes. And with that, when we do need that access, when we talk about alternative technical measures, we will refer to them as extraordinary measures, which would be the employment of malware or in the case of interception, can we do that? Sure. Is it practical and can we do it as often as we would need to have access to the information with the volume of crimes that we investigate? No, it is not. And in the U.S., we do thankfully have a high threshold that we have to meet legally in order to have that type of access. When we write a search warrant for legal authority, we do have to express our probable cause. But we go a step further when it comes to those extraordinary measures. And that takes time. And oftentimes time is of the essence when you're dealing with violent crimes, when you're dealing with kidnappings, when you're dealing with outstanding violent suspects.
One example, without really touching on what could be the next 9/11 or the next big catastrophe, which is oftentimes the flavor of the day and the issue in the media, would be common homicide investigations, where oftentimes now the criminals are communicating not by voice but by text. They're taking photos or they're keeping documents on their mobile phones. And in law enforcement, when we deal with violent crime investigations, almost every single one of those cases involves a mobile device.
So in that case, if the information is encrypted, we now no longer have access to that non‑voice communication that exists on that device. We now no longer have access to the location data where that device was when the crime may have occurred.
And what I'll also point to is we tend to be our own worst enemy when it comes to the media and the entertainment industry, because in the U.S. we refer to juries as having the CSI effect, after "Crime Scene Investigation," a popular TV show where these investigators go out and solve the most complex crime in 30 minutes. And they have everything that they need right there. And the prosecutor's happy. And all is good.
Well, the reality is that when we now go to juries in the U.S., they often demand that very thing, I would say dinner on a plate. They want everything. They want it right there. Because we saw that on TV; that's the way we saw it; you guys should have access to it. We watch everything that happens with the NSA; even though you're a local police department, you should have access. And oftentimes that ends up complicating our investigation, because we don't do that. We don't always have that access. And then we do, in turn, at times have problems with prosecution.
>> NICOLAS SEIDLER: Thank you, Frank.
Carly, Ted, I think you wanted to respond and then questions from the audience.
>> CARLY NYST: Just a really brief point on Michael's provocation around security. I think that we have to be firstly very clear that we obviously cannot eradicate all insecurity from our societies. I think the motivation of eradicating all insecurity that underlies surveillance techniques is very frightening in the post‑9/11 era. The idea that we can stop all terrorism, as a means of justifying greater surveillance of ordinary people, is a really pernicious thing.
Secondly, I think we should, as Frank pointed out, really try to divide up this debate as much as possible and not talk about security as one thing and encryption as another and means for undoing encryption as a third. Because, as Frank made the point, the device encryption which he's talking about, which is posing problems for investigations, has nothing at all to do with 9/11, with terrorist threats, with being able to use bulk data to ascertain movements and greater surveillance capabilities; nor was encryption a barrier to preventing 9/11. And I think we have to be really careful not to conflate these separate issues. I think it would do us all a lot of good, and obviously the people on this panel are involved in this debate, to really talk about: what are the targeted threats posed by greater encryption in particular areas of law enforcement and intelligence? What are the alternative ways we can access data in those circumstances? And I think Ted's point is entirely right. Are we talking about returning police to the situation prior to encryption, but when they were just having free-for-all access to data? Or are we talking about returning law enforcement to the situation prior to this massive explosion of surveillance that was never in compliance with laws, that simply outpaced legal development?
And I think we should be asking ourselves the question: did you have access to that information prior to the advent of smartphones? If not, there's no necessary imperative for you to have access to it now.
>> NICOLAS SEIDLER: I think Carly introduced managing risk and proportionality.
We just have 15 minutes left. I would like to take some questions from the audience here and then I know that we will have a few ones from the remote audience.
So please, sir.
>> Hello. My name is Mohammed, from protection from UAE in India. How far advanced are we in cracking the Tor network? I hear some news that the NSA and MIT can track down the Tor network. Because a lot happens through the Tor network. So are we still in control of the Tor network? Or is it still fully encrypted and cannot be cracked?
>> NICOLAS SEIDLER: Thank you. Actually, I will take just two or three questions. So, a question on Tor. Bob, you had a question?
>> BOB: I had a comment. Should I go ahead?
>> NICOLAS SEIDLER: Yes, please.
>> BOB: On the question of government having access to encryption algorithms: it seems clear to me that governments, like everyone else, don't have perfect security and IT practices. I mean, the recent thing about the U.S. government, where the whole database of U.S. government employees was stolen. And so if they can steal that, they're going to steal the keys. And so it's not just going to be governments who have access to this. So I'm strongly in support of encryption where only the end‑users have the keys. And there isn't some central database or some override key or something like that, because it won't just be governments.
And then this even ignores the point that, as we've heard in the Freedom House report, some governments have better records of protecting the privacy of their citizens' information and some just the opposite. So encryption algorithms with backdoors will be used for reducing freedom, not just solving crimes. So I think the law enforcement community is just going to have to do their work without this. And if they have focused investigations, they have other ways of collecting the data.
>> NICOLAS SEIDLER: Thank you, Bob. Maybe I'll just take a few more questions. The gentleman over there at the mic? You, sir, and then gyrus, and then I'll go to Raquel for the remote questions.
>> RAJESH BALLAL: Thank you. This is Rajesh Ballal. Association of Competitive Telecom Operators from India.
This has already been discussed at length: encryption is a very important issue. At the same time, from a government perspective, what do you think? Because we have also been in discussion with the Indian government for many years, and we have been asking for and lobbying for higher encryption. And the government is happy to give higher encryption as long as we give them a practical solution on how they would be able to decrypt whenever security threats are there. We haven't seen any real solution we can offer. So you have panelists who are experts on this. Can someone say what options we can recommend to the government in terms of decryption? Because today we have a practice of 40‑bit encryption and an approval process where you can share keys. But since there are dynamic keys, you cannot really share them with the government of India every now and then. So it's the kind of situation where no approvals have been given so far.
So I'm really looking for some kind of a practical solution what we can suggest or recommend to the government in terms of decryption which could be protecting a criminal.
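[Editorial aside on the 40-bit limit the questioner mentions, not part of his remarks: a back-of-the-envelope sketch, with an assumed guessing rate, of why such short keys are considered weak. The figures are illustrative only.]

keyspace = 2 ** 40                     # about 1.1 trillion possible keys
guesses_per_second = 1_000_000_000     # assumed rate for one commodity machine
minutes = keyspace / guesses_per_second / 60
print(f"{keyspace:,} keys; exhaustive search in roughly {minutes:.0f} minutes")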
>> NICOLAS SEIDLER: All right. Is there any kind of special access that governments can have? I'll take two questions and then we'll have the first round of responses. Sir.
>> PATRICK CURRY: Patrick Curry, EU project mapping, Internet Governance.
I'm going to suggest that the situation's actually moving faster than this and that we're having yesterday's discussion here.
So the first point I'd make is that encryption isn't the only measure for protecting information. There is also metadata extraction, which changes the paradigm even more, in a different way.
But the point I really want to make is that new technologies coming into smartphones, particularly around the trusted execution environment, trusted user interface and other things in there, are going to shift the balance of trust into the user's hands in a way we've never seen before.
Right now, the toolkits to support that, and the isolation of the operating system from the application layer, are going through final certification and will be mass deployed on some technologies in the very near future, I'm talking months.
The side effect of that is that things like Bitcoin transactions and chats will be secure end‑to‑end. There is nothing to see in the middle.
So the question then comes: as you give this to the user, a bit like giving them a democratic vote, can they use that technology wisely? Not just for themselves and the consumer society, but also for society at large. And this begs much bigger strategic questions about the risks associated with giving so much choice to users under the assumption that they're always going to use it wisely.
So my question here is: How are we going to move forward in our anticipation of these societal and technical changes rather than having yesterday's discussion? Thank you.
>> NICOLAS SEIDLER: Thank you, sir. So many questions. I'll let the panel jump in and address those. Ted, I think you wanted to jump in first then Frank.
>> TED HARDIE: This isn't a technical comment, but it was a marvelous analogy to say it was like a democratic vote. Who are we to decide they shouldn't exercise their vote?
>> PATRICK CURRY: If I may, the question comes down to the duty of a nation; the first duty of a nation is to protect its citizens. We could have a debate about that, but that is its first duty. And clearly there are lots of other rights, we can talk about freedom of expression and so forth, which I completely agree with. The problem is that treating these in isolation creates tensions which one has to look at in specific cases.
So the challenge I would have here: None of us wants to be in a situation where we see harm done to an individual that is close to us, caused by somebody else who meant us harm, who was completely malevolent.
>> So, Mr. Curry, we talked earlier about proportionality. And I think it's a very, very critical idea. When we look at what is proportionate, one of the things we're looking at is that there are hundreds of millions of citizens of my own country who are engaged in conversation about a whole variety of things, some of them business‑related, some of them personal, some of them societal. If you create a system which, as Bob pointed out in the case of third‑party encryption, reduces the overall security of those communications, makes them subject to interception, makes them subject to potential misinterpretation by others, because you believe there is some chance that at some point there may be some threat to their body or their family, you are giving up an enormous amount, a completely disproportionate amount compared to the threat you've articulated. You can invent, and people commonly do, 9/11 scenarios or child pornography scenarios or a whole variety of other things, but if you look at what you're losing compared to the instances of those, you really have to strain yourself to believe it's a proportional response.
>> PATRICK CURRY: Forgive me, I want to make it clear: there was a point there. I'm not advocating mass surveillance. That's not what I'm getting at. I'm just trying to point out that the challenge is changing. The question is how we address that; that's all I'm saying.
>> NICOLAS SEIDLER: Thanks a lot. I'll take just a response from Frank now, then I have to get to the remote participants' questions. Many people want to take the floor.
>> FRANK PACE: I'll keep it quick so Michael can make his comment as well. Regarding the request for government to retain data: this letter that I'm holding right here is from the OPM, letting me know that my information was compromised. So that lack of trust is certainly not just among those outside the government. It affects all of us.
However, that being said, when we talk about issues related to Tor, there was a study done by the Oxford Internet Institute in 2013 that showed that actually democratic countries are the strongest users of Tor, with Italy leading Europe at about 76,000 users per day and, in the U.S., about 126,000. So Tor does work.
And then, if the question that I missed was related to what we could access from those that use Tor: I can't speak to the specifics of that, but I can say the obvious would be any artifacts that could exist on a device on which the platform was used.
But then again, we go back to what I had mentioned about not necessarily trusting government to retain something that could provide the access it needs, but instead talking about cooperative efforts with civil society organizations and other NGOs.
>> NICOLAS SEIDLER: Thanks. I'll take now about three, four questions and then we have to wrap and I'll give a few final comments for each panelist. Raquel, first with the remote questions and then Nels and the gentleman here.
>> RAQUEL: Thank you, Nico. We have two questions from the remote participants. The first one is from Chip Sharp from Cisco: Law enforcement can get the communications just like it did before. The difference is that the communications are now encrypted. Current technology requires all traffic to be decrypted in order to be used. For example, mail servers must decrypt mail headers in order to route email. Users must decrypt the mail body to read it.
Please explain how encryption prevents law enforcement from getting location information. Isn't determination of the location of mobile phones independent of the user data transport?
He also mentions of course there are difficulties other than encryption. For example, jurisdiction.
And he asks for an answer directly on the question put up in the agenda, how to achieve legitimate law enforcement needs for criminal investigation in a world where encrypted Internet traffic is the norm?
The second question we have is directly for Frank Pace. It's from Oscar Padilla from Mexico: "I think we must work technically on two ways to communicate over the Internet, one totally monitorable through court orders and other ‑‑ do you know any document about good practices in combating cybercrime?"
That's all. Thank you.
>> NICOLAS SEIDLER: Thanks a lot, Raquel. I'll take two or three more, and I'm really sorry for the other ones. Nels, the gentleman here and gyrus.
>> NELS: Thank you. I would like to push back on some images that were sketched, that we could prevent terrorist attacks by defeating strong encryption, because everything indicates right now there is actually more ‑‑ too much information. In the case of 9/11, the Boston bombings, the attacks in India, the information was there but the heap was too big. So there is not really a reason to make the heap of information bigger. The law enforcement agencies perhaps should just get better at analyzing the information.
And then there was the remark that when we get the Internet of Things, it will be much easier to monitor anyone anyhow. That's replacing one problem with another, and that infringes on the freedom of privacy and association and ‑‑ we should not be doing that at all. The trust should be in the Internet. If we want to keep one strong Internet as a network of networks, with intelligence at the ends, where the users can communicate with each other without someone else correcting or censoring their information or injecting other headers or other information into their communication, we really need to work on other measures.
And I think the law enforcement agencies are selling themselves short. They can do a lot of work. Before the digital age they also did their work really well, and a lot of crimes that are committed online also have an offline component where there still can be a lot of work done. So I don't think that we need an online capacity for everything if we want to keep the Internet as a strong, open and free network that is a global communication factor.
>> NICOLAS SEIDLER: Thank you very much. Sir?
>> NAVEEN TANDON: Thank you. My name is Naveen Tandon. I'm from the Association of Competitive Telecom Operators in India. I just wanted to bring a perspective into the discussion and see if there are any comments. When we talk about encryption, I think the common thread which binds all the stakeholders involved in the discussion is legitimacy. If we talk about government, they have a legitimate right to have lawful access to the data that moves on the network.
If you talk about the nongovernment participants, like the operators or the consumers, for example, they again have a legitimate right to protect the network through encryption.
So I think it will not be fair either to talk about having higher limits or to try to find a silver bullet for any of the questions relating to encryption, because of the way technology is improving: today if you talk about a thousand bits, in the next 24 hours we'll be talking about 22,000 bits.
So I think there is a way to approach this matter so as to address the requirements of law enforcement agencies as well as protect the legitimate right to privacy for the consumers, as well as the operators' ability to protect the network: to try and arrive at a PPP model, which is the public‑private partnership model, which works on collaboration between the government and all the stakeholders. Thank you so much.
>> NICOLAS SEIDLER: Thank you. And I'm really sorry. I think we need to wrap now. It's 1:00 p.m. And I'd like to give a chance for all panelists to respond.
I think that, well, many issues were raised in the questions from the audience. And I think that I will have one final question, as well, to each of you.
Do you think that we'll be able to create a trust relationship between law enforcement and other stakeholders? Because it seems that obviously law enforcement does the job of protecting citizens, and the goal is to ensure security, apart from some countries in the Freedom on the Net report, I guess. But do you think we'll be able to reconcile those two sides, both protecting the privacy and confidentiality of users and allowing law enforcement to do their job? I'd like to give each panelist one final word on that. Xianhong Hu?
>> XIANHONG HU: Thank you. The discussion today reminds me of that classic debate dating back 10 years, to when I first came to the IGF: whether cybersecurity and free expression are in conflict, and how they can be reconciled. It's still the same rationale.
From UNESCO's point of view, we see security in a broader sense, because we also look at the security of individuals on the Internet. UNESCO protects the safety of journalists. Our Director‑General condemns every case in which journalists are killed. They are under attack every day. Imagine if we had ubiquitous encryption for everyone, including journalists. We could have saved their lives. They are good people who have lost their lives because they had no encryption.
So I think in this case the encryption really, eventually, will contribute to the overall security for everybody. It doesn't conflict. So we should be wise enough to find a way to account for this security for everybody.
>> NICOLAS SEIDLER: Thank you. Carly.
>> CARLY NYST: Just very quickly say that I think one of the keys to advancing this debate is seeing encryption as pro security rather than anti‑security.
>> NICOLAS SEIDLER: Short and effective. Thank you. Mike?
>> MIKE NELSON: I'm glad you asked the last question. It's the fundamental question: How do we build an infrastructure we can trust and build law enforcement mechanisms that work in this new digital age?
One of the most exciting things about this scenario that you describe, for me, is that in the future we could see much more of a digital community watch. Today we expect 1 percent of our population to protect the other 99 percent. David Brin likes to refer to them as the protector caste, and his idea in his book "The Transparent Society" is that the 100 percent protect the 100 percent. We saw that with 9/11, actually. The one act that was thwarted was because a bunch of people, very brave citizens with cell phones, found out that they were headed on a suicide mission to Washington and brought the plane down. In Mexico we have people who could be doing more to fight the drug gangs, but they're not sure that their telephone calls and their emails aren't being monitored by the drug gangs. Give them strong encryption. Let them be part of a community policing effort. Maybe not talking to the police directly. Maybe talking to third parties. Maybe talking to journalists. But that would actually build more trust and bring law enforcement into a community effort rather than seeing this as us versus them.
So, just to build on that a little bit more, there are things going on today to make sure that third‑party providers, like banks, are building their systems in ways that will support law enforcement. It is interesting that the recent Trans‑Pacific Partnership trade agreement has a couple of provisions that say that banks have to be able to assist law enforcement, no matter how much encryption they deploy, in tracking where money goes that might be used for illegal activities.
So we don't necessarily expect to have end‑to‑end encryption in all positions. Certainly if I run a company, I don't want my employees to be able to lock up all of my corporate data behind unbreakable encryption. I want them using systems that have escrowed encryption that the key people in the company can decrypt.
>> NICOLAS SEIDLER: So, thanks, Mike. You think a middle way is possible. Sanja?
>> SANJA KELLY: I will make a similar point to the one I made before. Sometimes when we speak on these panels, it seems like the conversation revolves only around the debates that occur in Western Europe, the United States and the rest of the free world.
And as I mentioned before, actually most Internet users live in countries that do not respect human rights, where law enforcement is actually the main perpetrator of human rights abuses. And even when we look at the most populous country, the country with the most Internet users, where they live, that is a country that does not respect some of these basic rights.
So I think, then, the conversation becomes very different depending on the setting in which we are in.
I just want to reiterate that Freedom House supports the conclusions of the UN Special Rapporteur on Freedom of Expression that encryption is necessary to protect freedom of expression and the right to privacy. And I think the bottom line is that in addition to the interest of the state in protecting its citizens, one of the main obligations of the state is also to respect human rights. And I think that's essentially the bottom line. And encryption ensures those rights.
>> NICOLAS SEIDLER: Thanks, Sanja. Context is everything. Ted and then Frank.
>> TED HARDIE: So I feel like, after all of us have arrayed ourselves against Frank on one of his missions, we're all trying to array ourselves with him on a different mission. And I think I would agree with Carly and Xianhong when we say: look, what we're doing is deploying something that helps make sure some kinds of crime never happen, that some kinds of threats are not possible, and that also protects confidentiality, with a lot of other consequences. And we hope we can find common ground there.
When the Internet Architecture Board made its statement encouraging the technical community to move encryption from being commonly available to being the default, we did not do that because we were arraying ourselves against law enforcement; we did it because we trust the user community to use the Internet for the purposes for which it was built, and its continued evolution requires that, and we still do.
>> NICOLAS SEIDLER: Thanks, Ted. And, Frank, as a takeaway from this discussion, do you think we'll be able to reach a sort of trust framework?
>> FRANK PACE: I do. And that's the main reason why I'm here today. And I think this is only one example. Myself and several other colleagues participate in other endeavors with academia and other civil society organizations to establish these types of foundations that we can build upon. And that's truly one of my objectives, not only in my professional career but also in the future, so that we do find these solutions, and it does involve us sitting at the table and bringing up these issues.
Regarding some of the questions that were asked, I believe by the gentleman Chip Sharp from Cisco, I would clarify that, yes, we can get access to location data from providers; those are call data records that we have. That's true.
However, the devices themselves usually do retain geolocation data that also supports our precise placement of an individual. And when we present that to a jury, we can then have a complete picture. And so we often like to have both of those. But he is correct.
And ultimately, whether information, be it email or otherwise, is encrypted in transit and ultimately decrypted when it reaches its destination, if that destination happens to use full encryption on the drive, we're then faced with the same obstacle.
And then there was another question regarding the use of surveillance, about trying to prevent mass tragedies and law enforcement's need to get back to its roots, and I would concur. I think it has been brought up in the U.S. often that we have lost our ability in some aspects to actually employ human intelligence. And that's certainly an area where we do act ‑‑ where we give ourselves the ability to truly focus on a particular point that we should be looking at, as opposed to trying to analyze many, many volumes of data in the hope that we find that needle in the haystack.
>> NICOLAS SEIDLER: Thank you very much. And I think it was good to have everybody around the table. Clearly the discussion is not over. Actually, CIGI will organize a workshop at 4:00 p.m. today on the politics of encryption. So you may have an opportunity to continue the discussion over there.
So thank you very much to all panelists and to you all in the audience.
[Applause.]
[End of session.]