IGF 2016 - Day 2 - Room 4 - WS 96: Free Expression & Extremism: An Internet Governance Challenge


The following are the outputs of the real-time captioning taken during the Eleventh Annual Meeting of the Internet Governance Forum (IGF) in Jalisco, Mexico, from 5 to 9 December 2016. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 


>> LARRY MAGID:  Good morning, everyone.  I want to give you a heads-up.  I think a lot of people got stuck in the security line.  We will begin soon.  Don't get comfortable.  We will break the IGF by breaking into three breakout groups.  It will be explained.  There will be one group in this corner, one group in that corner and another group by the coffee station, where you just walked in.  Give us a few minutes and then we'll get going.


>> LARRY MAGID:  About two more minutes and we'll start.  We want to let a few more people come in the room.


>> LARRY MAGID:  I would like to welcome you to the session on extremism and free expression.  Thank you for joining us.  We will do something a little different today.  We want to have as much conversation as possible.  We are going to break into three groups at some point.  Not right now.  Don't get terribly comfortable.  We will talk amongst ourselves on the panel and ask people to go into corners of the room.  If necessary, one group can go downstairs so we can have conversation among ourselves and people have a chance to talk.  At 10:00 o'clock we'll come back into this room and convene as a group and report on what the individual sessions did and wrap it up.

Hopefully this will be a good combination of talking heads and more importantly an opportunity for folks in the room to make their contributions.  My name is Larry Magid, the CEO of connectsafely.org, a Silicon Valley organisation.  I'm the technology analyst for NBC News and the BBC World Service.  Those of you familiar with my voice, that's why.  I'm actually one of the tech analysts for the BBC World Service.

I'm not the expert on extremism.  My two colleagues are.  Carl Miller is Research Director of the Centre for the Analysis of Social Media.  And Jonathan Russell is Head of Policy at Quilliam.

And my thought on this, as somebody who follows social media closely and writes about the phenomenon, is to emphasize how important a topic this is.  Extremism not only affects the person who is directly affected, i.e. the man or woman, typically a young man or woman, who has been radicalised online or off, but his or her family very directly.  That is true regardless of whether anything dangerous actually happens in the physical world as a result of that radicalisation; but if, in the horrible event, it does happen, then the impact is much, much wider, of course.  The victims and their loved ones are directly affected.  So are the communities where it takes place: anybody here who has been to Paris or Brussels or Orlando, Florida; or San Bernardino or any other community where an act of terrible terrorism has taken place knows the impact far and wide within that community.

But also in the larger global community.  I don't think it's beyond speculation that the reaction to extremism has affected elections not just in my country but around the world.  It really has impacted, as we will talk about today, our willingness as a free society to allow full expression.  One of the tragedies of 9/11, one of the many tragedies was not just the horrible situation of what happened in New York that day, but the impact it had and still has on free expression and human rights in the United States and around the world.

And we are seeing that repeated and amplified over and over again.  But I also want to make sure that we focus on the victims themselves.  By that I include the people who themselves become radicalised, who cross the line from legitimate and appropriate free speech, political activism, the right of assembly, the ability to grow and express themselves and protest what they consider to be unjust, but cross the line to a point where what should have been political activism and free expression becomes a crime and inhumane acts towards others.  That's the line that I plan to explore in my breakout session.  Where is the line between legitimate free expression that I believe we personally want to encourage even if we don't agree with that expression and that moment when it turns into something horrific.  That will be my topic.  I'll allow my colleagues to talk about their background in their talks as well.  Start with Carl.

>> CARL MILLER:  Thank you.  Thank you very much.  Very good morning, everybody.  Thank you so much for having me here.  This is such an important topic for us all.  As a researcher, the research world has been scrambling over the past years to understand what on earth is driving extremism online.  Is it the technical infrastructure, or things to do with the human being?  Is it the services we use?  Are there wider questions of international politics?  What causes it, and how do we undermine those drivers?  I want to lay out four different themes which we know are driving different kinds of online extremism, whether it is far right, radical Islamist, misogynistic or otherwise.

Theme number one is echo chambers.  When we jump onto the Internet, we jump into digital tribes; people have been washed away in a sea of user generated content.  Our friend networks have become the mediators of information for us, increasingly defining the online experiences we have.  And we are realising, of course, that those networks are actually quite similar to ourselves, similar to who we are.  In these digital tribes, these echo chambers that we have online, we often see the same world view reflected back to us in a thousand different ways.

So you get 20 Democrats in a room, or you get 20 Republicans in a room, and they all talk their own politics.  Chances are that they all leave that room more Democrat or more Republican than any of them were having first gone in.  Hearing that same view echoed time and time again, people saying you're right, you're right, causes a shift.  We become more and more convinced that our world view is right and, secondly, it convinces us that the people who disagree with us are ignorant or evil.  As digital tribes become hardened online, we see people increasingly believing that the others on the other side of the table are not legitimately disagreeing with them and do not mean well, but must not have the facts at their fingertips or must have malign or malicious purposes.  Whether they are conspiracy theory or radical echo chambers, these worlds are shut off from any kind of dissenting information; echoing that single world view, they drift further from the mainstream.  Driver number one of extremism is the echo chamber: the repetition of a single world view time and time again and the elimination of doubt or dissenting information.

Second, the disinhibition effect.  If you put a computer between two human beings, they tend to treat one another less civilly.  There are lots of reasons.  Anonymity is part of it.  So is the absence of human cues, the absence of the sense that the people you are speaking to are human beings who hope, dream and believe things just like we do.  The online disinhibition effect, the role of computers in human discourse, produces all kinds of cognitive biases which cause people to be less inhibited.  They get more abusive more quickly.  You can even count, in any online discussion, the number of exchanges before someone calls someone else a Nazi.

This effect is, I think, an important piece of the puzzle when it comes to trying to understand extremism.

Thirdly, it has never been easier to find someone that you massively disagree with online.  As easy as it is to find people who share your beliefs, agree with you and want to change the world and change society in the ways you do, it is just as easy to find people completely different from you, completely opposite to you.  We analyze social media to understand this, and this reciprocal radicalisation dynamic is getting more powerful.  People from radical Islamist groups and from the far left are winding each other up online.  They are cherry picking conversations and throwing them into their own echo chambers to show how evil, wrong and flawed the opposite points of view are.

Lastly, not so much a driver as the absence of something: the idea, if you like, of digital citizenship, of norms online.  A generation has come through that uses the Internet as the primary way to learn about the world, to do political activism and to find people who want to change society in the same way they do, people who see the same problems in society as they do.  Well, they haven't been taught, I think, how to treat one another as civilly online as you do offline.  It is that normative layer we are all taught by our parents and in schools: what you need to do to be a responsible, reasonable member of a society.

To avoid offense when you can, to try to be productive when you can.  That idea of digital citizenship hasn't translated into the online space yet.  So in a sense it is a normative wilderness, with a wide range of understandings about how we can all act, what norms we should reflect and how we should treat one another.  That is an absence which I think is also driving all kinds of behaviors which are really socially problematic.

Those are the four:  Echo chambers, online disinhibition, reciprocal radicalisation and a lack of digital citizenship.  I would love each of the breakout groups to deal with those four themes and to see how they can be undermined in ways we find acceptable and consistent with civil liberties and the other social standards and policies which we cherish.

I'll end with two provocations.  The first is blockchain.  There have been other sessions on this and I don't have time to really talk about it, but we are about to go through another potentially very profound rewriting of the way the Internet works.  What do we do about extremism when we, in some important senses, lose the companies that sit at the heart of the Internet services we use every day?  What do we all do?  What do governments do when they can't pick up the phone to Google or Facebook, when we have social media platforms where there is nobody in charge?

With those four themes and this evolution of the Internet staring us in the face, I will be doing a breakout session on censorship.  Not a popular topic at the IGF, I know.  I want to talk about whether blocking of content, the more coercive end of counter extremism, has any kind of place in the future of fighting extremism online.  Is censorship something we should get rid of?  If not, how do we make it something which is broadly publicly supportable, something which all the different corners of the Internet Governance community and Internet users can buy into and have confidence in?

>> JONATHAN RUSSELL:  Great.  Good morning.  I'm Jonathan Russell, Head of Policy for Quilliam.  We are a counter-extremism think tank and spend a lot of time researching terrorism and extremism and how to counter those phenomena.  Unsurprisingly, considering the online approaches of extremists and how we can best counter them online has come across our desks frequently over the last decade.  What I would like to talk about today is how many of the trends that Carl has just put forward are inherent to human behavior and inherent to extremist operations both online and offline, and how I think there is very little difference between how we should counter extremism in those two very much connected domains.

And therefore, we should not be shocked or surprised or frightened by this seemingly new problem, and we should consider how many of the discussions we have been having about balancing human rights and national security priorities, civil liberties and national security, are as applicable online as offline.

And how the definitional difficulties when it comes to these phenomena are equally applicable to both online and offline discussions.

We broadly stay out of the space that should be dominated by states.  Of course, Quilliam has a role in advising governments as to what they should and shouldn't do in this space, predominantly talking to a generation of policymakers who have never used Snapchat or Twitter themselves and have teams to do that for them, and trying to get over some of the generational struggles that we might have in this policy area.

One of the things that we find in offline counter extremism which I think we need to port over to the online world is this full spectrum of approaches.  So this full spectrum includes the soft end, primary prevention.  Some of the digital civility stuff that Carl was talking about just earlier: Education, critical thinking skills, building resilience among vulnerable populations and trying to change the atmosphere in which extremists might operate in the first place.  Talking about identity, talking about immigration, talking about integration.  And considering whether there's a role for online activism and online work in this space to try to improve that domain.

Slightly to the right of that, we think a lot about targeted intervention: how we can spot signs that someone is going down one of these difficult pathways, how we stop the train and get it going in a different direction.  We think most prominently about counter-narratives and counter speech.  That is the area that I focus on at Quilliam.  I believe that extremist organisations are effective because they are effective communicators.  I don't think there is anything more important than narrative: the stories they tell themselves, the stories they tell vulnerable people and the stories vulnerable people tell themselves when it comes to radicalisation.  Those are three important things.  Whether we are looking at blocking and censorship or softer stuff, being in charge of a narrative and thinking about how we communicate with each other is surely central to this discussion.  And that's why I think social media seems to have had such a profound impact on radicalisation and extremism.  It is the optimum mode for communication.  Therefore, social media facilitates, if not drives, radicalisation to that end.

But I don't think we should rest at that.  Counter speech and the work we do at Quilliam is all about thinking how we can use the tools that extremists exploit to communicate more effectively, to reduce vulnerability, to penetrate the echo chambers that Carl talks about and provide alternative or counter narratives to extremists and their ideologies.

The issue we come across, though, is not that extremists use the Internet.  It is not what they communicate online.  We know these things.  Much more concerning is why these narratives resonate with Millennials and why extremists are able to change the attitudes and behaviors of Millennials, to get them to act in a way that they otherwise wouldn't.  And on the flip side, to consider how we can use the very same communications tools and tech tools to achieve a similar sort of resonance.

Someone said to me recently that wholesome content never goes viral.  It is very hard to get your counter narrative to really take hold.  It is much easier to be inflammatory.  As Carl said, you get more re-tweets by saying something at the poles of the spectrum of opinion.  You will gain more followers by preaching within your echo chamber or by cherry picking controversial statements from opposing echo chambers and painting them as indicative of the other.

So counter speech is clearly a big challenge for counter extremism offline and the biggest challenge for us online in this space.  Then of course we have what I call negative measures: blocking, censorship and take-downs, and the role of those within counter extremism.  I agree with Carl that there's a role for that, but we get into difficult territory, particularly when it comes to the human rights and free speech aspects of these things: when we have to consider definitions of extremism, and when we undertake surveillance or impose legislative approaches, whether that's in the public sector or in the private sector.

And how much safer it is to consider the much broader spectrum of approaches: the primary prevention, targeted intervention and counter speech work that we can do.

So just to close, I also want to provoke a few things for our discussions.  I want to know really who should do what in this battle.  You know, is it entirely up to governments?  Is there a role for the private sector?  Or should civil society be leading on this?  If so, how can we all work together?

The second provocation is the assertion that extremists are good at using the Internet, better than we will ever be, because they have to fight to survive.  Can we, with our collective brains in this room and those watching, prove them wrong and show that we can use the Internet more effectively than extremists can?

And third, the challenge of the idea that we need a magic solution to countering extremism online.  I want to know how we can continue and evolve, rather than spark a revolution in counter extremism, just by taking offline measures online and making them more effective.

But I look forward to chairing a breakout session and hearing from you and reporting back.

>> LARRY MAGID:  Now is the exciting time when we all get to move.  Because this room is big enough for two sessions but not three, I am going to ask Carl to take his censorship discussion downstairs, so don't hesitate to go downstairs.  All of us spend too much time sitting in front of computers, so the walk is good for you.

Jonathan will be in that back corner of this room.  I will be right up here in this corner.

Now is the time to get up and walk around; try to do it as quickly as possible.  If for whatever reason your group is too crowded, know that we will touch on all of the subjects in each of the groups, just each focused very much on its own topic.  Carl suggested a show of hands.  How many people want to do the censorship panel?

>> CARL MILLER:  Should we go for that?  I'm on censorship.

>> JONATHAN RUSSELL:  I'm going to focus on counter speech.

>> LARRY MAGID:  I'm going to lead a discussion about the line between legitimate political activism, thought and extremism.

Show of hands, censorship?  You are not favoring censorship by raising your hand.


>> LARRY MAGID:  Jonathan's discussion about counter speech, et cetera?

And the other discussion about the line between extremism and legitimate ... sounds like we are reasonably evenly balanced.  Very good.  Upstairs, downstairs.

>> CARL MILLER:  Censorship people, follow me!

>> LARRY MAGID:  Enjoy the walk.  The discussion on counter speech will be in the back of the room and the discussion about the line between extremism and legitimate discussion is at the front of the room.

(During breakout groups, captions will be suspended.  Captions will resume at 10:00 o'clock.)

>> LARRY MAGID:  All right, if we can get settled down, we can finish up.  Hello?

Can you hear me now?

All right.  Thank you, guys, very much for engaging in those conversations.  I only got to go to one, but if the others were nearly as successful as the one I was in, in terms of contributions from folks in the room, I'm sure we had a great session.  At least we did, and I'm sure the rest of you did as well.  Where is Jonathan?  Have we lost him permanently?  Is there a group still going?  Oh, there he is!  Obviously that was so successful they couldn't even get their moderator to ... okay.

Carl, you want to go first?

Jonathan, are you ready to go?

All right.  So neither Carl nor Jonathan is ready to go; but I'm Chair, so I guess I have to.  We talked about the line between acceptable speech and what today's workshop calls extremist speech.  Typical of IGF participants, we couldn't even agree on the definition of extremism.  Nor could we agree whether that was a legitimate term to begin with, whether in fact it was the appropriate term for what should be concerning us.  One member of the group suggested perhaps hate speech would be a better term.

Just as I said earlier, I'm not sure I would agree that radical is necessarily a negative term, and not everyone in the group agreed that extremism is.  I remembered, because I'm old enough to remember, a quote from a Presidential candidate in 1964, Barry Goldwater.  I didn't agree with him and he didn't win, but he said that extremism in the defense of liberty is no vice.  At that time extremism was an acceptable concept.

Having said that, the group mostly agreed that the line is drawn when it comes to some kind of hateful or harmful activity.  An excellent example was given of what is largely referred to as Pizzagate, by a person who happens to live near the restaurant in Washington, D.C. where it was alleged, if I recall accurately, that Hillary Clinton was sponsoring a pedophile ring operating out of this pizza restaurant.  First of all, there is absolutely no evidence that this pedophile ring exists or that Hillary Clinton had anything to do with it.  It was fake news, a lie perpetuated online.

Yet some individual, because he was so emotionally and perhaps intellectually upset by the notion of a pedophile ring, took it upon himself to go to this restaurant with a gun and start shooting.  I don't believe anybody was harmed; I haven't caught up on the news.  But in fact it was a terrorist act perpetrated as a result of someone's speech.  Was the person who spread that false information a terrorist?  Or an extremist?  Someone argued that the fact that it's false is irrelevant, that that is not the relevant point.  It was one of many examples.

I think the question that our group struggled with, and with all due respect I don't think we frankly concluded an answer to it, is the degree to which a speaker who says something that does not directly incite violence should be held responsible if someone chooses to be violent.  If someone were to say:  I think that all people with red hair are evil; therefore, if you encounter someone with red hair, you should kill them, hit them, harm them, that would fall under extremist speech.  If someone just said all people with red hair are evil, that's the question mark.  Is that extremism?  We would all agree it would be hate speech.

Does anybody in the group have a very, very quick additional comment based on something I may have left out?  I don't have the most perfect memory.  Apparently I have a better memory than I think I do.


>> LARRY MAGID:  Okay.  Who here is ready?  All right, okay.

>> CARL MILLER:  All right.  Well, thanks firstly to all members of my group.  That was a brilliant discussion and a rare thing, to get people from so many different backgrounds together talking civilly on a topic where there is disagreement.  We had diverse views on censorship; you don't often get to see that in one conversation.  Everything from the idea that censorship is illegitimate and shouldn't be done in any circumstances, to the idea that some content, not just terrorism but child pornography and hateful speech, shouldn't be online, and that technology companies, morally speaking, must make sure that the gardens they maintain are right.  Sorry, someone jumping in on that?

>> (Speaker away from microphone.)

>> CARL MILLER:  May.  They certainly have a right to, even if they don't exercise it.  There were certainly many shades of gray among contributors: censorship in some sense being necessary, but also carrying with it lots of dangers.  On this point of the dangers around censorship, I think there was a specific feeling of the acuteness of the danger around the censorship of terrorism.  That was partly driven by, as ever, the lack of a global definition of terrorism and extremism.  If one goes to terrorism conferences, every conference begins with a debate about what terrorism means; that is how important it is.  It means different things to different governments and is used to delegitimise certain voices.  Any definition, whether you agree with it or not, is inescapably political.  But censorship around terrorism was felt to be particularly difficult because of its politicized nature and the way in which counter terrorism laws around the world have suffered mission creep and been misused in ways that they were never intended to be used for in the first place.

There was also, I think, a lively debate around who should do the censoring, if censoring needs to be done.  There, perhaps, we moved closer towards consensus.  I think there were few people, even among governmental stakeholders, who thought the government necessarily needed to take the lead in this.  There was quite a clear idea that the service providers themselves, where possible, should be taking the primary responsibility for controlling their services, in the context of conversation with governments and with civil society as well.

The majority of the people in our group (how many did we have there, 50, 60?) are not content with the status quo, especially, I think, civil society.  There is a clear sense that they do not like the current state of play around censorship.  I think the governments and technology providers are more content with the current status quo.  Where there was the most consensus is around what needs to change in order to bring everyone up to an "ish-ness" state of contentment: process and transparency, greater oversight over how censorship happens and who gets to do it, even within oppressive or dictatorial regimes; capacity building; finding common ground; and lastly conversation, which is what we just had.  That was brilliant.  I don't think many people changed their minds in it.  But if we know anything about the operation of echo chambers, even the introduction of dissenting views can lead to doubt, and you will be a tiny bit less sure about how right you may be.  If we did that, it was half an hour or so really well spent.  Thanks, everybody.

>> LARRY MAGID:  Thanks, Carl.  I want to point out that in America the First Amendment, the provision that governs censorship, applies only to the government; and Twitter and other social media do censor.  It is not a question of whether they censor, but how they should censor, I think.  Go ahead.

>> JONATHAN RUSSELL:  We also had a very, very interesting discussion.  The starting point for it all was taking Carl's four trends in online and social media behavior and asking, well, is blocking and censorship going to change those four behavioral trends online, or can we use counter speech to do that?

Broadly we considered the role and value of social media in counter speech.  And I am not going to do this justice, but I'll try to pull out four things where we thought the answer was yes.

So first, in accepting that the problem isn't online but is inherently human, and that social media and technology more generally can help us, A, reach key target audiences, and B, link up different people with opposing views, or even similar views, who may be hundreds or thousands of miles apart.

We looked at how ISIS is not simply a terrorist organisation but a social movement that brings together people from different countries who happen to agree on the same thing.  We in this group, as a multi-stakeholder environment, do something not dissimilar, though perhaps not on the same scale and not to the same ends.  So we considered in turn the role of social media in finding collaborative solutions and working out who has what role to play in this sort of stuff.

We therefore talked about the value of that approach in getting out of our own echo chambers and breaking them open.  The key provocation there is: how do ISIS manage to get out of their echo chamber, or manage to penetrate other echo chambers and recruit vulnerable audiences too?  And if they do, how do we learn from them?

The discussion went on to how we can learn from other domains when we think about counter terrorism.  Can we learn from political campaigns?  Can we learn from militaries and influence operations in that space?  Can we learn from the private sector and commercial advertising work when we are thinking about changing behaviors and changing attitudes?  The answer, broadly, we thought was yes.

But the key was not just reaching people.  It was actually achieving resonance with those people as well.

We had a very interesting discussion around the continuum between mainstream media and social media and their differing roles and responsibilities in counter extremism in that space, considering how we are all journalists now and have an opportunity to shape narratives in a similar way to the mainstream media.

We also considered the role of the private sector in this: what they are and what they are not.  We broadly thought it was unhelpful to see the private sector as the police or as gatekeepers of this stuff, and unhelpful to ask them to set their own definitions beyond the community guidelines that they set for their own platforms.  It is not up to Twitter and Facebook to decide, we discussed, who is an extremist and who is not.

There was a bit of negativity around the private sector too: considering whether the private sector can be expected to act for social good in this space or whether they will be forever driven by profitability, only putting forward counter extremism in their markets; and whether we set ourselves up for a fall by expecting the private sector to do this, lest governments that may have a different definition of extremism and a more militant or repressive approach to tackling it demand that private companies follow their lead and act on their behalf.  So we came back to something Carl was talking about earlier, about definitions and the politicized and securitized nature of counter extremism more generally.

We discussed a whole host of other things, but broadly we returned to the need to see this as an offline problem, and to think about the role of technology and social media in changing those offline behaviors and tackling the root causes rather than simply tackling the current manifestation of extremism, which happens to be online.

And so we tried to cover a whole host of things like that.  If I didn't cover anyone's particular comment, please do raise your hand or shout out now.

>> LARRY MAGID:  I want to open this up for further comment, question and discussion.  You can address or question any of the panelists, or make a general comment about what you did in your group or whatever else is on your mind.  Hopefully there is no censorship in this room.  So, open for any comments or questions?


Yes, sir.

>> AUDIENCE:  Thomas from the German Foreign Office.  I want to make a short comment on the first two presentations and the first two working groups.  I think a lot of the problems about not having a definition of terrorism, and about whether censorship is the right term in the first place, go away if you don't think of terrorism as such but look at what is criminalised in your legal order as supporting or committing a terrorist crime.  That is how we do it in Germany.

So if somebody commits a crime by calling for terrorist actions or supporting a terrorist organisation, then it is in the German Penal Code.  The police step in and it is rightly treated as a crime.  It is not censorship but prosecution of crime.  For that we don't need a general definition of extremism or terrorism; we have what is in German law as interpreted by German courts.  That narrows, of course, what we are aiming at, but it makes it more precise and you get full judicial review of it, possibly, rather than having a general censorship or counter speech campaign.

So that legal precision would be very helpful in this debate.

>> LARRY MAGID:  Yes, sir.  If you care to give your name and where you're from?

>> AUDIENCE:  (Speaker away from microphone.) We need to go to the root of the problem.  Terrorism, extremism.  I heard that downstairs.  You know, the lack of social justice is a bigger problem.  We have to address that, especially in some countries in our region, and we also have this religious teaching in Arabia.  That is a bigger problem and we need to address that.  It is not just about technical issues or about taking content down online; it is about going back to tackle these two really important factors in counter terrorism efforts.  Thank you.

>> LARRY MAGID:  Okay.  Thank you very much.

Any other comments or questions?  I can't see, but somebody in the back?

>> AUDIENCE:  Okay.  My name is (Weiser Yoo) from Nigeria.  Now, I get a little bit worried when we automatically link national security with extremism.  There is a way in which we must not fall into the trap where our governments seize on that narrative and use it against human rights.  I think it would be more useful to unpack what we mean by national security.  Are we talking about survival?  Are we talking about government stability?  Are we talking about citizens' security?  I see the impact more in inciting violence against ordinary citizens.  So for me, it would be more useful to focus on human security or citizen security when we talk about hate speech and extremism, rather than on national security.  Thank you.

>> LARRY MAGID:  Excellent point.  Thank you.  Yes?

>> AUDIENCE:  My name is David Sullivan from the Global Network Initiative, which brings together tech companies, civil society organisations, investors and academics to work on freedom of expression and privacy.  This issue of extremist content online is one that we have been working on for nearly a year and a half.  I wanted to say we released a report last week with some recommendations for governments, for companies, and for the specific thorny issue of when governments refer content to companies as alleged violations of their terms of service, with some human rights-based recommendations for how both governments and companies can address this challenge together in a way that respects rights.  I have a few hard copies here.  It's also on our Web site, globalnetworkinitiative.org.  Thanks.

>> LARRY MAGID:  We have time for a couple more questions.  Back there?

>> The mic?

>> JIM PRENDERGAST:  My name is Jim Prendergast.  I want to take a quick show of hands.  The IGF has been looking for new and innovative formats, hence walking downstairs into the breakouts.  If you liked it, raise your hand.  If you didn't like it, raise your hand.  We won't hold it against you.

That is what we are aiming for and what we tried.  Hopefully you found it valuable.

>> LARRY MAGID:  (Speaker away from microphone.) I certainly enjoyed having a conversation that would have been more difficult in this room.  Since there are not a lot of burning questions, I will turn it back to my panelists to make closing remarks.  Go ahead, Carl?

>> CARL MILLER:  Okay.  During our discussion there was a quote which kept echoing around in my head, from the Declaration of the Independence of Cyberspace by the Electronic Frontier Foundation.  In that document they said:  Beware, you weary giants of flesh and steel.  You have no sovereignty where we gather, and you are not welcome here.

From the beginning there was an idea that the Internet wasn't the property of nation states.  It was an opportunity to evolve beyond the nation state as the fundamental unit of all human organisation.  And in that sense, I think the censorship debate is just another skirmish in what has been a decades-long debate, or perhaps a war, around who really controls the net.  Is it nation states?  Is it the people on the Internet?  Is it the technology companies that provide the most popular services on the Internet?  Or is it something else?

I will conclude with a worry of mine around how this future war will be fought.  My worry is that discussions like this, as useful as they are, and discussions in courts, as important as they are, will become less and less important.  Because if you go to the people on both sides of the basic question of where power sits on the Internet, and who should really run it, you see, whether they are in anarchist technologist communes outside of Barcelona or in innovation hubs within governments, that the real weapons driving this fight forward are not the weapons of debate.  They are not even really the weapons of legal sanction, per se.  It is tech.  On the one side, great surveillance technologies.  On the other side, blockchain, encryption and all the other weapons fighting back.

So my worry for the future is that the basic way in which human beings have decided really controversial questions like counter extremism and censorship has always been politics, always been debate: making an argument while someone else makes another argument.  Over this long, messy, complex process somehow we get to some kind of consensus which barely anybody likes but most people can see some kind of stake in.

But that is not what is going to happen next, I think.  What is going to happen next, and what will eventually dictate this, is which of these two technological development trajectories wins out.  Will encryption become a standard for everybody that will never be broken, or will surveillance technologies and various ways of exerting control over the Internet become more proliferated and belligerent, breaking blockchain, encryption, decentralized organisations and everything else?

That is not a sunny note to end on, which is unfair, because I think the session has been brilliant.  That is the final thought I will take away: how do we make sure that conversations like this matter in the future?  How can we make sure it is not simply the powerful few who build the technologies that define the Internet that have a stake in what the Internet of the future will look like?

>> LARRY MAGID:  Jonathan?

>> JONATHAN RUSSELL:  I want to draw a parallel between media and social media.  There are two old adages in every newsroom around the country and the world.  Number one, that sex sells.  That is clearly as applicable online as it is in the mainstream media.  The second is, if it bleeds, it leads.  And that means, I think, that when extremists communicate online, it is going to reach a bigger audience than it ever did before.  I am reminded of a different quote, actually.  Brian Jenkins in 1988, a terrorism researcher and a communication specialist too.  He said the aim of terrorists is not a lot of people dead; it's a lot of people watching.  I think social media enables an awful lot more people to be watching than ever before.

And I don't think we are going to be able to get in the way of those two inherent human trends.  If it bleeds, it leads and that we will have a lot of people watching rather than a lot of people dead.

Therefore, when we come to thinking about communications solutions to extremism, we've really got to start this offline and we've got to break apart the echo chambers and make sure we don't create our own echo chambers.  Let's not fall into one of the traps that Carl identified, of seeing all of the other side as being stupid or evil.  Extremists may well be both.  But I don't think setting that up is conducive to tackling them effectively.

And just to finish, because it is often a sore spot in some of these discussions: it may well be that most of what captures our imagination is ISIS, because it bleeds and therefore leads in our imaginations too, but it is certainly not the only extremism out there.  There are many other worrying trends, of right-wing populism and of anarchist terrorism, coming through as well.  I would urge us to see all of the discussions and solutions that we've come up with today as absolutely transferable across the spectrum of different extremisms.  There we go.

>> LARRY MAGID:  Excellent point.  I want to add to Jonathan's comment about social media.  It also provides the people who are doing the terrorism or their organisations a direct media outlet that they get to define and control, so that they can have their own narrative.  That has never been possible on a global scale in the past.

I want to thank everyone for participating.  You made this an excellent session.  I want to point out special thanks to Jim Prendergast for organizing and coordinating the event.  I really appreciate everyone's participation.  Thank you again.


(The session concluded at 10:30 a.m. CST.)