The following are the outputs of the real-time captioning taken during the virtual Fifteenth Annual Meeting of the Internet Governance Forum (IGF), from 2 to 17 November 2020. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record.
***
>> MOIRA PATTERSON: So I see the audience would like to hear what we have to say. So let's get started with that.
Hopefully, we'll have the opportunity to provide the full session because our third speaker has a lot of expertise that we really would like to highlight on the topic.
So we'll just give a quick glimpse here and get started.
So hello, everyone. Thank you for joining us on a Friday. I am Moira Patterson, and I'm the Global Market Affairs and Community Engagement Director at the IEEE Standards Association.
We'll shorten what we were planning to do. With that, I will still invite everyone to just type into the chat window what kind of stakeholder group you represent and also where in the world you're joining us from. We would like to get to meet you. This is also the feature we'll use for any questions.
You can see the chat, and I've seen that a lot of people have already put things in. If we could just see where folks are from and what type of organizations you represent so we can get a sense of who is here, that would be great.
So we have someone from Brazil with ICANN and small‑medium enterprises. Dynamic Coalition on Data‑Driven Technologies. I see people still typing. More from Brazil.
>> JOHN HAVENS: Australia, Russia. I know two words in every language. So I'm trying to use them all.
>> MOIRA PATTERSON: Excellent. Great. All right. Well, really a global audience. So that is very exciting.
So welcome again, everyone. We're happy to be here with you, even under the slightly more unusual circumstances.
So with that, maybe I will just ‑‑ well, I'm not going to do formal introductions as much as maybe I will just introduce myself and then ask John to quickly introduce himself.
I'm Moira, like I said before, with the IEEE Standards Association. I support a lot of initiatives in the areas of digital inclusion, identity, trust and agency, as well as digital resilience. We're looking at initiatives around creating trustworthy experiences for children in the offline and online space. Those are the themes we were wanting to highlight today.
John, please give a quick intro.
>> JOHN HAVENS: Sure. I'm Executive Director of the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems. I'm also the author of two books, Heartificial Intelligence and Hacking Happiness. I have a passion for personal data issues and well‑being. I'm glad to be here. Thanks for the invite.
>> MOIRA PATTERSON: In our summary, we talked about personal sovereignty and the importance of thinking about personal sovereignty, and about addressing what we believe is a misconception that personal agency and privacy are necessarily in tension with the ability to address some of the issues around COVID‑19 or other major issues. In fact, we need to be looking at how we can balance these different aspects.
So with that, maybe I will start with the original topic that we wanted to talk about, which is that as we've seen during COVID‑19, we have really seen digital technologies be a lifeline for many of us. This applies around the world and to many different situations.
And so in that environment, though, what is the key message that you have? I will start with you, John, about personal sovereignty, data governance, and identity.
>> JOHN HAVENS: Sure. A couple of things. I just want to honor ‑‑ forgive me if I mispronounce this. Againi Atonkic (phonetic).
You asked: I guess we need to shape ‑‑
Moira, I just want to let everyone know that this is basically the question that was just asked.
>> MOIRA PATTERSON: Yep.
>> JOHN HAVENS: I guess we need to shape the definition of "personal sovereignty" while we're also using the term "digital sovereignty" beyond traditional understanding when we connect it, first of all, with owning and maintaining the objects of critical infrastructure, servers and other technical equipment? Could you give your definition or meaning of personal sovereignty, even if it looks very obvious?
Again, just perfect alignment with what Moira just asked.
I also want to give a shout‑out because it's late on a Friday. Kaliya Young, you're here, right? I feel like playing air guitar while Eric Clapton is in the room.
Kaliya is brilliant and really has been driving so much of the personal data and self‑sovereign identity space for years. I'm not worthy, if I can say that in Wayne's World parlance.
I will tell you, for me, a very broad way of defining how I talk about the vision of data or personal sovereignty ‑‑ recognizing that on this call, including Moira, there are multiple experts like Kaliya who have much deeper technical expertise. I really want to stress: please, please follow Kaliya and her work. She's an amazing thought leader in the space. From me, from a larger vision, I'm going to go with the vision, while recognizing the technical challenges of what I'm going to say. Here, I'm going to ask for grace, if that's the right way to say it, and say: here's the vision, knowing that it's not a silver bullet and not something you can create in an instant.
For about the past 10 years, one thing I've been keenly aware of is things like augmented and virtual reality. Ten years ago, for some reason, I just got very into this idea of having lenses in front of our eyes, like my glasses, that could have data overlaid in them so I could, in one sense, be in two different worlds.
What I know from my work with data is that the algorithms that track us and track our data are, in a very real sense, very similar. These are lenses that are in front of us, but most people don't see them. We don't physically see the algorithms that track our data, but that doesn't mean that they're not there.
Now, in terms of things like privacy and GDPR: that's regulation that's deeply powerful and really important. Maybe it has flaws, but it's there to help EU citizens, at least, have their data and identity protected while businesses can still use the data. Right? Fantastic. Great.
Data sovereignty, the term, how I use it ‑‑ and, again, forgive me if it's not the perfect definition ‑‑ is that as end users we are tracked externally. We're tracked by governments and businesses, hopefully, if things are working the way we want, with protection in terms of governments ‑‑ and, of course, we should talk about surveillance and all that ‑‑ but protection from governments, and, with businesses, trusted and responsible ways to find out information about us so we can buy stuff and have better experiences with their brands. However, that's A to B.
My vision is there has to be B to A or parity. I don't mean everyone in the world has data ownership, per se, because that's incredibly challenging, but it does mean that all people ‑‑ this is my belief, John's belief. I'm not speaking for all of IEEE.
We should have access to our data and have what I call personal terms and conditions at the algorithmic level for any individual. And we have a standard, IEEE P7012, that was inspired by the amazing Doc Searls out of Harvard. If humans could have their own terms and conditions that could be available at an algorithmic level, then in one sense it brings a parity, where the algorithms surrounding me can connect and identify with the algorithms that are tracking me.
It is not perfect, and it's a longer conversation to talk about decentralized identity and all that stuff ‑‑ critically important stuff ‑‑ but the overarching vision is just that every person understands: my data and identity are so precious that I should be equipped with these tools to speak with parity back to the immersive web.
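To make the idea of personal terms and conditions at the algorithmic level a bit more concrete, here is a minimal sketch in Python. The schema, field names, and matching rule are illustrative assumptions only; they are not the format defined by IEEE P7012 or any other standard.

    # Illustrative sketch only: a hypothetical machine-readable "personal terms"
    # record that a data-collecting service could be asked to honor.
    from dataclasses import dataclass, field

    @dataclass
    class PersonalTerms:
        """One person's terms and conditions, readable by an algorithm."""
        subject_id: str                                      # pseudonymous identifier for the person
        allowed_purposes: set = field(default_factory=set)   # e.g. {"health", "education"}
        allow_third_party_sharing: bool = False
        retention_days: int = 30

    @dataclass
    class DataRequest:
        """What a service wants to do with the person's data."""
        requester: str
        purpose: str
        shares_with_third_parties: bool
        retention_days: int

    def request_is_acceptable(terms: PersonalTerms, request: DataRequest) -> bool:
        """Return True only if the request stays within the person's terms."""
        return (
            request.purpose in terms.allowed_purposes
            and (not request.shares_with_third_parties or terms.allow_third_party_sharing)
            and request.retention_days <= terms.retention_days
        )

    # Example: share health data with a school, but refuse an ad network.
    my_terms = PersonalTerms("user-123", allowed_purposes={"health", "education"})
    print(request_is_acceptable(my_terms, DataRequest("school", "health", False, 7)))          # True
    print(request_is_acceptable(my_terms, DataRequest("ad-network", "marketing", True, 365)))  # False

The point of the sketch is only the direction of the check: the service's request is evaluated against the person's terms, rather than the person clicking through the service's terms.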
>> MOIRA PATTERSON: That's an important point, John, to give people these tools. I think it aligns with another thought that is very important and that we talk a lot about in the DIITA program, the Digital Inclusion, Identity, Trust and Agency program, which is dignity and the human‑centered focus, right? The technologies that we develop should focus on enhancing people's experiences and enhancing people's lives, making things better and easier. And that includes ‑‑ I think you said invisible ‑‑ there are invisible risks. So sometimes we may feel something gets better, but there are invisible risks that we're not aware of. I think it's very important to address those things upfront. That's one of the things we're trying to do in our different programs.
So thinking about designing these technologies with certain values in mind, giving people real control, and treating people with dignity, I think, is a key aspect that I want to add.
>> JOHN HAVENS: I agree. And one thing, you know ‑‑ by the way, Wolfgang just asked a great question.
He said, isn't it a question of ownership and what happens with my data?
These are very nuanced questions, depending on the audience. And Kaliya is an expert who created this space. The people here ‑‑ I don't know what level of expertise you have. I'm going to speak for myself and the work I've represented at IEEE. I'm building off your idea of dignity.
When I talk to my friends that are not technical experts ‑‑ and, by the way, I'm sure that all the attendees here are experts in one area or another ‑‑ sometimes I think we don't have to be that complex.
Like, a buddy of mine created a company called Gliimpse. It's in the United States; he started it about six years ago. What he did is his tool let people get their health data from multiple different places ‑‑ hospitals, their doctors, whatever ‑‑ and the patient had to give permission for data to come from those different places.
Five or six years ago in the States, HIPAA ‑‑ the regulation allowing people to just even access their data ‑‑ made it so that we could access our data as Americans.
The challenge was that sometimes the hospital would have XML data, and a doctor would have whatever ‑‑ a fax. So the data was available, air quotes, but it wasn't usable by the patient.
So what my buddy did with the company he created: when you gave him permission, through triple encryption and all that, when you went from a doctor to a hospital, you were able to get the data ‑‑ instead of on a paper form or a fax ‑‑ into an app. I know you asked about ownership, Wolfgang, but the thing that I always took from Anil, when I worked with him ‑‑ I did PR consulting with him ‑‑ was the word "portability," having access to the data. I go to the doctor and they hand me this. You can't see it; maybe it's a piece of paper. It's portable until I rip it or it gets burned. Right? But if I was able to put it into an app, it becomes lifesaving, because I can take it into the app and give it to my next doctor and it aggregates. So the portability is critical. So the word "ownership" is pretty complex.
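As a rough sketch of the portability idea described here ‑‑ the record formats, field names, and consent flag below are invented for illustration and are not how Gliimpse or any real health-records system works ‑‑ the core move is normalizing records from differently formatted sources into one patient-held list:

    # Illustrative sketch only: aggregating health records from sources that use
    # different formats into one patient-held, portable list.
    import xml.etree.ElementTree as ET

    def parse_hospital_xml(xml_text: str) -> dict:
        """The hospital exports XML; normalize it to a plain dict."""
        root = ET.fromstring(xml_text)
        return {"source": "hospital", "date": root.findtext("date"), "note": root.findtext("note")}

    def parse_clinic_csv_line(line: str) -> dict:
        """The clinic exports a simple comma-separated line; normalize it the same way."""
        date, note = line.split(",", 1)
        return {"source": "clinic", "date": date.strip(), "note": note.strip()}

    def aggregate(patient_consented: bool, *records: dict) -> list:
        """Only aggregate if the patient has given permission."""
        if not patient_consented:
            return []
        return sorted(records, key=lambda r: r["date"])

    hospital_record = parse_hospital_xml("<visit><date>2020-03-01</date><note>X-ray</note></visit>")
    clinic_record = parse_clinic_csv_line("2020-05-12, follow-up exam")
    my_portable_record = aggregate(True, hospital_record, clinic_record)
    print(my_portable_record)  # the patient can now carry this to the next doctor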
I'm going to get the nuances wrong that people like Kaliya would get better than me. I'm a dad, right?
If my child has an illness ‑‑ cancer, God forbid ‑‑ and I have to go to each of these different places, dignity is a great word to think about, but so is the quality, the life‑saving need, the tool. I control the tool. I want to be clear: with Gliimpse, I still pay my doctors. I still own my insurance. I don't pay for their intellectual property; they should have the intellectual property. But can I take the data to the next doctor and hold it up quickly? And then what do I get to do? I get to talk to the doctor about my child.
So I use that analogy a lot because talk about data can get pretty esoteric. Also, my friends are sometimes like: Who cares? The horse has left the barn. I tell people about my shoes and my feet. I don't care. I have nothing to hide.
The thing I say here is, Well, first of all, this is not about hiding. This is about accessing. It's about portability. There may be times that you want to know how you're going to share with certain people. This is not hiding. This is now curating. Right?
As a parent, I sometimes have to give medical data to a teacher at the school. I trust the teacher, but I don't trust a certain brand.
Sometimes I tell this story, and they're like, who cares about Gliimpse? Well, they were acquired by Apple. My friend's company was bought by Apple. Obviously, it must be something useful for patients and citizens to have portability.
Anyway, that's my answer. I will try to keep it shorter. I know it's Friday, and we talked about being short.
>> MOIRA PATTERSON: Thanks for that, John. It's a very relatable example.
One other thing I want to touch on in our shortened version is the role of standards. John and I both work for the standards association within IEEE. The role of standards in helping to create these ecosystems and tools, I think, is really a very important one that we need to, you know, help promote and educate about, because standards are building blocks that can make best practices more accessible to all the different actors.
And, also, one of the spaces we're increasingly looking at is age‑appropriate technologies. We see technologies that are intended for use by children but that have complex terms of service, that collect data about the children, and other things like that.
There are a lot of questions about that, as you all know. There are regulatory movements: in Britain, for example, they've started with the Age Appropriate Design Code. In other places, they are looking at similar things.
Technical standards are one way that we can really help translate some of the protections and some of the principles into implementable solutions and make sure that all developers have easy access to that.
So I don't know how many of the attendees are involved in standards or are regular users of standards, but I think we always evangelize the importance of that because it's open and it's accessible, and it's a way that shared knowledge can get into outputs that really help others apply the same learnings, apply the same guidelines.
(Captioner will disconnect in two minutes)
>> MOIRA PATTERSON: I just wanted to highlight that and also mention that IEEE has a lot of standards in these spaces, some under development, some out there being implemented. We are looking at standards that are applicable to data governance and to data processing, as well as, for example, to age‑appropriate terms of service ‑‑ you know, guidelines for people who make services for children on how to develop those in an age‑appropriate manner.
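As a hedged sketch of how such guidelines can become something a developer actually runs ‑‑ the rule names and thresholds below are invented for illustration and are not taken from any IEEE standard or regulation ‑‑ an age‑appropriate checklist can be expressed as simple checks over a service configuration:

    # Illustrative sketch only: turning age-appropriate design principles into
    # checks a developer could run against a service configuration.
    SERVICE_CONFIG = {
        "min_user_age": 8,
        "profiling_enabled": True,       # behavioral profiling of users
        "default_privacy": "public",     # default visibility of a child's profile
        "terms_reading_level": "adult",  # reading level of the terms of service
    }

    def age_appropriate_issues(config: dict) -> list:
        """Return a list of human-readable issues; an empty list means no issues found."""
        issues = []
        if config["min_user_age"] < 18 and config["profiling_enabled"]:
            issues.append("Profiling should be off by default for child users.")
        if config["default_privacy"] != "private":
            issues.append("Privacy settings should default to 'private' for children.")
        if config["terms_reading_level"] != "child-friendly":
            issues.append("Terms of service should be written at a child-friendly reading level.")
        return issues

    for issue in age_appropriate_issues(SERVICE_CONFIG):
        print("-", issue)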
If you're interested in those types of activities, please look us up, and I will put my email in the chat as well.
John, I don't know if you want to talk about any of the standards or related activities.
>> JOHN HAVENS: Oh, sure.
To echo what you just said, Moira, I just put in the chat a working group I was proud to help get formed. And by "help get formed," I mean that part of the role Moira and I have is, when someone has an idea for a standard, we can bring it to the Standards Association and sort of help shepherd it to becoming a working group. But it's the volunteers who come with the ideas and the working groups themselves that actually do the work.
I saw two women speak at a university about Indigenous data, meaning First People's data. They were brilliant, these two women. I said, look, this is something you would want to think about having a standards working group, and they did, which was awesome. So I just put information about that standards working group in the chat.
Part of the reason I find their work so fascinating is I use the term "First Nations." At least in the United States ‑‑ and this is where I may be ignorant, so forgive me if I get some of this wrong ‑‑ the sovereign nations of the different Indigenous Peoples within North America are just that: they are sovereign nations. So in terms of how their data is being accessed and exchanged, again, this working group is doing foundationally important work, not just for the First Nations people that they represent but also for how data can be exchanged when a certain group of people want to exchange it. They're doing really fascinating work. I have no formal association with them other than I think they're brilliant.