IGF 2017 - Day 1 - Room XXVI - WS57 A Playbook for Gender Equality: How to Harness the Power of Digital Media and Emerging Tech

 

The following are the outputs of the real-time captioning taken during the Twelfth Annual Meeting of the Internet Governance Forum (IGF) in Geneva, Switzerland, from 17 to 21 December 2017. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid to understanding the proceedings at the event, but should not be treated as an authoritative record. 

***

 

(captioner standing by for audio).

>> VALENTINA HVALE PELLIZZER:  ‑‑ I hold a Western European passport, but there are many other bodies, the majority of the bodies of the world, and what they share is the intersectionality of discrimination, many discriminations: because of their sexual orientation, because of their class or (?), because of their language, because of their skin color.

My work in the last six months was around the LGBTIQ community, doing research, the EROTICS research in India, Nepal and Sri Lanka, and the survey talked about the discrimination and censorship that people experience.  And everything is really related to that, because how does our body become digital?  What is our body in the digital?  Our bodies in the digital, in technology, are just that, data, and this is a continuum.  Data is political because data, in any kind of information‑management theory, is the raw material that someone makes sense of.  When you make sense of it, then you classify it, and then you use it as evidence, or you use it for deciding about policy.

But bodies are not equal, and the data of these bodies are not equally privileged, so what decisions will be made in their name?

So, I think if we talk about equality, we need to understand that the majority of the bodies that are online have not been able to be part of the conversation.  They have just been defined by someone else.

And there are bodies that are invisible and bodies that are visible, but everything is about the norm, because classification, making sense of data, is about the power of naming things.

So, if I name the sexual (?), as the EROTICS research in India shows, a classification done in the criminalization of acts on the Internet: what are the sexual things?  Who are the sexual freaks?  And the variation: who are the freaks and who are the geeks?  There are also the geek actors; what interpretation are they enacting?  And if we look at bodies, we know that the female body and the gendered body do not hold the same share of power.

And so, if we build all this intersectionality into the continuum, what are the emerging technologies, artificial intelligence, machine learning?

And who is at the beginning of those technologies?  Humans, with their biases, humans with their power, humans with their specific economic models.  We need to be aware of all this and see the continuum: the body is data, and data is political.  In this circle, we really need to understand how we can make sure that the issues of data are addressed from a human rights perspective, because the bodies made hyper‑visible, and then punished because of their diversity, really have no access to rights.

And the bodies that are invisible because of privilege become visible when that privilege is dismantled.  I'll stop here for now.

>> TARA DENHAM:  Thank you.  The next speaker is Dhyta Caturani, who works on issues of human rights, social justice, civil liberties, and violence against women.  She is based in Indonesia.  How do you assess gender equality in digital media and ICTs today, and what do you consider to be the most crucial issues?

>> DHYTA CATURANI:  Good afternoon, everyone.  My name is Dhyta.  Because English is not my first language, I've written a few talking points and I'm just going to read them.

If you ask me how we could assess gender equality within ICTs and digital media, I think we all know the data, we know the statistics, and they show that the gender divide is still very, very large, so I want to talk about the cultural aspects that hinder women from using ICTs and digital media.

So of course, we all know, as in all domains, it is very (?) in the political system.  From the beginning, women have less access to education, which makes it hard to access and understand tech, especially when tech itself is not built with women in mind from the beginning.

For example, you know, algorithms are built with bias, like what (?) has said.  I read a story from 2015 about a (?), a pediatrician in England, who signed up for a gym, but her gym membership card could not open the women's locker room.  When she complained about it and the gym tried to find out what had happened, it turned out that the software that managed all the members' data had coded "doctor" as male, so that's why she could not access the women's locker room.
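What likely sat inside that software is a simple hard‑coded mapping from title to gender.  The following is a minimal illustrative sketch of how such a rule fails, not the gym vendor's actual code; the table, function, and names are hypothetical:

```python
# Hypothetical sketch: a title-to-gender lookup driving an access decision.
# The faulty assumption is baked into the data, not the access logic.

TITLE_TO_GENDER = {
    "Mr": "male",
    "Mrs": "female",
    "Ms": "female",
    "Dr": "male",  # the biased assumption: "Dr" hard-coded as male
}

def locker_room_access(title: str, requested_room: str) -> bool:
    """Grant access only if the gender inferred from the title matches the room."""
    inferred = TITLE_TO_GENDER.get(title, "unknown")
    return inferred == requested_room

# A female pediatrician who registered with the title "Dr" is denied entry:
print(locker_room_access("Dr", "female"))  # False, despite the member being a woman
```

The access check itself is neutral; the discrimination lives entirely in one line of reference data, which is why such bugs can go unnoticed until someone is locked out.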

But also, there is a study reported in JAMA Internal Medicine in March 2016 saying that the artificial intelligence on Apple, Samsung and Microsoft phones didn't understand the word "rape," so basically, Siri and friends were not programmed to help women in crisis.

And then there are the cultural norms that shape the perspective that tech is a man's thing, which in many ways creates some kind of tech (?) around women.  I have seen women around me, even educated ones, always say, you know, "I don't know anything about tech."  They use their smartphone or laptop as minimally as possible, just as much as they need, and even when they try to install things or update software, they will ask their male counterparts to do it for them.

Thus, although a lot of women now use ICTs and digital media, their number is still smaller than men's, and it is still very heavily urban, educated, and middle‑class‑centric.

I have worked with women farmers for the last four years, fighting against a cement factory.  They never used smartphones, just those old phones that can only make a call or send text messages.  Why?  First of all, the family economy: they don't really have the money to buy that kind of precious thing.  And even if they do have a little money for it, gadgets like cell phones, laptops or computers are not seen as things women need, because of gender stereotypes, so those women will not be able to access them.  They will not be allowed by their husbands or their sons to buy those things.

So, the way my friends and I worked with them was to try to build this new culture within the movement, because they really need it.  It was the women who were fighting on the front lines against the cement factory, but their voices were not heard by the world.  It was always the young people, especially the younger men, who owned cell phones and laptops, who were heard by the people outside the village, so it was not (?), of course.

Then, when women do get online, there is this misogynistic behavior online, which we all know, that exposes women to violence.  If they are online for their political views, they are attacked not on the basis of those views, but on the basis of their gender, sexuality, or physical ability.

I think another huge problem is the obsession with women's bodies, which makes governments and others become the gatekeepers of women online, driven by porn panic and, in some places, religious norms.  In Indonesia, for example, those norms are so strong that the censorship and filtering further hinder women's access to ICTs, and a lot of the time even silence women themselves.

So, these many layers of power are the most critical issue to me.  This power needs to be challenged and addressed, including by engaging women from the start, for example in building the infrastructure.  If we do that, it will increase the potential for more women to have meaningful access to ICTs and digital media, and then they can use them to empower themselves.  I think I'm going to stop there.  Thank you.

>> TARA DENHAM:  Excellent.  Thank you very much, Dhyta.  Okay.  The next speaker is Irene Poetranto, who works at The Citizen Lab in Toronto, notably on Internet censorship, filtering and surveillance.  For Irene, we asked: what effect do cyber policies have on women and girls?  What do digital and emerging technologies hold for gender equality if we do not address some of the trends we're seeing today?  And how are Internet censorship and filtering in some states affecting women?  Irene?

>> IRENE POETRANTO:  Hi, good morning, everyone.  Thank you to Global Affairs Canada and APC for putting this panel together.  As was said, my name is Irene.  I work with The Citizen Lab at the intersection of cybersecurity and human rights, and we conduct research on topics like Internet censorship, filtering and surveillance.

I was given a number of questions, such as what effects do cyber policies and regulation have on women and girls?

As was mentioned earlier, we know that women and girls tend to lag behind in terms of gaining Internet access, and cyber policies and regulation therefore have the potential to advance their access to information.  However, our research has found that they can also make them more vulnerable.

For instance, our research has documented the use of malware against civil society groups and activists, some of whom are women, as well as the implementation of overly broad Internet censorship policies that result in the blocking of websites belonging to women's groups.

I'm going to talk about a series of reports that made the front pages of The New York Times on NSO Group, an Israeli cyberwarfare company.  They sell spyware; Mexico, for instance, has purchased about 80 million dollars' worth of their Pegasus spyware.  Pegasus infiltrates cell phones and monitors the details of the owner's life: calls, contacts, calendars.  It can be used to turn on the microphone and camera for surveillance, turning the phone into a personal bug.  It is sold to governments to be used against extremists, terrorists and criminals, but an investigation by The New York Times and The Citizen Lab found it was used against academics, lawyers, journalists and their family members, including a teenage boy.  A few of those targeted were women, including the journalist Carmen Aristegui; Stephanie Brewer and Alexandra Burke, both civil society members; and Karla Salas, a Mexican lawyer and rights defender representing the families of Mexican journalists killed in Mexico City.

For the second example, Internet censorship, we have looked at a number of companies that manufacture and sell Internet filtering products to block objectionable content.  Among them are Blue Coat, an American company based in California; SmartFilter, another American product, from a company based in Minnesota; and Netsweeper, a Canadian company based in Ontario.

And what we have found is that, based on how this Internet filtering software is configured, it has blocked legitimate content involving women.

So for instance, gandhiwomen.org was blocked.  A site for Filipino female migrants was also blocked.  A website belonging to a Filipino American women's association in the United States was blocked.  As a result, we should not ignore the large number of sites that are blocked in censored countries simply because they have been misclassified by the manufacturer of the blocking program, or by whoever maintains it, because these misclassifications have real implications for women's organizations.
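To give a sense of how such blocking is detected in practice, here is a rough sketch, and not The Citizen Lab's actual methodology: probe a list of URLs from inside a filtered network and flag responses that carry a known block‑page fingerprint.  The fingerprint string and test URL below are hypothetical placeholders.

```python
# Hypothetical sketch of block-page detection from inside a filtered network.
import requests

BLOCK_PAGE_FINGERPRINT = "This site has been blocked"  # hypothetical marker text

def looks_blocked(url: str) -> bool:
    """Fetch a URL and check the body for a known block-page fingerprint."""
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        return False  # network error: inconclusive rather than "blocked"
    return BLOCK_PAGE_FINGERPRINT in resp.text

for url in ["http://example.org/womens-rights"]:  # hypothetical test list
    print(url, "-> possibly blocked" if looks_blocked(url) else "-> reachable")
```

In real measurement work, results like these are compared against fetches from an unfiltered control vantage point, to distinguish deliberate blocking from ordinary outages.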

And so, what are the implications?  We found in our research that women are often targeted using tools that governments purchase in the name of combating crime, violent extremism, or terrorism.  Women are also often used as bait, for instance through female avatars and fake match‑making profiles on websites, to attract other targets.

And, as I pointed out earlier, because women typically lack access to the Internet or knowledge about accessing it, and because they typically lack knowledge of digital security, they can be more vulnerable as targets.

As I mentioned before, we need to be aware of automated miscategorization resulting in legitimate content being blocked.  And in sum, there needs to be better transparency and accountability around how governments purchase, and how manufacturers make, sophisticated tools that risk infringing on human rights.  So, I'll end there.  Thank you.

>> TARA DENHAM:  Thank you, Irene.  Our last panelist is Farhaan Ladhani, CEO of Perennial, a Canadian startup dedicated to understanding the world with data and making it better with digital tools.  Farhaan was asked: what promise do emerging technologies such as AI hold for gender equality, and what are some of the risks related to these technologies?

>> FARHAAN LADHANI:  Thanks, Tara.  I'm happy to be here today.  I'm the CEO of a company that encourages citizens to participate in civic action to make communities and the world a better place.  An important part of that platform is the use of machine learning to encourage people to participate, all people, men and women, and so, recognizing some of the challenges in front of us with respect to machine learning, we had to be really thoughtful about how we were going to develop technology to overcome some of the inherent problems that this environment faces.

I'd like to start by laying out a picture of what some of these challenges are, then end with some potential opportunities that these challenges present, and hopefully kick off a conversation about practical applications.

Whether we realize it or not, algorithmic decision‑making processes now interact with our lives on a daily basis.  Think about your morning.  You got up, you brushed your teeth, maybe using one of those new electric toothbrushes that claim to use machine learning to determine whether one part of your mouth is cleaner than another.  But did the algorithm incorporate the latest data on hypersensitivity, from the latest studies demonstrating a difference in the effects of electric toothbrushes between men and women?

Maybe you turned on the TV and caught a few minutes of news.  The commercials you saw are increasingly the result of programmatic buying that selects content based on a range of user characteristics (that means you), and that includes gender.

You turn to your smartphone, and I know I'm being generous here, because at least half of the people in the room use their smartphone before they brush their teeth.  And if you're like most of us, you probably touch your smartphone somewhere between 1,600 and 2,300 times every single day.

Much of that time is spent on social media and social networks, where the news articles you read are clustered and ranked according to a range of characteristics over which you have very little control.  Your gender almost certainly had a role to play here.

Maybe you made yourself a shake to get a nice healthy start to your day.  I had eggs and sausage, but that's not for this discussion.  If you used the "mega hurricane mixer," a product targeted to men rather than women, you used a product that was by all accounts the same as any other, but the language used to describe it made it more appealing to men, based on a language model with characteristics we'll speak about in a moment.

Maybe you sent out a few resumes; IGF got you really excited about the opportunities you may have.  Gender is being used to screen those resumes, and once you make it through screening, even the next stages are evaluated by assessment algorithms.  You decided to stop at the market before making your way here this morning; you used your credit card, and your score might determine whether the card flexes your available spending limit so you can buy yourself that extra special treat, or whether that loan approval will actually come through.  That's a pretty typical day for people living in North America and attending an event like this, and key algorithmic decisions in virtually every one of those circumstances had gender impacts, whether intended or otherwise.

The same metaphors are relevant for those living outside North America, around the world, as well.  What we're learning from the research is that subtle gender bias, something we heard about a few moments ago, is entrenched in the very fundamentals of many of these algorithmic encounters.  Why?  Because the prominent datasets used to teach everything from language skills to photo recognition for AI and machine learning have been demonstrated to generate gender bias right from the very start.

In one prominent dataset intended to help train programs to understand images, kitchen objects such as spoons and forks were strongly associated with women, while outdoor sporting equipment like snowboards and tennis rackets was strongly associated with men.  The training of computer programs to support these algorithmic decisions is intended to represent the relationships between words as mathematical values.  This makes it possible for machines to generate connections between different words and understand the relationships between them.
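Associations like these can be surfaced with a simple co‑occurrence count over a dataset's annotations.  Here is a toy sketch of that kind of skew measurement; the five annotations are fabricated for illustration:

```python
# Toy sketch: measure gender skew of object labels in an annotated image dataset.
from collections import Counter

annotations = [  # fabricated examples in the style of image-caption annotations
    {"gender": "woman", "objects": ["spoon", "fork"]},
    {"gender": "woman", "objects": ["spoon"]},
    {"gender": "man",   "objects": ["snowboard"]},
    {"gender": "man",   "objects": ["tennis racket", "snowboard"]},
    {"gender": "woman", "objects": ["fork"]},
]

cooccur = Counter()
for ann in annotations:
    for obj in ann["objects"]:
        cooccur[(obj, ann["gender"])] += 1

for obj in {o for o, _ in cooccur}:
    w, m = cooccur[(obj, "woman")], cooccur[(obj, "man")]
    # skew ranges from -1.0 (all men) to +1.0 (all women)
    print(f"{obj}: women {w}, men {m}, skew {(w - m) / (w + m):+.2f}")
```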

Researchers from Boston University and Microsoft Research found that popular datasets used in the construction of many algorithms considered the word "programmer" closer to men than to women, and that the most similar word for women was "homemaker."  As these systems become more capable and widespread, with their utility in every aspect of our lives increasing, this sexist point of view could have incredible consequences.  The real challenge is that machine learning software trained on these datasets doesn't just mirror the biases it picks up; it amplifies them, reinforcing stereotypical relationships over time, like women and the kitchen.
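A minimal sketch of the measurement behind such findings: project word vectors onto a "gender direction" (he minus she) and compare where occupation words fall.  The three‑dimensional vectors below are toy values; real embeddings such as word2vec have hundreds of dimensions:

```python
# Toy sketch of gender-direction projection in word embeddings.
import numpy as np

vecs = {  # fabricated 3-D stand-ins for real embedding vectors
    "he":         np.array([ 1.0,  0.1,  0.0]),
    "she":        np.array([-1.0,  0.1,  0.0]),
    "programmer": np.array([ 0.6,  0.8,  0.2]),
    "homemaker":  np.array([-0.7,  0.7,  0.1]),
}

gender_direction = vecs["he"] - vecs["she"]
gender_direction /= np.linalg.norm(gender_direction)

for word in ("programmer", "homemaker"):
    v = vecs[word] / np.linalg.norm(vecs[word])
    score = float(v @ gender_direction)  # > 0 leans "he", < 0 leans "she"
    print(f"{word}: gender projection {score:+.2f}")
```

With these toy values, "programmer" projects toward "he" and "homemaker" toward "she," which is exactly the pattern the researchers measured at scale in embeddings trained on news text.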

Compounding all of this is the complexity of the heuristics and algorithms required to do everything from matching medical students to placements, to matching advertisers and content distributors with their ideal consumers.  That complexity often means that creators pick shortcuts to limit the number of variables they assess, and male data points are often more available online, due to the disproportionate volume of male traffic, and thus easier to infer.

This is a critical issue when it comes to access and use.  The purpose of many of these algorithmic encounters is to predict some sort of outcome, to increase the business efficiency of these products.  The challenge is that the accuracy of a prediction is directly related to the volume of data that exists on which to base it.  Let's think about that loan again.  Where there is a lot of data about certain groups, the predictions on loan default may be reasonably accurate, but what if the data doesn't really exist?  What if there isn't a sufficient volume of useful information?  A predictor in this case might end up flagging individuals for whom data is limited as being at high risk of default, even though they typically pay back their loans.
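The volume problem can be made concrete with a per‑group estimate of the default rate and its uncertainty.  A sketch, with invented counts, using a Beta posterior: with few observations the credible interval is so wide that a cautious lender may simply treat the whole group as high risk, even when the underlying repayment behavior is the same.

```python
# Sketch: per-group default-rate estimates with a Beta posterior (invented counts).
from scipy.stats import beta

groups = {
    "data-rich group": (40, 960),  # (defaults, repayments): lots of history, ~4% rate
    "data-poor group": (1, 19),    # very little history, ~5% observed rate
}

for name, (defaults, repayments) in groups.items():
    a, b = defaults + 1, repayments + 1  # uniform Beta(1, 1) prior
    lo, hi = beta.ppf([0.025, 0.975], a, b)
    print(f"{name}: estimated default rate {a / (a + b):.1%}, "
          f"95% interval [{lo:.1%}, {hi:.1%}]")
```

The two groups look almost identical on their observed rates, but the data‑poor group's interval is several times wider, and a risk threshold applied to the upper bound penalizes that group alone.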

The trouble increases, when it comes to gender, where access intersects with data availability: the lack of data may inadvertently lead to injustice or prejudice, because women have less access to the tools that create data and thus less predictable outcomes.  But all is not lost.

These are certainly some of the risks, but there are also many opportunities to consider.  Recognizing the gaps I noted is a big step forward, and many people are now gripped with reducing the gender bias inherent in many of the tools and platforms we interact with daily.  There are a number of companies leading the charge.  Biases that were longstanding well before algorithms became a subject of discussion can now be explored, observed, and corrected.  In the process of generating greater awareness, we may find opportunities to demonstrate inherent social biases.  We have seen the benefits of the use of augmented reality for those at risk, and it might be useful for gender as well.

The treatments for limiting the ill, and oftentimes unintended, effects of biased algorithms have the potential to generate greater social inclusion at large.  If we think about inclusion right from the start, we have the opportunity to engage and include people who may be harmed or negatively affected by products and services right from the design phase, something that's critical to limiting bias from the outset.

Other approaches may also point us to the mandatory inclusion of ethics in computer science and other STEM‑related fields of study.  Finally, if we increase recognition of this, the platforms, products and services we interact with every single day can be improved by requirements that their algorithms be open to active exploration by researchers, so that we can better understand their impacts, particularly with respect to gender and marginalized communities.  The less we know about how algorithms work, the more profound their impacts will be, and the inverse may also be true.
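As a sketch of the kind of external audit researchers could run if given such access, one simple check compares a system's positive‑outcome rate across gender groups.  The function and records below are fabricated for illustration:

```python
# Sketch of a demographic-parity audit over a model's decisions (fabricated data).
def approval_rates(records):
    """records: iterable of (gender, approved) pairs; returns rate per group."""
    totals, approved = {}, {}
    for gender, ok in records:
        totals[gender] = totals.get(gender, 0) + 1
        approved[gender] = approved.get(gender, 0) + int(ok)
    return {g: approved[g] / totals[g] for g in totals}

sample = [("f", True), ("f", False), ("f", False),
          ("m", True), ("m", True), ("m", False)]
rates = approval_rates(sample)
print(rates)                                   # {'f': 0.33..., 'm': 0.66...}
print("parity gap:", abs(rates["f"] - rates["m"]))
```

A large parity gap is not proof of unfairness on its own, but it is the kind of signal that tells an auditor where to dig deeper.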

Large organizations are coming to terms with the need to understand the biases in the products they create.  Encouragingly, companies like Microsoft and Google have provided resources to understand and ameliorate these impacts.  A key challenge for us collectively is to engage other stakeholders, including the companies that develop and apply machine learning systems, to limit algorithmic bias, and I look forward to digging into these and other issues during our breakout session.  Thank you.

>> TARA DENHAM:  Thank you, Farhaan.  So, over the last four speakers, we've gone through a narrative about how to translate the continuum of our physical body into online spaces and what that means in today's world; opened up thinking about addressing the cultural and social norms that define women's use of, and reflection within, ICTs; pushed into cyber policies, the applications that are developed and the regulations around them, and how they are actually applied in reality; and started to think about emerging technologies, and our toothbrushes (Laughing).  But really, what are these algorithms, and how are they amplifying the biases that exist?  And thank you, Farhaan, for starting to point out some of the things that can be, and are starting to be, done about it.  A lot of the conversation in these forums, as I said, usually ends with people saying, yes, and what can we do about it?  We wanted to devote the bulk of this session to that part of the conversation.  I want to open it up for a minute or two to see if there are clarifications or questions about what was said on the panel; otherwise we'll move into the next portion.  Is there anything that was said that people wanted to clarify?

Excellent.  Clear as everything.  Perfect.  Okay.  So, for the next part: what we have done over the last two months is send out a global survey.  I think it went out back in October, asking people what ideas they had that could contribute to a more gender‑equitable digital future, and asking respondents to give concrete action items that could contribute to a playbook.

We've received those results, and during our analysis we were trying to figure out how to frame them, because they were the basis for this conversation.  We wanted to get some opinions, then try to validate them here at the IGF and build upon them.

What was interesting, and what came out in the session before this as well, is that people talk about the different players: civil society, activists, government, the private sector, always thinking about which role each one has.  There was a comment in the previous session that, in fact, in a lot of the areas where action has to happen, whether it be education or addressing cultural issues, all of the players have a part to play, but they have to figure out what their role is and how they can be complementary to each other.  That's actually what came out in the survey.  We tried to break it out into sections, these actions recommended for the private sector, these recommended for government, and we couldn't dissect it that way.

As it came out, there were four main themes resonating in the proposals.  One was around access: ensuring equal access to and inclusion in digital technologies.

The second was around culture: overcoming traditional thinking and deconstructing social barriers, which, again, was raised this morning on the panel.

The third was education: how to improve education and training, which has come up a number of times.  And the fourth was governance regime: building an international framework; what does that look like?

So, what we'd like to do is break into four groups, and each group will tackle one of those areas.  We're going to do that for 30 minutes.  Each of the panelists will chair one of the groups, and in those groups you will get to see the thoughts that came out of the global survey.  There are, I think, about 5 to 10 suggested action areas in each of the four areas of engagement.

We're going to invite you to validate and/or add to those action areas, the assumption being that there are a lot of people in this room with a lot of knowledge, either first‑hand experience of things you've done or ideas about what you would like to see done.  So, we would like to invite you to build this playbook with us, and we will also be asking people who want to stay involved to share their contact information.  This playbook will be going out on Google Docs after the holidays so we can continue to invite people to build on it before we release it, so for those who want to continue to be engaged, please share your contacts.

What we'll do now is break into the groups for 30 minutes.  The facilitators will each walk you through what was distilled from the global survey, we will have a quick rotation at the end where you can go and see what was put in for the other areas of engagement, and then there will be a closing part of the session.

So here is how we'll do it.  The first area is Access: at the first table, Group Number One, that is Farhaan ‑‑ no, sorry, Hvale, with Access at Table Number One.  Table Two is Culture, with Dhyta.  Table Three is Irene, on Education.  And at Table Four, the back table, Farhaan is doing Governance Regime.  If one of these areas is the one you find most intriguing and would most like to contribute to, please go to that table.

Number one, access; number two, culture; number three, education; and number four, governance regime.

(breakout groups off mic).

(silence).

>> TARA DENHAM:  Hi, everyone.  Sorry to interrupt.  We have about 20 minutes left.  For those interested, if you want to stay in the group you're in, that's fine; you can keep the conversation going.  I also invite people to move to one of the other groups, if they want to contribute there as well, just for about 10 minutes, and then we're going to wrap up.  I'll just remind people of which groups are where.

Access, which is ensuring equal access to and inclusion in digital technologies, is Table 1.  Culture is overcoming traditional thinking and deconstructing social barriers.  Table 3 is Education, improving education and training.  And the fourth group, at the back, is Governance Regime, building an international framework.  So, if you would like to move to one of the other groups for about 10 minutes to provide any insights, that would be great, and then we'll close out the session.

(breakout groups off mic).

(silence).

>> TARA DENHAM:  Okay.  I'll give just one more minute to wrap up.  Thanks.

All right.  If everyone can wrap up.  We're not going to do a readout of the sessions.  If I could get everyone to just stand around this middle table, I'll tell you what we're going to do, and then that's it.

(laughter).

Get excited!  We're going to do the wave.

(Speaking off mic).

(silence).

(beep).

(session completed at 6:25 a.m. CST)