The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> MODERATOR: Hi, we will just take a few more minutes. The camera is being set up. Thank you for being patient.
Good morning, everyone. So I'm going to start with a brief introduction about DEF. With the model of inform, communicate, and empower, the Digital Empowerment Foundation is a registered not-for-profit whose effort has been to find sustainable information and communication technology solutions, including digital and new media, to address the digital divide in underserved and underreached regions and communities.
Since our inception in 2002, our broad areas of engagement have evolved. As an organization established in the early 2000s, well before the expansion of internet infrastructure and digital equipment, our initial focus was on connecting unconnected regions in India through wireless hotspots and innovative, low-cost internet infrastructure.
Over time we have been focusing on bridging the digital gap in India, not just in unconnected areas but also in underconnected regions that are unable to meaningfully use the internet and internet services.
A recent area of intervention concerns the problems emerging from all-encompassing data injustices, which pose a challenge to the holistic empowerment of individuals, groups, and communities.
From inception to design to implementation, data justice requires critical engagement and critical dialogue. We at DEF recognize the need for a holistic approach to access, agency, and therefore empowerment. Today's report launch, based on a comprehensive study conducted by DEF on data justice in India, is a reflection of that.
Today we have with us Dorothy Gordon as our esteemed chief guest. She is the Chair of UNESCO's Information for All Program, a board member of the UNESCO Institute for Information Technologies in Education, and associated with JIPAC.
And then we have our speakers, the authors of the report. We have Ananthu RA, a researcher at DEF, who is going to open the discussion. Then we have Jenny Sulfath, a researcher at DEF, and Osama Manzar, the founder and director of DEF. Then we have Vineetha Venugopal, a consultant at DEF, as the rapporteur of today's discussion. And lastly, I am a researcher at DEF and I'm going to moderate the discussion. Now I would ask Dorothy to speak a few words.
>> DOROTHY GORDON: Thank you. Thank you. And let me start by saying to everyone namaste, good morning, good afternoon, good evening, wherever you are joining us from around the world. It is really a great honor for me to launch this extremely interesting report.
I believe that all of us are aware of the massive datafication of society that is taking place at the moment. And we know how important knowledge is in terms of achieving the Sustainable Development Goals. But what we have observed over the past years is that technological developments do not necessarily mean that we have reduced vulnerability. In fact, in some cases we actually increase vulnerabilities.
It is for this reason that I find DEF's increased attention to this area of data and especially how it affects the most vulnerable to be extremely important for all of us to pay attention to. We have to pay attention to this because it is affecting communities all over the globe. And we can see it even reflected in the Global Digital Compact that the Secretary General has come out with as well as, of course, the road map.
So we see that awareness growing in terms of the importance of data for achieving the Sustainable Development Goals, and of the fact that we cannot take for granted that new developments will reduce vulnerability. And at the heart of this, at the heart of the DEF approach, is the recognition that our policy approach must be grounded in the internet universality principles. That is, it must be rights-based. It must be open. It must be accessible to all and nurtured by multi-stakeholder participation.
And so really this publication is allowing us to take a deep dive into how data is impacting some of those principles that all of us have espoused at the level of the UN.
Data justice plays an extremely important role in recognizing what could go wrong and in making sure that we take the appropriate steps, so we must look at how data is being designed, used, and implemented. Let me just mention two quick examples; there are many such examples in the publication.
First of all, you could say that because of the pandemic there has been a lot of hastily planned implementation of solutions, for e-government for example, as well as in education, all around the world. I am not Indian, I'm African, and so I'm speaking from my African experience. But after reading this wonderful book, I can tell you that there are some good examples from India as well.
So when we do things in a rush, we are not always able to map out all of the implications of what could go wrong and who would be affected. And what we see consistently globally is very often people who we already know are vulnerable such as people living with disabilities or marginalized groups are further marginalized by the implementation of these solutions.
In education, we see that more and more parents are being encouraged to go for solutions by themselves outside the school system and very often they do not realize what is happening to their children's data and the implications. For example, a child may eventually not be able to get into the university they want on the basis of all of that data gathered and analyzed and profiled in terms of what kind of student they are.
And that also affects us within the school systems. It is very interesting to see that in some geographies we now have child-appropriate-by-design legislation to protect children from being exploited in terms of their data.
So one of the words that I hope that we will all become more familiar with after reading the report is algorithmic exclusion. I think it is one of the key concepts that is focused on. I want to end by just saying that we can reimagine a better future for ourselves. But that means that we can no longer sit on the sidelines. We have to become far more deeply involved in understanding how these new technologies are playing out in our societies.
I think that the release of this report on data justice in India should be an inspiration for people around the world to produce similar reports and do similar kinds of analyses. And here I throw a challenge out to our academic communities all over the globe to get involved and start doing this kind of research.
Let me just end by saying: rights-based, open, multi-stakeholder participation, accessible. We can do this. And I hope that this engagement will translate into more inclusive and citizen-centered data norms in times to come. I am the Chair of UNESCO's Information for All Program, and we have been looking into these kinds of issues for over 20 years.
I'm happy to say that India sits on the bureau of the UNESCO Information for All Program and is very deeply involved in the shaping of our priorities. I can assure DEF that we will continue to work with them very closely. And let me just say we will continue to focus on the transformative power and potential of data equity and justice. Thank you all.
>> MODERATOR: Thank you so much, Dorothy.
Now for this report launch, I would kindly request Dorothy and the speakers to hold the report up for a quick picture and release it. Okay. So they are online, but we have Jenny and Dorothy here. I would kindly request the speakers to turn on their cameras.
>> DOROTHY GORDON: Osama, let's see your book.
>> OSAMA MANZAR: Can I show like this?
>> DOROTHY GORDON: Excellent, excellent. So I hereby declare this excellent report on Advancing Data Justice Research and Practice, the India Report, launched. And may many benefit from its contents. Thank you very much.
>> MODERATOR: Thank you so much, Dorothy. Okay. So we can start the discussion. So our first speaker is Ananthu RA. I have three questions.
One, how have you looked at data justice in your report? Two, how was the study conducted? And three, can you speak a bit about the insights you gained from talking to people who are innovating AI-based systems for social intervention? Ananthu, you have six minutes. You are on.
>> ANANTHU RA: Hi, I hope I'm audible.
So the work has been introduced, so I will just continue from there. Basically, we continued the work as policy pilot partners of the Alan Turing Institute (ATI). The research team at ATI had, over the course of their work, tried to provide a broader frame for the idea of data justice, not limiting it to the most discussed problems of privacy and security, which are real problems in themselves. They tried to expand it and fill the gaps with their very good research and practice of data justice. So most of our research was guided by their work.
They had identified six pillars of data justice, which are not separate or mutually exclusive but overlap with each other. We are short of time, but the pillars are explained in detail in our book and on the website. Briefly, they are power, equity, access, identity, participation, and knowledge.
Power is about understanding and combating the existing deep-rooted patterns of dominance and structures of power.
Equity is seeing the long-term patterns of inequality and attempting to transform them, to come closer to social justice. Access is access to data, resources, and innovation.
Identity is critiquing imposed identities and exposing binaries. Participation is meaningful representation and inclusion of the world's population. Knowledge is not limited to the dominant version of knowledge but takes a pluralist view that acknowledges varieties of knowledge.
So we worked with several organizations from the global south to expand this scope and definition and to understand issues, so that they are not limited to Europe or parts of the global north. We, for our part, used these pillars to look at specific cases we identified in India, which involved data or AI-related tools, or narratives of exclusion in data-driven digitalization.
We talked to the communities affected, to developers, and to the policy people who sought to implement these. We had conversations with them mostly through interviews conducted digitally, because this happened mostly during the third wave of the pandemic, which limited our ability to meet people in person.
Through the interviews, policy makers mostly tried to defend the choices and explain how they were for larger benefits, while developers talked about the issues they faced while working on these tools. And we also listened to the narratives of discrimination and bias that the communities faced as these tools were rolled out.
To answer your third question, also very briefly, our conversation with the developers had some interesting mixed results.
Briefly, there were several genuinely innovative attempts at using AI for social interventions. For example, there was a pest detection system that tried to detect pests and control their spread, which, going by the statistics, worked efficiently.
However, most developers we talked to were unaware of, and not trained in, the social impact of the technologies they build, how these relate to deep-rooted social problems, and how they affect the public. And some of them wanted the public not to interfere in the technical processes.
Back -- hello?
>> MODERATOR: We can hear you. You are audible.
>> ANANTHU RA: That was mostly it, but I will also point to one example: a software on TB.
There was another software, part of an AI for social good program, which aimed at TB detection, analyzing data to make TB detection possible. They later figured out that the problem was not with the detection of TB per se; there are good programs that detect TB, so there is no issue there.
But the problem is something more deep rooted social where there are no welfare policies that enable people to get nutritional food or resources to help them recover once they are detected with TB or to enable them to continue getting medicines once they have TB.
So what the developers needed was a different, social lens to understand this.
That's mostly what I wanted to say.
>> MODERATOR: Okay. Thank you so much, Ananthu.
Next we have Jenny. For you, Jenny, I have two questions.
First, in India, the question of statelessness has become a prominent issue with the centrality that the National Register of Citizens has assumed, especially in recent years. In that regard, can you first briefly talk about that, and then about how data justice becomes relevant there? That is one.
Two, we also have a significant population who are homeless, often migrating. Can you briefly discuss why data justice becomes relevant in that context? You have six minutes, and you are on.
>> JENNY SULFATH: Thank you. So I will start with the issue of statelessness first, and then how it is connected to the main pillars of data justice, which are power and identity.
The exercise of the National Register of Citizens (NRC) in Assam was done twice, once in 1951 and then again starting in 2013, with a draft list published in 2018. To be recognized as a citizen under the NRC, you have to trace your legacy back to the 1951 NRC in Assam, or to the voters' lists from 1961 to 1971.
I will give a brief overview of the historical context of the National Register of Citizens in Assam.
The so-called Bengali immigrants were brought to Assam from East Bengal by the British during colonialism, as part of a project to grow more food, because the land in Assam was fertile and they wanted to bring in workers. The workers brought to Assam were mostly Bengali Muslims.
And this particular context has a lot of relevance if we are looking at the data collection which is related to NRC. It also has to be noted that a lot of Bengali Muslims who came to Assam were also settled in the river islands which had seasonal floods and they were a shifting population.
The first set of exclusions happens here: as I said, your entry into the NRC is through the 1951 NRC, which was actually not conducted in many parts of Assam. And even where it was conducted, the state lost the data. So even when people had documents to prove they were in the 1951 Assam NRC, the government did not have the data.
Secondly, a multinational corporation was contracted to deploy software to collect and sort this data and to come up with an algorithm, a family tree algorithm.
Now, interestingly, in this family tree algorithm, everyone tracing their legacy from a particular ancestor -- for example, if I am tracing my ancestry from my grandfather, then my cousins, my nephews, everyone in that entire family -- needs to have their names matched, their addresses matched, even the spelling of their names matched.
And a single error in one of these names puts everyone's citizenship status under scrutiny.
It is also important to note that the people on the river islands of Assam also have the lowest rates of literacy. Most of them had to travel back to their village to get the details of the names and addresses of the people in their family.
It is particularly painful for people to go back to their village who were estranged from the family, who did not have any connection to the family, or who had a chosen family like trans folks.
Because it was difficult, a lot of people were excluded. Another layer of exclusion happened at this level, too.
Now the third layer of exclusion happens through the D-voter (doubtful voter) list made by the Election Commission, where people were marked as doubtful voters. Even if you have a registration in the NRC to prove citizenship, if you are on the D-voter list you cannot claim the status of citizenship.
Our respondents also reported that the Assam border police, a body constituted specifically in the State of Assam, had a list and the biodata of people suspected of being foreigners or illegal immigrants, and these people were further excluded from the list.
>> MODERATOR: Jenny, you have a minute.
>> JENNY SULFATH: I will just get to the question of homelessness.
Now, the river island population in Assam is a shifting population, and homeless people do not have address proof either. In India, all welfare services are accessed through an ID. Even though it is not required for hospitalization, it is required to register deaths and births.
And citing this reason, a lot of homeless people are not admitted to hospitals. Accessing welfare services also needs an OTP for verification. For the homeless population, phones get stolen, so a lot of them do not have a phone on which to receive the OTP. And even if they can register through the shelters, they have to keep going back to the shelters to access these benefits.
I will also give one example related to the TB case which Ananthu discussed. Many homeless people are TB patients because they don't get the necessary nutrition. They have an allowance of around $10, or 500 INR, to buy nutritious food. But to receive this money they need to have a bank account.
To have a bank account, you need address proof or an aadhar card. What this shows is how data injustice accumulates, and as we go for the automation of welfare schemes and governance, we are actually --
>> MODERATOR: I'm sorry, I will have to ask you to wrap quickly.
>> JENNY SULFATH: That is it.
>> MODERATOR: Okay, thank you so much, Jenny.
Next we have Osama. Osama, okay so for you I have two questions.
One, in the many years that you have been working on critical digital rights issues in India, how do you see something like aadhar fitting into the narrative of digital empowerment?
And two, how does the discourse of data justice threaten the social intervention programs for digital empowerment? Okay, Osama, you have six minutes. You are on.
>> OSAMA MANZAR: Thank you very much. I'm glad that you asked me a much broader question rather than very specific.
I will try to bring in the perspective that digitalization, the process of datafication, and the process of digital identity may have given a lot of efficiency to governments, and even to welfare distribution.
I have heard so many stories on the ground that after UID, many people were able to get their rations much more reliably, rations that might otherwise have been stolen, and things like that.
But at the same time, the purpose of digitization and of data identification is not only to bring in governance efficiency. The idea is that each and every person should be efficiently served.
As Dorothy mentioned earlier, it is very important that we have a citizen-centric datafication process and data policy. Even one person left behind, or one person not getting their right, means it is not a proper data policy or digital policy or identity policy, right?
And so this is the perfect example of how we have adopted datafication and data and digital policies in such a way that we have given more weight to government efficiency than to citizen service efficiency.
I'm sure there are many examples, many very harsh examples. For example, just less than a month back I heard of a case: a pregnant lady with two very small children went to the hospital for delivery, and they said, since you don't have aadhar or UID, we will not admit you. And she lost her life along with her unborn child. Why should this happen?
The second issue is the weaponization of datafication, the weaponization of digital policies.
For example, the UID policy clearly says that if a person has no UID available, they can use any other form of identification -- for example, a ration card or a birth certificate or anything.
But no, the bureaucracy and others start using it as a means of exploitation, as a means of subjugation, as a means of exclusion. The policy has to take responsibility for how it empowers the citizen. And especially in the current scenario, where you are trying to identify each and every thing through data, each and every thing digitally, it is extremely important that you also check whether your population, your citizens, have access to digital identity processes and data identity processes.
Do we have public access points? Do we have a mobile in each hand? Do we have mechanisms, people who can help you get those services digitally? We don't. And that is the reason why, in most digital or data policies, it is becoming more and more difficult to serve the people who have a right but don't have access.
I talked about this yesterday as well: when we are creating policies that are very digital-driven and data-driven, it is very important to keep citizens' convenience at the forefront.
I mean, for example, half of the population in India is not connected. And the same thing with Bangladesh and Nepal and many other countries, right. The people who are not connected, how do you serve them with digital efficiency?
I'm not saying that, you know, digital is not required. But how do you serve them in such a way that you have a very strong public access point with digital enablement and digital literacy and digital skills so that they can serve the people at the doorsteps. That is very, very important.
And now that AI is coming, with most programs driven and written by AI, we have another level of challenge: how do we treat people's requirements? There are many, many examples.
There are five major things that we always think about: public welfare, access to health, access to information, access to education, and access to food.
>> MODERATOR: You have --
>> OSAMA MANZAR: And these must be designed in such a way that data and digital policies actually serve the people with efficiency, without violating their identity and privacy, so that it doesn't become risky. Thank you.
>> MODERATOR: Thank you so much, Osama.
Now we have a very short documentary film based on the interviews taken for this research with the platform workers. I would kindly request for Pranita to share her screen and play the film.
Can you allow Pranita to share her screen, please.
>> JENNY SULFATH: Pranita, are you able to share your screen?
>> PRANITA VARADE: Yeah. Should I?
(Video with subtitles played)
>> MODERATOR: Thank you, Pranita. Okay.
So now, based on the discussion we have had, the insights that all of the speakers shared with us as well as Dorothy, and the film, I would request all three discussants to go one by one and speak about their recommendations: how we can use these insights and employ them in meaningful ways toward the creation of equitable societies and more citizen-centric data policies.
Because of shortage of time, you guys have less than one minute each. Yeah. Jenny, you can go first.
>> JENNY SULFATH: Based on the case I discussed, one recommendation I have is a national-level audit of the datasets that we have. These data systems clearly fail to include a lot of groups in the process of data collection, sorting, or even data analysis.
So before we get into the automation of welfare services, we need to have consultations with the groups who are excluded or misrepresented, against whom the data is biased. It is only through this that we can actually achieve data justice in the context of India.
>> MODERATOR: Thank you, Jenny. Ananthu, you can go next.
>> ANANTHU RA: From the conversations with software developers that I was talking about, what we gathered was an issue of representation, the participation pillar among the pillars.
So there is a problem with the demographic composition of the workforce and of the decision makers, which needs to be addressed, and this needs to be done mostly institutionally.
One recommendation we developed was for ethics committees that can ensure diversity and help tackle this exclusion. Similarly, there need to be courses in software development education -- not just in AI, but across tech institutes -- that include a social sciences component, particularly on data justice. Curriculum-level changes are needed to make development teams aware of the social impacts of the coding work they do, and of how they can change that work to help challenge existing structures of power and create a more equitable world.
>> MODERATOR: Thank you, Ananthu. I would ask Osama to go next.
>> OSAMA MANZAR: I will come at this purely from the bottom of the pyramid. My suggestion concerns what, as an individual, my rights to data are: the right to share my data, the right to use it, the right to know how it is used, and so on and so forth.
So if we are creating a situation where everything is becoming digital, from access to sharing to efficiency in everything, including governance, it is important that, as an individual whose data is being collected, I know what my rights are -- not only at the time of collection, but also whenever the same data is used again and again, by anyone. I would say that is very important.
The second most important thing is how data can be approached from a literacy and education perspective, so that the masses can benefit from data literacy almost at the community level. Here we are actually 20 years late; we should have started working on digital data literacy 20 years ago. So this should be incorporated into policy and into the system.
>> MODERATOR: Okay. Thank you, Osama.
I would like to thank all of the speakers. Because we don't have much time, I will quickly open the floor for any questions from audiences online and offline. Okay.
I think we don't have any questions. In that case, I thank everyone for joining, and all the speakers for being here and contributing to this enriching discussion.
Thank you, everyone.
>> OSAMA MANZAR: Thank you. Thank you, bye bye.