The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
***
>> SUNITA GROTE: Good morning, everyone. Welcome. It is my honour to welcome you to this session on behalf of UNICEF. My name is Sunita Grote, and in this session we will explore how we can make the Internet better for children. I'm joined online by my co-moderator, Josianne.
We will look at bringing diverse perspectives on the unique roles that different stakeholders can play in creating a safe online environment for children.
In the second segment, we'll be looking specifically at how to close equity divides that we still find in the online world and in the design and implementation of digital solutions. We'll be focusing particularly on the gender divide and looking at how to design and implement solutions for women and girls.
For the first segment, I'm going to hand over to Josie who will take us through that discussion online now.
(Pause).
Excuse us while we try to sort out the technology.
(Pause).
There we go. That looks promising.
>> JOSIANNE GALEA BARON: Hi. Thank you so much, forgive the technical issues, it's great to be live now, and it's great to see everybody in the room. Thank you, Sunita, for setting the scene. Let's get started with the first part of this open forum.
As has already been introduced, the first part of our forum today will focus on one key question, which is how can different stakeholders from across sectors play their part in delivering a digital world that works for children. One that keeps them safe, respects their rights and positively supports their well-being.
This task represents one of the defining challenges of our times, with countries around the world grappling with critical questions: how will digital technologies, including generative AI, impact children's lives, positively and negatively, now and in the future? What is the right age for children to participate in online platforms?
Or what emerging risks and dangers await children as they explore new digital environments?
These worries about what happens online speak to real-world harms and challenges experienced by children today that demand urgent action.
Research published by UNICEF earlier this month found that children who experience online sexual abuse or exploitation and online bullying have significantly higher levels of anxiety, more suicidal thoughts and behaviors, and are more likely to self-harm. These are real-world challenges.
Having said all that, meaningful access, as we'll hear more about later, is often critical for opportunity. Our task is to protect and empower children as active participants and pioneers of the digital world, as opposed to protecting them from the digital world.
With that, I'm delighted today to be joined by four distinguished speakers to help us understand and explore these issues. Our panelists will take us on a tour around the world from China to South Africa to Norway and the Netherlands to see also different perspectives representing distinct stakeholder groups from regulators to investors to brands and beyond and explore how these issues are being addressed in different contexts.
We have limited time, so we'll do our best to engage. If you'd like to engage with us, please put questions or comments in the Zoom chat, and we'll do our best to address them at the end of this Part 1 or follow up with you later.
So with that, let me turn to our first distinguished speaker, Ms. Zhao, associate professor and Secretary General of China Federation of Internet Societies, which is a body that convenes the industry under the leadership of the government.
Ms. Zhao graduated with a PhD in political science theory from the School of Government, Peking University. She has many years of public management experience in the field of education and a great deal of international experience as well.
Ms. Zhao, thank you for joining us today. With the world's largest population of child Internet users and a rapidly growing ecosystem, we'd love to hear from you what measures is China putting in place to guide its Internet industry to respect children's rights in the digital age?
Over to you.
>> ZHAO HUI: Hello, everyone. Good morning. It's a great honour to address this forum.
The China Federation of Internet Societies was founded in May 2018. We are honoured to hold special [?] with the UN ECOSOC. Currently we have 528 members, including major Internet corporations like Tencent, [?], and Alibaba. Our goal is to promote the development of the digital economy and informatization, carry out public welfare initiatives, and [?] within the Internet industry.
China has the world's largest population of child Internet users, and we have made sustained efforts in protecting minors online. First, strengthening legal safeguards: in 2020, the Law of the People's Republic of China on the Protection of Minors introduced a dedicated online protection chapter.
In 2021, the Personal Information Protection Law further specified special protections for children's personal data.
In 2024, the Regulations on the Protection of Minors in Cyberspace addressed critical issues such as cyberbullying, data breaches, and Internet addiction.
Complementary regulations, such as the [?] on the cyber protection of children's personal information and the interim measures for the management of [?] services, have been enacted under the guidance of the Cyberspace Administration of China.
Second, enhancing government oversight. The CAC conducts annual campaigns to regulate minors' online environments, particularly during summer breaks, urging platforms to standardize user communities. Leading Internet companies have improved minor-mode functions,
established reporting mechanisms, and adopted restrictive [?] against cyberbullying.
There are also collaborative efforts among public security, market regulation, and cultural and tourism authorities against [?] game account trading and anti-addiction system circumvention.
Third, we have launched public welfare initiatives protecting the rights and interests of minors.
The online safety and digital literacy classes we have produced have received over 10 million views. We have compiled and published the Minors Online Protection Annual Report 2024 and promoted the preparation of national standards such as guidelines for the safety of AI technology involving minors.
We are enhancing the protection of children's rights through industry [?] discipline.
Fourth, deepening international cooperation. China has learned from international best practices and shared its own experience. CFIS partnered with UNICEF in 2024 to launch the safety-by-design corporate case collection activity.
The selected cases were introduced to global Internet companies.
Distinguished guests, building an inclusive future that respects children's rights is our shared responsibility. In September this year, we will hold the international conference on child online protection in China.
We sincerely invite all of you to come to China at that time to share practical experiences and the future prospects of children's online protection with us.
Let us collaborate to create a safe, empowering online environment.
Thank you. Thank you.
(Applause)
>> JOSIANNE GALEA BARON: Excellent. Thank you so much, Ms. Zhao, for clearly and succinctly outlining strategies across a range of different levers and building blocks: the role of building digital literacy skills, the role of lawmakers, and the importance of knowledge exchange and cooperation. Really appreciate that intervention.
Now let's move from China to South Africa. Turning to our second speaker to bring the perspective of a national regulator for online services.
I'm very delighted to introduce Advocate Lindhorst, joining us online, who is an executive responsible for research, regulatory development, registration and licensing compliance at the Film and Publication Board. Welcome to you, Advocate Lindhorst.
South Africa's Film and Publication Board has taken a proactive stance on child online safety through regulatory frameworks, public education, and community outreach. In your efforts to protect children in digital spaces, how do you drive accountability among digital platforms, and how do you provide accessible mechanisms for redress when harm occurs? Very easy question for you.
Thank you so much for joining us. Over to you.
>> MAKHOSAZANA LINDHORST: Good morning and thank you very much for having us.
So as the Film and Publication Board, we're an online safety regulator that is responsible for ensuring that South Africans are protected from online harms.
As a regulator, we have regulatory frameworks in place. Distributors of games and films are required to register with us. That gives us an opportunity to have checks and balances, and through the law we're able to ensure that the measures they have in place ensure online safety for children.
But more importantly, when it comes to social media platforms, as a regulator we ensure that prohibited or harmful content brought to our attention is removed, by issuing takedown notices for prohibited content that might cause psychological harm to children.
When it comes to education and awareness, that is also an area we focus on as a regulator, because building digital skills, just like in China, is our priority: to ensure that South Africans are given digital skills, especially parents. That becomes the most important part.
Because children have a right to responsible parenting, and they also have the right to privacy. Whilst it is outside the mandate of the FPB, as a country we also have privacy laws that ensure children's rights are protected, and platforms are required to comply with those laws.
On a day-to-day basis we have a team of dedicated advocacy officers who go to schools and ensure that children are protected and educated in terms of how they're supposed to conduct themselves online.
We also have toolkits that empower them, including teachers and parents. So that is the work that we do as the Film and Publication Board.
We work closely with law enforcement agencies, especially on issues of child sexual abuse material, where we have a team of social workers, who are also our investigators, working on these cases closely with the police to make sure that we compile the reports that are taken to court and serve as evidence.
We also have a dedicated committee: if someone feels their rights have been affected by online content, they can lodge a complaint with us, and it can be taken to the enforcement committee, which is a quasi-judicial board. What makes it important is that we are not [?] with any other matters through the court process. This is a dedicated quasi-judicial board focusing on issues of online safety and other compliance issues within our mandate.
>> JOSIANNE GALEA BARON: Wonderful. Thank you so much, Advocate Lindhorst, for giving us that very different perspective of a regulator. You used keywords that tease out such important groups of people and professionals who play a role in this: social workers, law enforcement, investigators. These are very important elements for us to think about.
But with that, we're going to shift gears again. When we think about children's online experiences, it's natural to think of regulators, or the responsibility of social media companies and other players in the tech sector.
Our next two speakers will help shed light on the important roles and responsibilities of other players and influencers in the broader ecosystem who have a critical part to play in delivering positive change for children. They will also share about work being done with UNICEF.
I'll turn to Caroline Eriksen, who joins us in the room and heads the social team within the active ownership area at Norges Bank Investment Management, or NBIM. In this role, she leads the work on policy development and engagement with portfolio companies on social sustainability topics, including children's rights.
Welcome to the panel, Caroline. As an investor, NBIM interacts with a large number of companies around the world. What role can investors play in advancing child rights and safety online, and how is NBIM addressing this in the digital space?
Over to you, Caroline.
>> CAROLINE ERIKSEN: I'm delighted to be here. Thanks, everyone, for following in the room and online. NBIM is a global financial investor in almost 9,000 companies in 70 markets, so we own a small slice of most of the companies in the world. As a global investor, we depend on well-functioning markets and long-term sustainable development in an economic, developmental, and social sense.
We want companies to operate responsibly and with a long-term horizon, to ensure long-term returns. As a minority shareholder, we can use our leverage to influence companies and improve market practices.
Child rights in the digital environment is an important topic for us. We are a fund for future generations; about 30% of today's population are children, and they spend an increasing amount of time online.
Failure to respect children's rights can be a material risk for companies. It can entail legal, financial, and reputational risk, and it can affect their license to operate. And it goes beyond digital companies, right?
As Josie mentioned, think about retail, for example. A retailer may not consider itself a tech company, but it may impact children's rights through digital advertising and other parts of its operations.
So it's a topic that's relevant to companies across sectors and across markets.
It's also an area where we see an increasing amount of regulation. I can mention the EU Digital Services Act, or other types of regulation in other markets. This is regulation that asks companies to be open about how they're addressing and managing impacts on children's rights online.
This is why we have entered a partnership with UNICEF over the last few years addressing exactly this topic. Together, BSR and UNICEF have done research on companies' reports, looking at more than 200 reports to see how they address child rights impacts online.
They found that only a few of them actually meaningfully address how the companies impact children's rights.
We also experience this in our bilateral dialogue with companies. They see it as a material topic, but how to address it in reporting is still an area of development.
So UNICEF and BSR together with industry, academia, and Civil Society have developed a set of disclosure recommendations and a guidance that we will soon publish and share with everyone.
In fact, I can give a little teaser today. We will launch this officially on Thursday, the 27th of June. I'm really excited about this. It will be a webinar, and, as you can see here on the screen, you can scan this QR code and sign up to attend the launch webinar.
And really excited about this. Watch this space and thank you so much. Hope to see you there.
(Applause)
>> JOSIANNE GALEA BARON: Thank you so much for that intervention. I think it's so important for us to expand our minds and think about the very influential role that actors who might not be the, quote unquote, usual suspects can play in advancing the goals that we're talking about. And I hope to see all of you joining us in two days, on Thursday, for the launch of the disclosure recommendations, which will also include the research that was described.
So with that, I will move us on to our final speaker, but before I do a quick reminder, if you are joining in person or on the Zoom, you are very welcome to make a comment or a question in the chat and hopefully we will have time to address some of those at the end.
So again, this is something that Caroline gestured towards: when we think about child rights and safety online, the deployers of digital technologies are also key stakeholders, not only those that develop them.
Alexander Galt joins us online. He's the digital ethics leader for Inter IKEA Group, developing digital ethics practice across the IKEA value chain. He has experience embedding digital ethics through his work with the UK Civil Service and Deloitte.
Alex, what does Inter IKEA Group have to do with child rights and safety online? This is a question that I'm sure you get asked in different places. What actions are you taking, and how can other brands play an active role in understanding and addressing their impacts on child rights and safety online?
Over to you, Alex.
I don't think we can hear you, Alex. If we can test the mic. That looks like it's working now.
>> ALEXANDER GALT: Thanks. Thanks, Josie, and good morning to those of you in Norway, and hello to everyone joining online.
Thanks for having me here today at IGF, thank you for the great speakers we've had so far, and thank you to my fellow panelists.
Echoing a lot of what Caroline has just said in terms of being a responsible business: first of all, what does IKEA have to do with child rights? We made the commitment to integrate child rights into everything that we do across all the IKEA organizations. We take responsibility to ensure that children come to no harm as a result of the direct contact that we have with children, and the indirect contact that we have, wherever we do business in the IKEA value chain.
And while coming together in stores is a key part of the IKEA experience, we know that contact with people happens increasingly in an online context. Most of our shopping journeys start online, through the more established eCommerce websites and applications that we have, but also through platforms such as social media and other types of emerging platform engagement, whether that's reselling marketplaces or social gaming.
And where we have direct contact, we put measures in place to protect child rights. So whether that's our policy and processes on child safeguarding, the data protection mechanisms that we put in place when we engage with children when we're developing our products, or for the technologies that we put in the home. So this is our IKEA smart product range where we've taken an inclusive design approach with diverse families to hear the voice of the child in relation to design choices that we should make when we're developing those products.
And as mentioned there by Caroline a bit, marketing and digital marketing I think is a big area where we have contact with children.
We have had a strong point of view for many years on how we portray children in marketing, vis-a-vis the images and the media content that goes out into the world.
We make sure that we show and express children in a respectful way, in a way that brings them their [?] and shows them in the positive light that they need.
So this is largely what's in our control and directly governed within the IKEA value chain. However, we want to take a child rights perspective in the more indirect contact points that we have across the value chain, and that needs a more nuanced approach, based mostly on the relationships that we have with partners.
Again, on this marketing topic, we're very clear on what we require when we are addressing children: there is no direct [?] of children in our commercial messages.
However, we don't just stop there. We know that a lot of child rights harms can also come from practices other than direct targeting, whether that's new monetization techniques like gamification, or new models of engagement such as influencer marketing or content creator marketing, which can represent a new form of child labour.
We know this happens on the platforms we're engaged with, and we want to take responsibility for how that conduct happens.
And brands like IKEA are the companies [?] this digital marketing ecosystem. We buy the media space to engage with people who want to buy our products. That comes with responsibilities and the ability to influence.
And we want to influence, because there is growing knowledge that these interactions have the potential to harm.
So in the partnership that we've had with UNICEF, we've wanted to research these harms and understand how best to counteract them, especially given the complexity of the different actors in the digital marketing ecosystem. It's not just a binary of brands and platforms; there are many actors in the space, many leverage points that we need to engage with.
This allows us to make informed decisions when we're partnering with media agencies and choosing the partners that we engage with. And we pass this through our value chain and through how we govern our brand and marketing operations.
So this is a toolkit that breaks down the responsibilities that each actor in the value chain should take: us, and those that we are giving our money to.
It helps us move from binary thinking, yes or no, good or bad, should we be online or not, to the what, when, and how of engagement. This will be openly available to all organizations who want to use it, but we especially encourage brands, who are at the start of the value chain, to adopt it and work with the organizations they outsource their actions to, so that they adopt the requirements and recommendations and we can uplift the child rights perspective through the whole value chain of social media and digital marketing.
Thanks very much.
>> JOSIANNE GALEA BARON: Thank you, Alex, for rounding off the panel of different perspectives. I think you teased out such an important element that we return to in so many different areas of business and human rights, which is to take a value chain approach: to look not only at one particular actor, but also at the bigger picture of the connections in the industry, the leverage that different actors might have on other players, and the importance of human rights due diligence, which is certainly a red thread that runs through the different resources we've discussed.
And again, I hope to engage with many of you in the room and online around the digital marketing work that we will be issuing later this year as well. So definitely watch this space.
We do have some moments for engagement, so please do engage with us in the chat. Thank you to the participant who is sharing some research that they have; this is wonderful. And if you are in the room and would like to engage, then please also do so in the chat.
I will give a moment or two to anybody who would like to participate. And if no questions, I will very happily hand over to Sunita to guide us through Part 2 of the forum.
Fabulous. It was all very clear; no questions at all. I will hand back over to you, Sunita. Thank you so much, very much looking forward to Part 2 of this forum.
Thank you.
>> SUNITA GROTE: Thank you very much, Josianne. I'm going to ask you both to exit the stage on that side. And we'll move over to the second segment of this forum. We are, again, going to be joined by two panelists online, as well as two in the room. So I'll ask Silje and Lisa please to join me on stage.
So in this segment we're going to keep our focus on the importance, and the concrete actions that various stakeholders can take, particularly to ensure that the digital world, and the value and impact it has to offer, is more accessible and available to those who could benefit the most.
So for this panel today, I'm joined here on stage by Lisa Sivertsen, director of human development at Norad. I'm joined also by Silje Dahl from the Embassy of Sweden in Pretoria.
And online we're joined by two speakers, Tawhida Shiropa, from Moner Bondu, and Annina Wersun from OpenCRVS.
So we're going to focus our discussion further on a group that is often excluded and overlooked: women and girls.
30% of women worldwide are not in education, employment, or training. 740 million women in developing economies remain [?]. 1 in 5 adolescent girls are married before the age of 18. These are some of the wicked challenges and big problems that many of us in the room and on stage are working to tackle on a day-to-day basis.
Digital technologies do in fact hold significant promise to help us address some of these challenges, but we're falling far short of realizing that potential. In fact, we see that solutions to address those challenges are often missing from the market completely. When they do exist, we're not able to access them or scale them, because they often sit behind closed or prohibitive regimes. And when they are there, we often see that they fail to meet the needs of women and girls.
For instance, we know that almost half of publicly documented bias in AI systems is bias against women and girls. We know that only about 2% of medical research funding goes towards pregnancy, childbirth, and reproductive health. If we look at the private sector, only 3% of investment going into digital health is actually focused on solutions that are addressing female health challenges.
Similarly, in private capital, only 2% of investment in 2023 went to companies that are cofounded or founded by women. And we at the UNICEF Venture Fund, which invests in developing countries, really had to learn that the hard way. We realized that when we did investing as usual, we just mirrored what we saw in the private capital industry.
But at the same time, the investment opportunities in women's health in particular are massive. The industry's projected to grow to $1.2 trillion by 2030. So clearly for all stakeholders around the table, there are massive gains to be made, if we start focusing our efforts more deliberately on meeting the needs of women and girls.
So what can we do to close this gap? How can we make inclusive technology for everyone? How can we leverage this opportunity?
So I'd like to start with you, Lisa, please. Thank you for joining us today; I think you had the shortest commute of all of us to get here. We have partnered together for a number of years, particularly around digital public goods, and we cofounded the Digital Public Goods Alliance.
Would you be able to share some of the gaps in your work, where efforts around innovation in your country have failed to meet the needs of women and girls?
>> LISA SIVERTSEN: Thank you, Sunita. I had a short commute, but I was delayed because the trains were delayed. Sorry to everyone about that, on behalf of Oslo.
Thank you so much for inviting us to be here. I think your introduction really highlighted so many of the gaps and structural barriers that women and girls are facing, and I think gender-blind digital tools and infrastructures will really deepen those gaps and disparities, and also the structural discrimination.
So from the Norad and Norwegian side, we are very committed to doing what we can to break down those structural barriers, even though the gaps really do still exist, and even though we also see some very, very negative trends when it comes to gender and marginalization, in terms of access to digital tools but also the failure to design gender-diverse tools.
And I think there is also a lot of research pointing to a lack of trust in these systems, which I think is a big issue for all societies in the world.
What gives me hope is that there are also so many examples of partners and governments working together with the private sector, research institutions, Civil Society, and women's social movements to close those gaps. If I could highlight some of those examples: there are so many. From the Norwegian side and Norad, we try to do what we can to support initiatives and new tools. But we also really try to focus on a systems approach: what can we do to enhance and support global digital public goods, but also the infrastructure that every country in the world needs to invest in to make sure that you have inclusive digitalization.
And so we are so grateful for the opportunity to work with UNICEF and also with UNESCO. We work on engaging more women and girls to get education and work within the S.T.E.M. sectors. I think that's a really crucial investment that we need to make for the years to come.
And we are also really excited about UNICEF's digital education strategy.
Norway has been supporting for 20 years a digital public good infrastructure designed by the University of Oslo. It has a really complicated name, DHIS2, but it's an excellent tool. It was designed as a low-cost, open-source infrastructure for dealing with health data in lower-income countries.
During the pandemic, it proved so easy to use and such a safe tool that it spread, and it was also taken into use by a lot of middle-income and high-income countries, including Norway. We actually started using it ourselves. So that's an excellent and important tool that we will continue to support. More than a hundred countries are now using it.
And then OpenCRVS is another system that we have invested in. It's an open-source system for birth registration and a lot of other official registrations as well.
Being open source, it also builds trust, and it makes it possible to adjust to the needs of women and girls.
>> SUNITA GROTE: Fantastic. You've set the scene really nicely for the discussion points we'll get to later. We've got Annina joining us from OpenCRVS, and we'll hear more from the two product owners joining us online, so thank you for that.
Silje, I'm interested to hear from you what successful approaches you've seen in your region around digital innovations being designed with and for women and girls, and maybe specifically how that's influenced your viewpoint, your engagement, and your choices and approaches in the region.
>> SILJE DAHL: Thank you, Sunita, and thank you for inviting me. I think it's great that we have Sunita and Lisa in the room.
In South Africa, we work specifically on implementing the Swedish government's strategy for reproductive health and rights in Africa. We work with many different stakeholders: UN agencies, Civil Society, social businesses, entrepreneurs. I think globally we have about 140 partners, working specifically on digitization as well. It's a priority for us and for the Swedish government.
But I think it's very important that we support programs and partners that are bridging that digital gap and the gender divide as well, ensuring that women and girls can actually access digital tools that are useful for them.
That's one of the reasons why we have recently entered into a partnership with UNICEF for this specific FemTech initiative: because we know that entrepreneurs are the ones who will come up with the new ideas and tech that can enable economic growth and lift people out of poverty, which is our end goal.
And as you very well said, women are very underrepresented in this area, and we see a lot of digital products being developed that are not at all addressing the actual needs of women and girls, especially within SRHR, where we work a lot.
And we also know that in many rural areas, in many countries, women and girls don't have access to digital devices.
They don't have Internet. So how do we reach them with information?
So for us, it's important also to support initiatives that not only close this gap, but also work on providing correct data. They might work on domestic financing, for example, or on putting inclusive policies and legislation in place in their specific countries.
But most importantly, the solutions that we are supporting are locally adapted and actually are for the groups that we intend to work with. And also for the groups that normally are left behind like the LGBTQI community for example.
A few examples from our region of partners that we support. I think the previous session also touched on the backslide of digitization and the harm to children online. We have one partner working specifically on ending gender-based violence, and they have developed different tools for that work.
They work specifically on addressing technology-facilitated GBV, which is a new term I just learned, in the Southern Africa region. It's very interesting, because they also see that GBV has increased online and is posing a risk of normalizing violence. So they work with both Civil Society organizations and local governments to ensure that technology-facilitated GBV is incorporated into legislation and policies.
That's one example of how we can work with that in regard to women and girls. We also have another partner working on improving access to SRHR information in west and Central Africa, a region where it is very difficult to work on SRHR and where it's sometimes very controversial as well, so you need to find ways to share information in a way that is not overly controversial.
And we also know that youth are more vulnerable, for example, to HIV and other sexually transmitted diseases, and also to unwanted pregnancies.
With this app, they can gain access to information that is anonymous and safe, and they can find health facilities where they can get support.
And so far this app has reached about 9 million young people in that region. So just two examples on how we work on that.
>> SUNITA GROTE: Yeah, fantastic. Thank you. You called out specifically the ingenuity of entrepreneurs in trying new approaches. I'd like to turn now to our online speakers. We have Tawhida joining us from Bangladesh. She's a recipient of the UNICEF Venture Fund as part of the health cohort that we launched last year, and her team provides holistic well-being services in person and online for children and young adults.
They've developed an app and are looking at developing an open machine learning model to enhance detection and recommendations to parents.
So Tawhida, I'd love to dig a little deeper into an actual specific solution, a specific product, and hear from you how you've put the needs and rights of women and girls, like safety and data privacy, at the centre of how you've developed your product.
Over to you, please.
>> TAWHIDA SHIROPA: Thank you. Thank you very much, Sunita. Very good morning, very good afternoon from my side. Thank you for having me today.
First of all, I want to show my gratitude to the UNICEF team; I'm here because of that journey.
Now I'd like to talk more about the technology, the product, and the passion behind what we are building right now in Bangladesh.
So at Moner Bondu, we are developing the [?] for all women and girls, because as a women-led company we always consider women our first priority.
That applies to how we build our technology team and everything else; we always mention that.
So Moner Bondu is a mental health and well-being platform using AI and machine learning models and human safety approaches, including counseling, [?] assessments, and well-being techniques.
On girls' and women's safety: it's the first thing we have to make sure of. We started the conversation from the very [?] and built the [?], especially the app, which is for child well-being, using open source and very much aligned with the digital public goods.
So we keep those in mind and make sure it's co-designed. It's not something we built around only our own ideas and thoughts; thousands of [?] girls, not only from Moner Bondu [?] but even girls in remote areas, were taking our services every day.
So we wanted to build the app with them, to start with them. We talked for hours about their struggles and what they have gone through. They told us they are afraid of reporting abuse, especially within their community, or the gender-based violence they face, and even about hiding their anxiety.
[?] anxiety, other anxieties, and sometimes cell phone issues. And they shared that when they are going through these kinds of things, there is no private [?] where they can share them, because they think it might become public, or they will be bullied by their peers on social media or online platforms.
So we built it around what they are [?] and what we heard sitting with them. We always make sure of those things: we built an [?] assessment, designed a quick check-in space in the app, and made sure we chose female counselors, so they can reach out to us anytime through our platform.
Another thing we always wanted to make sure of is safety. For girls, safety is not only about encryption. Our app is end-to-end encrypted, but what we have seen and experienced is that for girls, safety is emotional safety.
So when girls fill out the well-being check-ins or any assessment, we make sure from our side that we prioritize their data security. Sometimes they [?], so they don't have access to mobile phones or smartphones.
So we take verbal consent from their parents. The parents are sometimes not that educated, so we also call them and discuss: these are the things, and this will be helpful for your child's well-being. And then we get the consent from them.
And of course we want to make sure the data and security go with that. Our back-end system has [?] in real time, so whenever we see anything gender-based or trauma-centric, our female counselors respond to the client immediately.
And of course we [?] all the sensitive escalation protocols; we always maintain those.
And Number 3, I want to mention privacy. When we developed our whole process, even the UX design, we always got feedback from the customers and included that feedback in our site and the app as well.
That means this is implemented with strict access control by our tech team, and we also maintain global standards. And it's true, as was mentioned in the first part, and I heard the conversation from our amazing panelists: AI models can be biased to some point, and that includes Large Language Models and vision models.
So we always want to make sure that this data is protected, that we have the [?] data, and then we can provide those things. And I think later we will discuss more on open source.
But what we have seen so far, deploying this to almost 10,000 students and adolescents, the majority of them adolescent girls in rural schools in the area, is a huge response from their side. And that will help us to build a better data processing system.
And of course with that, we want to make sure the whole protocol, all the sensitive things, are gender sensitized. We maintain a gender-sensitive framework. And we really want to get all those [?] from AI and technology so it can be more accessible for everyone.
And one thing I really want to make sure of from my side, and this is our vision: the app is not just an app, not just an upcoming technology. We want it to be a promise to all the young girls that they can feel safe, that they will be heard, and, most importantly, that whatever they have gone through, they don't need to go through it alone.
So this is something that we built from our side. And thank you to UNICEF for giving us great insights about digital public goods and open source. We want to make sure it is very cost effective, with low maintenance costs, and very transparent in what we are building right now.
Thank you very much.
>> SUNITA GROTE: Thanks so much, Tawhida, really good points I think around --
(Applause)
Thank you, you're getting applause in the room in case you can't hear it. Really good points about how encryption is a necessary part but in no way sufficient when we think about the inclusivity of the entire programme and making sure we reach women and girls and provide them the emotional safety you spoke to.
I'd like to dig a little deeper into an aspect that several speakers have touched on, which is creating digital solutions with digital public goods. If you're not familiar with the concept, there's a great booth outside by the Digital Public Goods Alliance where you can learn more and meet the partner organizations. There's also a high-level panel this afternoon that will bring together major players in the public goods space. That's my little plug.
But here we have actors that have chosen to develop in the open and open-source solutions.
Annina, I'm going to start with you because we haven't heard from you yet, to hear your perspective: what role does this approach and focus on open source actually play in improving the inclusivity and safety of solutions?
How does that approach enable us to build that trust and to actually operationalize this commitment that we have?
>> ANNINA WERSUN: Absolutely. Thank you so much, Sunita. Good morning, good afternoon to all of you. It's an honour to join the panel. I feel truly inspired as I so often do when listening to work that other people are doing in this area as well.
Thank you for the work you're doing. I'm from OpenCRVS, and we're a digital public good for civil registration and vital statistics. That's the registration of births and deaths, and the statistics derived from them.
Inclusivity is not really just a principle; it's something that we can demonstrate in practice. Like other digital public goods, OpenCRVS gives countries a chance to see what inclusive digital services can look like right out of the box.
So when a country starts to use OpenCRVS, they will see a preconfigured version of the system based on our default country, it's a fictitious country that we've created, and it supports a number of different vital events. So the forms that they see initially are designed specifically to spark important conversations.
So these countries can then subsequently configure the forms; they can make them work exactly as they need. That's the whole premise of OpenCRVS: it's configurable for a range of different country contexts.
But they can see out of the box what is possible. So what's an example of trying to spark these conversations?
Take birth registration, for example. In many countries, the father's details are either assumed or required. But OpenCRVS shows an alternative. And that is a birth registration form that does not mandate the father's information.
And that simple design choice opens the door to discussions even at the highest levels about how to create more inclusive services for single mothers, for survivors of gender-based violence, or for any situation that doesn't fit a narrow norm.
And we've seen in many countries, when this comes up, the reaction: why aren't the father's details required? Of course a woman would always be married. These conversations at the highest level really allow us to explore these assumptions and also to inform and educate those who may just not have access to this information.
We've had some really, really interesting, engaging discussions where people have come away with a different perspective.
It's a powerful example of how design can lead policy and how a digital public good can educate and influence across government just by showing what's possible.
We also use data to reinforce this idea. OpenCRVS includes dashboards specifically configured to highlight insights into the experiences of women and girls, insights that too often go unseen.
And civil registration is the only continuous source of population data. It really is so rich if you just ask it the right questions. Imagine if a government could pinpoint where young mothers were giving birth so it could direct maternity services to those areas.
Or if it could understand where and how women are dying, to target lifesaving interventions.
Out of the box, OpenCRVS comes with the option to visualize this, and countries can take it away if they don't want it, but more often than not they see this data and accept it. In fact, they might then bring in another ministry, for instance the Ministry of Women's Affairs in many countries, so that it can see this data too. And countries may explore who they want to allow access to this disaggregated data, so that, for instance, partners in country could use it to target more specific interventions.
And this, for us, is the power of inclusive design backed by meaningful data. It's not just about technology, and as Tawhida said, there's so much more to an implementation of OpenCRVS than just this design. But it certainly allows us to begin shifting systems and showing governments what good and inclusive design looks like. Thank you very much.
>> SUNITA GROTE: Thank you very much. We love that starting point: the conversation opens up opportunities and possibilities of what can be included beyond just a digital solution.
Tawhida, I'd love to briefly hear from you as well: how has that emphasis on open source in particular allowed you to put a focus on inclusivity and safety?
Tawhida, can you hear us?
>> TAWHIDA SHIROPA: Yeah, thank you.
So yeah, thank you very much, Sunita. The first thing, from our experience: we reach more than 2 million people, so we have, I think, thousands of people [?]
Committing to open source and the digital public goods has been fundamental to ensuring both inclusivity and safety when we're working with vulnerable communities: young girls, adolescent girls, marginalized communities, and women definitely.
One thing we wanted to make sure of with open source: when we started working with the open source model, it is very transparent, right? Our code is open. Of course, the data is encrypted; we don't want to show anyone's personal information. But the whole goal is that it can be open, and the communities and stakeholders, including the government, can see exactly how the systems work and how decisions are made.
Even the safeguarding policy we have adopted so far. One of the vital things we wanted to make sure of is how we deal with sensitive data: mental health and well-being data is among the most personal and sensitive data we deal with at this moment.
That's open source: one of the things we believe is that open source aligned with the digital public goods makes us very accountable to our customers, to the people, to the young girls. Maybe they don't understand data and security, but they understand their privacy, right?
So I think open source and the digital public goods really pushed us to be accountable to the girls who share their data with us, and to their parents and mothers, right?
And another thing is accountability, you know; that builds trust. Those are fundamental things when you are providing mental health services: trust and confidentiality. These are things I think we align with.
And another two things, of course: localization, and building tools that are meaningful for the market.
Sometimes we go into many communities, deploying and developing the app, and in many communities they don't have the problem [?], and some of the parents are low income and don't understand the data and everything.
So we contextualized all those things, sometimes in our language and sometimes in their language, in more local terms. We make sure they understand what we are taking from them, right?
So this whole digital public goods [?] aligns with their missions, I think, their statements. So we feel this is something very good and proper for us.
Another thing: when others access the system, local developers can [?] customize those things. It's also cost-effective, because they don't need to start from scratch on their side. Anywhere around the world, they can remodel what we developed and deploy it in their local context.
And of course there's this flexibility; I think that's one of the things that [?] digital public goods [?] to a human solution. That's one of the things we found very fascinating.
And Number 3, the last thing I want to mention: removing all the license fees. When we started [?], we suffered and struggled with license fees, because sometimes we had the credit but couldn't pay due to the rules and regulations of the Central Bank of Bangladesh. So we struggled a lot.
For the upcoming people who are very passionate about the tech industry, especially female entrepreneurs, I really want to cheer for them. So [?] they can take all these leverages from what we are building.
So this makes all those things more inclusive and, of course, safe, because, as I mentioned, accountability ensures safety and trust. That's why we're here. And of course it lowers risk for institutions: they don't need to spend money on that, and they can adopt, share, and scale the tool.
All those things are there; they just need to [?] them and ensure accountability and trust. And of course it's not about the [?]; it's a balance. The digital public goods have the values.
And they taught us how we can leverage all those good values and good [?] and incorporate them into our impactful business.
Those are the things we're doing right now, and we make sure that no one is left behind. We are accountable to everyone, and I think open source and digital public goods are the two fundamental things that can ensure that. And yes, we are ready to scale those things, in our programme and in our app, across the world.
Thank you very much.
>> SUNITA GROTE: Thank you so much, Tawhida. I think you touched nicely on the different benefits we see from open source, from a product design and development perspective, but also from a business model perspective: having to tackle questions around fees and licensing, and how that model enables others to pick up your work, adapt it, and scale it more easily. So thank you.
The other piece that I appreciate is that often we associate digital public goods just with open-sourceness. But you nicely brought to our attention how the digital public goods standard is also about protection, safety, and human rights, as well as access and inclusivity. That is really something many of us are trying to hold ourselves to, in order to ensure we are respecting human rights in our digital efforts. Thanks so much for highlighting those different elements.
Lisa, I'd like to come to you on the same question, probably from a different perspective, though I don't know whether you've built and coded open-source products yourself.
I'd just like you to reflect a little more with us on the value you've seen in your focus on digital public goods and open source, and how that's been a conduit and a facilitator for you.
>> LISA SIVERTSEN: Thank you so much. It's really interesting to hear from OpenCRVS, but also entrepreneurs that are building these systems. And I think you mentioned it already, how these systems make it possible to build accountability and trust, because they are transparent, they are open for external audits and they are also adaptable.
And from our side, we've actually developed our own open-source data policy that we are trying to use across the different areas we work in.
And there is a lot of resistance, you know, there's a lot of resistance from different companies that are, you know, selling systems that are not open source.
So this is something that we are really trying to push across areas such as education and health, but also in our work on environment and climate. It's challenging, but I think it's also crucial. It's the only way forward, I think, to build these accountable and open systems.
It's also a lot about costs, because open standards make it possible for different systems to talk to each other and work together. I think we've all been frustrated trying to access different digital systems that don't work together. It has a lot of cost; it's very challenging for everyone.
I like the term that Tawhida was using, emotional safety. Giving up your data and engaging with digital systems, within national ID systems or health and education, involves a lot of very sensitive data, and you need to have the trust that it's not going to be misused by someone at the other end.
And then, of course, open source systems do have that potential for being more adaptive to the different kinds of users. Also marginalized parts of the population in any country.
And we really need that to be able to continue to work against those gaps.
>> SUNITA GROTE: Sure. Sure. I think some of those privacy concerns are particularly relevant to those who tend to be more marginalized and excluded; it's of even higher importance when we work with those sorts of communities.
So as we move towards the last piece of our discussion, I'm going to ask each of the panelists to reflect a little on the role of stakeholders that are not present on this panel,
and what their calls or reflections might be. Before we do that, I wanted to turn specifically to Silje. As part of our Venture Fund, we've started exploring how we can leverage different capital and financing capabilities, beyond traditional funding and grants, to actually move capital flows more towards inclusive technology.
I'd love to hear a little from you on Sida's approach and what you see happening in that space.
>> SILJE DAHL: Thank you, Sunita, and thank you to all the panelists. Very interesting. What you touched upon now is so important. I think we have all seen the changes in the donor landscape over the past year, and it has of course been challenging, both for donors and for the partners on the other side, to find financing for the work they do.
For the Swedish government, it's a very high priority that Swedish development cooperation should focus on leveraging funding from other kinds of sources, and also on supporting our partners in doing that.
We believe that part of the additional financial flows to long-term solutions that will empower women, girls, and youth needs to come from elsewhere, for example from private investments, and also that our partners shouldn't be so donor dependent, to be honest.
And in doing that, I think we as donors have a very important responsibility as well. We know that today there are very limited opportunities for private actors to do business in fragile contexts or local markets; they have very limited risk absorption capacity, for example.
So how can we as donors support them?
So we have different financial models that have been quite successful, I would say, and one of them, for example, is a guarantee, where we help reduce the risk to a lender. For example, if a new start-up, an organization working on health and rights for women in a country in Africa, wants to expand its business, enter new markets, and sell products in markets it hasn't been in before, the local bank might be hesitant, saying it's too big a risk.
Then Sida can cover that potential risk for the bank and enable the organization to grow and start its business. That's something we've done a lot in other sectors such as energy, transport, and environment. And this year we will start doing it in health and SRHR. We're very excited about that.
It's a very successful model for trying to find other kinds of funding.
Another model we use is mobilising capital. That's where Sida has a very important catalytic role in connecting different partners with each other. This is something we've been working on for many, many years, but it's also a way of creating sustainability for our partners, and less dependency on us.
It also pushes them to leverage new money. What we do is say: if you can find X amount of money from other funders, we will match it. And I think that's a model that has worked really well. Especially now, when we want to expand into fem tech, work more closely on that, and work more on new kinds of partnerships, I think it's very important that we look at different financial models as well, to create more sustainability for our partners. That, I think, should be an end goal: government donors should account for less than the other kinds of financing flows coming in.
>> SUNITA GROTE: Thanks so much, Silje. I think there's a long way to go to close the financing gap, so I appreciate those efforts. It's crazy when you think about how unequal that distribution is, and then, as I spoke about earlier, the small proportion that goes to female-founded businesses. If you put those on top of each other, you end up with a very, very skewed picture of where private capital is currently flowing.
On the UNICEF Venture Fund side, we've seen similar experiences. We're excited about seeing how capital can be catalytic, how it can de-risk and showcase that those risks can be managed and addressed, what those markets actually have to offer, and how competitive they are from a risk perspective. So we're looking forward to having conversations with private investors to see how they can support some of the aims we've discussed here today.
I'll turn my attention to our two speakers online, and Lisa, I'll come back to you at the end to reflect a bit more on the question as a final call to action: who else should be at this table to move from what are still fairly niche approaches to shifting entire systems, and what role do you think they need to play?
Annina, I'll start with you.
>> ANNINA WERSUN: Thanks so much. I think we could probably talk for a long time just on this topic.
But I'm going to build, if I may, on what Silje has been talking about, and I am also going to call on the donors. We're incredibly lucky that Norad, our donor, sees the importance of funding the core product. As a digital public good, that's something difficult to find: funding that's very flexible and that understands that we need to manage and maintain a digital product that is ultimately supporting critical government infrastructure.
And not tying that to specific implementations, for instance.
But to build further on Silje's point, the challenge we have as digital public goods is to be able to invest in a few different things. One is absolutely to become sustainable.
So we have to think about how we can generate revenue. In OpenCRVS we're spending a significant amount of time on this, looking at how, in a market that thinks of us and expects us to be free forever, we can change that and start charging for our services.
And in fact, several countries and partners have seen the value that we can bring to the table, but we do have to start operating more like a business. Sometimes that can be very uncomfortable for partners working in our domain, which is totally understandable.
In the same way, to become sustainable, we need to think about creative ways of generating revenue. Just as for-profit companies, well, most of them, invest in research and development, we also want to invest in research and development. In fact, we would love to establish a research and development branch specifically for women and girls, because we truly believe there is so much potential to unlock, both in their experiences, from a protection perspective, and in the economy. We have to recognise the value that women bring to the economy.
And obviously civil registration touches on all these different life events, so there are so many opportunities to protect women and girls and prevent them from experiencing a life that doesn't allow them to become active members of society and of the economy.
But we do need donors to be able to invest in that. In this really difficult donor landscape, as Silje mentioned, we ask that donors take a risk and support digital public goods up-front. And our promise to donors, in the conversations we're having now, is that in five years' time we want to be self-sustainable.
But in order to get there, we need up-front investment. We need to invest in R&D, in our digital development strategies, and in trying out different business services. It's really not traditional. It's different. It's not necessarily what our industry is used to.
But we're certainly excited about that future, because we want to become self-sustaining. We believe the value of OpenCRVS is being realized: we're working with eight countries, and there are many more lined up to explore it.
And if we think about the potential of, for instance, preventing child marriage through understanding the age of children, or having data to be able to actively take steps to do that, that's something we want to explore more within our research capacity. Just think about the long-term effects that could have on women and girls, on the economy, and for businesses around the world as well.
So really, a call to donors. There are always immediate challenges and problems, and the world we live in is unfortunately experiencing too many of them at this time. But we truly believe that with up-front investment we can become sustainable, and we think digital public goods can do that and bring more positive outcomes for people around the world.
Thank you very much.
>> SUNITA GROTE: Thank you very much, Annina.
Tawhida.
>> TAWHIDA SHIROPA: I think this is the question I love to answer. I want to start with change, with the mindset. First we need to change the mindset of the stakeholders and, to some extent, the investors. I have met many investors who are not convinced about models like what we're building; they want to see revenue from Day 1, right?
But yeah, throughout the process we learned how we can generate revenue. And right now we're generating revenue, we're on the cash-positive side, and we're profitable as a start-up.
But I think this journey was quite difficult, because access to finance as an entrepreneur, and I want to mention especially as a woman entrepreneur, is always very difficult, particularly in the banking system. In Bangladesh, it's difficult to access the [?], especially through the banking system.
And we need investors who really value the impact alongside the innovation, right?
And that's one of the things I really wanted to mention. Because, as Sunita mentioned in the whole conversation, only 2% -- less than 2% -- of VC funding goes to women-led start-ups, right?
So I think we really want to change that picture. And I think we have all the capacity and the capability, as Annina mentioned, we have the capacity, the capability and the resources; we just need to [?]. And because we have the scalability option, we have the [?], we ensure inclusivity, and we want to make sure we provide proper service to our young girls and to underserved people.
And yes, this is something I did want to bring to this table: I think the VC mindset should be slightly changed. Maybe that can make a bigger difference and a huge impact in this world.
And of course, ensuring all of this through an open-source model. We are a good example: it can be profitable, it can generate revenue. You just need to trust us a bit.
And thank you very much.
>> SUNITA GROTE: Thank you, Tawhida.
Lisa.
>> LISA SIVERTSEN: Thank you. Thank you. No, I think we definitely need to continue to develop those public-private partnerships in order to mobilise funding. We need to engage emerging donors more.
It's great to have IKEA and others on board, but we need more participation from the private sector. But the tools and the systems will not really meet their goals unless they also engage the users, and a diversity of users.
Also finding ways to engage marginalized communities and people, to make sure that the systems actually work for everyone. One last point from me: we need to recognise that digital public infrastructure and digital public goods are essential infrastructure for any society.
Just like roads or electricity, we need those systems to be able to build inclusive digital societies.
So I think that's a recognition as well that we need to continue to champion across any geographical context.
>> SUNITA GROTE: Thank you so much to all of the panelists, including in the first segment, for all of your openness and anecdotes and reflections today. I think it was a very rich discussion.
And from my side, I hope that as we go about our business here at the forum and back home, this discussion gave each of you one specific action point that you can take home with you as you design or build products. We heard very much about the need to be deliberate in design and research to ensure diversity: deliberate in how products are designed, in what code we choose, what models we choose, what datasets we choose.
And also how important it is to be able to look under the hood of a product. To be able to create actual safety and actual data privacy, but also that emotional safety and empowerment, so that we don't just have passive users but empowered, informed users who can engage with the digital solutions we're putting out there in the market and can question us.
We heard from Tawhida about open-source approaches and how they raise the bar on accountability.
And maybe not surprisingly, we're all facing unprecedented challenges when it comes to the financial landscape. We heard so much about how and where each dollar, each krone and euro is put, and to what extent funders are builders and owners of what's put out there today in the digital landscape.
I encourage you to approach the panelists in the room for any further discussion you might be interested in. Thank you very much for choosing us and for spending your precious time listening to this discussion.
I for one really enjoyed the dialogue, and maybe at the next IGF I'll be facilitating a panel of four men talking about the importance of women's health and looking at inclusive solutions.
That's my hope as I walk away from this panel today. So thank you all and enjoy the rest of your day.
(Applause)