The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> MODERATOR: Hello! Morning, afternoon, evening. We're facing some technical issues, but we're starting soon. Okay. We beg your pardon, and in a few minutes, we are starting.
>> Hello, I'm actually using a different Zoom account. Is it possible to change my name?
>> You can change your name. When you see your picture, click on the three dots and the rename option will appear, and you can change your name.
>> I don't have that setting for some reason.
>> Same for me as well, you're not the only one.
>> I'll ask tech support how or if we can do it a different way. Okay. Just a few minutes, please. I'm sorry. I'm sorry, guys, that we are running a little bit late. It's because of technical issues. Okay. But everybody is ready to start. Just a little issue. Okay. Just a little more time and we will start. I'm sorry again.
>> MODERATOR: Hi, guys. I think we're ready. Okay. Let's start. Hello again, all. Good morning, good afternoon, good evening! Welcome. I'm sorry again about the little delay, but now I think we are ready. We're starting the workshop addressing children's privacy and edtech. I'm Maria from Alana Institute, an NGO in Brazil that aims to honor children, and it's a pleasure to be here. Moderating this panel with me are Joao Coelho, keeping up with the discussion online, and Julia Mendonca from the data privacy research organization that is co-hosting this workshop with Alana, and here on site we have Thais, also from Alana.
To warm up the discussion we're proposing, I would like to present some contextual elements, starting with the predominant business model in digital services, which has, over the past years, been based on massive data collection for profiling users and targeting them with personalized content. A recent report launched by Human Rights Watch points out that most edtechs and platforms are inserted in this data economy model, and many collect student data and sell it to third parties, such as data brokers, exploiting personal data against children's best interests. The edtech industry was already growing, and the COVID-19 pandemic pushed it even further, as it forced governments and schools around the world to adopt remote learning strategies for all students, including children. At the same time, children's rights standards recognize their right to full education and their right to having their data handled according to their best interests. Students' data is, however, largely unregulated in the Global South, where data protection regulations are often still lacking. Stemming from this scenario, we need to discuss how the current predominant edtech model affects student privacy, as well as alternatives that are more respectful of their rights.
So, to kick things off, I will give the floor to Hye Jung Han, the researcher responsible for the important Human Rights Watch report named "How Dare They Peep into My Private Life?" Hye, thank you so much for being here today. Could you give us an overview of the edtech industry and this reality of privacy invasion? Please, you have the floor and 10 minutes.
>> HYE JUNG HAN: Wonderful. A quick mic check: can everyone hear and see me okay? Thumbs up? Wonderful. Hi, everyone. It really is an honor and pleasure to be here with you today. As you know, over the past few months, and soon in the next few months, millions of students have returned and will return to school in a new academic year, and they're largely using technology adopted during the pandemic.
So, just a few months ago, we at Human Rights Watch published a blockbuster investigation on these education technologies, otherwise known as edtech, endorsed by 49 governments for kids to use during the pandemic. Because we looked at 49 of the world's most populous countries, our research ended up covering the majority of kids in the world who had some access to the Internet and a device.
And the other thing that I'll quickly note is that I began this research deliberately a year after the pandemic first started, so in March 2021. The reason for this was that, you know, we weren't looking to penalize governments who were trying to make the best decisions in the worst circumstances. The idea here was that all governments had a year of experience with delivering online learning to kids, and we were really seeing if they were keeping kids safe in online classrooms.
And then to give you a sense of what this work entailed: I cracked open the code of 163 edtech websites and apps and investigated 290 companies around the globe, and I was really trying to answer three questions. The first was to figure out what kind of data these products were collecting about kids. The second, how they were collecting it. And the third was to see who they were sending kids' data to.
So, what I uncovered really was shocking, even to me. In the rush to connect kids to online classrooms, every government except for one authorized the use of at least one online learning product that surveilled children online, outside of school hours and deep into their private lives. For the first time, we now have technical evidence that the majority of these online learning products used around the world harvested data on who children are, where they are, what they're doing both inside and outside of their online classrooms, who their family and friends are, and what kinds of devices their families could afford for them to use.
Some of these products actually digitally fingerprinted children in ways that were impossible to avoid or get rid of without throwing the child's device away in the trash. And this last point really bears repeating: these online learning websites and apps, again, designed for kids and for online learning, were using such sophisticated tracking techniques that, even if you were the world's foremost information security expert and you wanted to protect your child, there was really no way to protect yourself or your child from this kind of surveillance. That's how disproportionate and insidious this was.
So, then to get to the third question of who they were sending kids' data to: most of the products sent kids' personal and sensitive information to advertising tech companies. These companies analyze this kind of data to guess at who a child might be, predict what they might do next, and crucially, how they might be influenced.
These companies used children's data, extracted from them in educational settings, to target them with personalized content and ads that track and follow them across the Internet. They not only distorted kids' online experiences, but also risked influencing their opinions and beliefs at a time when they might be easily influenced.
So let me give you an example, a concrete example, of what all of this means. In the course of my research, I had the amazing opportunity to work with Rodun, the bubbliest 9-year-old that you'll ever meet, who lives in Istanbul in Turkiye with his family. With his and his mom's consent, I got to follow him during a regular school day online. I watched him as he logged into class at the beginning of his day and waved hi to his teacher and his friends. I saw him fall asleep while watching a math video, try to upload his history homework on a social media platform, and just go throughout his day.
At the same time, I saw what was happening behind the scenes. Within milliseconds of him logging into his platform to say hi to his teacher, swarms of trackers immediately hooked on to him and began recording every single tiny bit of his behavior. The platform that he used to upload his homework started collecting his geolocation data. To put that into simple terms, this app was collecting Rodun's precise coordinates, and taken over time (if you remember, all of us were under lockdown in various countries at home, and kids were not traveling outside of their homes), this app could not only figure out where he lived but actually where he spent most of his time during the day, which in his case was his family's living room. That's how sensitive and precise the data these online learning products were collecting about this child was.
So, now that I've scared everyone with some of the findings of this work: what was particular about this piece of research was that it showed, for the first time, that children were really forced to pay for their education with their privacy, and children and parents were largely kept in the dark. You know, everyone operated in a kind of blind faith that governments were choosing products that were safe for kids to use, but it actually turns out governments themselves violated kids' rights. 39 governments built products, websites, and apps themselves that violated kids' rights in some way. Some governments made it compulsory for students and teachers to use their product, which then made it impossible for kids to protect themselves by opting for an alternative, even if they knew what was going on.
I think that's also the critical point here: these products did not allow students to decline to be tracked. Most of this monitoring happened secretly, without the child or the family knowing or consenting. To make matters even worse, all of this took place in educational settings where kids and parents couldn't give meaningful consent, because in order to attend school, or to be marked as attending school that day, you had to use the edtech product or website or app that the school told you to use. It was impossible for kids to opt out of surveillance without opting out of school and giving up on learning altogether during the pandemic.
But there is a way out of all of this. It is indeed possible to deliver online learning to kids without compelling them to give up their privacy, and I think we'll hear a little bit about that from the other panelists in this session. And I will conclude by saying that, as a result of this work, we are calling on all countries and governments to pass child data protection laws, because it's not the responsibility of a child or parent or teacher to know how to protect themselves; it's the responsibility of governments to hold both governments and companies accountable for how they handle kids' privacy, and ultimately to protect kids online.
>> JOAO COELHO: You know, this research is huge and encompassing, and it's amazing to hear your thoughts, so thank you again for joining us and giving us insight on it. It really goes to show this is a global issue we all need to address, and there is no better place to address it than the IGF. Thank you very much, again. I immediately give the floor to the next speaker, who is joining us online as well: Professor Rodolfo Avelino. I hope you can hear us, and if everything is fine, the floor is yours. If you could give a little insight, considering your technical background, on how those business models and tracking mechanisms work on edtech apps, that would be really amazing. Thank you for joining us, I know it's still really early in Brazil, so the floor is yours. Thank you.
>> RODOLFO AVELINO: Hello, everyone. I will share my screen. One minute, please. I'm sorry I have a problem with sharing my screen.
Maybe I need to restart the Zoom. Is it working?
>> We are seeing your screen. Yeah, we can see it.
>> RODOLFO AVELINO: You see my presentation?
>> MODERATOR: Yes.
>> RODOLFO AVELINO: I'm sorry. First of all, I would like to apologize for any mistakes in my English. I have been studying to improve it. I appreciate the invitation to participate in this event. I am Rodolfo Avelino, a professor and specialist in cybersecurity, and I carry out research related to privacy and surveillance. For the past few years I have been researching web behavior tracking technologies and the increasing reliance on personal data to sustain the data-driven economy. Seeking to understand this complex phenomenon, I have been developing a theory that starts from the concept of data colonialism by (?) and advances toward a broader theory of digital colonialism, very much in line with Michael. My definition is this: digital colonialism can be analyzed from the technological base of the digital ecosystem, that is, electronic devices, network protocols, the infrastructure of cloud computing, and programming languages. This system is the path that allows the Internet to communicate, transfer, and process personal data across systems and services.
It is intrinsically connected with the big techs' monopoly, which imposes technological and service standards on people. It is also possible to evaluate this process through the rapid growth of asymmetrical power relations, concentrated especially in the United States and China.
This process intensified in the 1990s, mainly with the popularization of the personal computer, the Internet, and cell phones, which were responsible for expanding digital colonialism around the world. During this period, Silicon Valley also emerged as the headquarters of the main companies hosting the largest personal databases in the world. Silicon Valley, located in California, was made possible thanks to public funding of research and infrastructure from the U.S. government. The traces produced by cyber-mediated transactions sustain most of the big techs' revenues.
In this business model, the information collected about our behaviors is the fundamental raw material for algorithms to predict what we will do in a series of situations. For this economy to keep growing, corporations need to expand the extraction of their raw material, that is, data collection.
To do this, large companies such as Google, Amazon, and Facebook created strategies to expand into new services, most of which are free: users do not need to pay cash to use them.
However, they do need to allow companies to collect and use their data. It is in this context that Google, for example, started to develop products and services for the most diverse publics. Hence we have a browser, a search engine, email, cloud storage, GPS, an online translator, video platforms, and music streaming, among others.
The more services and products are available, the more data Google is able to collect and use in its business strategies, increasing its power over competitors and its profitability. The big techs' ambitions regarding personal data indicate that it is the main field for the expansion of their business, based on growth and exploration centered on data capitalism. They do not restrict themselves to their core business and instead seek to increasingly expand the extraction of data from different sources.
A benchmark example: Google was originally founded as a search engine company in 1998 and has historically looked for new acquisitions to expand its data collection capabilities.
In 2015, the company created a holding company called Alphabet, and since then, in addition to controlling Google, it has been diversifying its business far beyond search. Alphabet has become one of the largest technology conglomerates in the world.
The table I'm showing presents some acquisitions by Alphabet in the last decades. It is possible to observe that it has been expanding its business to get different types of data. For example, Nest, whose business is smart home products, was acquired by Google in 2014. Another example is DoubleClick, an ad serving and management solutions company, which Google acquired in 2008.
Other acquisitions include business intelligence and data analysis software, in 2019; Waze, in mobile navigation; and Fitbit, with fitness devices and apps, which Google acquired in 2021.
In this table that I developed, I tried to compare the characteristics of the layers of the TCP/IP model, the layers of Internet governance, the service models of cloud computing, and finally the layers in which the main platforms operate. It is possible to observe that Amazon and Google are present in all layers of the Internet.
To conclude: student data flows under the same logic as other types of data that can enrich the large databases of these corporations. This is illustrated by a survey carried out in South America by the University of Para, which analyzed which universities use email services from big techs. More than 78% of universities in South America hosted their emails on big tech services.
Finally, most of them are hosted on Google services, which shows how digital colonialism has been concentrated in this company. That's it. Thank you very much.
>> JOAO COELHO: Thank you so much for bringing this perspective of digital colonialism to our discussion. It's really important that we understand the power dynamics between the north and the Global South to really get a full grasp of the debate. I thank you again. And we're going straight ahead to our next speaker, who is joining us on site: Marina Meira, from Data Privacy Brazil Research Association. Thank you so much for co-hosting and organizing this panel with us. It's a pleasure to have you here, as always. The floor is yours. Thank you.
>> MARINA MEIRA: Yeah. Hi, everyone. Thank you, Joao. It's always a pleasure to be with you. Good morning, good night. I think from Brazil, it's good night. Hello, everyone. So, as Joao said, I'm Marina Meira, from Data Privacy Brazil Research Association, a nonprofit based in Brazil that, as the name says, promotes the protection of personal data and other fundamental rights in the face of the emergence of new technologies, social inequalities, and power asymmetries.
I'm a lawyer by training, and I'm also the coordinator of the working group on data and childhood of the Commission for the Defense of Children's Rights of the São Paulo division of the Brazilian Bar Association. I've been working on the defense of children's digital rights, including before the edtech industry, for quite a while, and my work in the last years has been based on two premises. The first one is the need to center all actions around the concept of the best interests of the child, and I'll get more into that soon.
The second premise arises from the fact that children cannot be treated as a monolithic group. In my case, that means going deep into the particularities of the Latin American context and how that affects the relationship between children and technologies in general, and between children and the edtech industry in particular.
And when I talk about this context, I want to highlight how Latin America, as part of the Global South, is marked by profound inequalities. Access to the Internet and to technologies is still far from universal. There is a large number of mothers, fathers, and families who have not had access to digital literacy, or who have several jobs in order to sustain their homes and therefore cannot support their children while they use devices and digital platforms, educational or not.
In the case of Brazil, it is a context in which zero-rating policies are in place in favor of big techs, which allows WhatsApp and Facebook, for example, to become largely widespread means of communication, also among children from a very early age.
Unfortunately, my home country's federal government also conducted murderous policies during the pandemic, which actively left thousands of children as orphans, increased the child learning deficit, and impoverished the entire population. At the same time, in this emergency pandemic scenario, we saw big techs reaching out to education ministries and secretariats in Brazil to offer their products for free. Well, as we've been talking about today, and as Han and Rodolfo presented, the "for free" serves their interests, I mean the industry's, in many ways.
Keeping that in mind, I want to start addressing, as one of the lawyers of the panel, the question of why exploiting children's and students' data commercially is a problem. It is problematic as a whole, but especially in the educational context. To start this conversation, I think, and hope, we can all agree that we want all children to fully access education, and we want this education to be actually emancipating for them.
So, first, this commercial exploitation has a purpose issue. The reason why we use technologies in school is supposed to be, in my view, to support the development of an emancipating education for all students. This goal is in no way related to mass surveillance or to the use of student data for targeted advertising. On the contrary, there is a clear purpose deviation. These opaque and, after all, for-profit data handling practices are a sign that children's right to education is being captured by harmful technologies, as they reinforce children's insertion into the attention and data economy in general.
This pairing of the edtech industry with the attention economy and the targeted advertising industry, as is clear from Han's research, has been promoting a clear violation of students' rights to privacy and to the protection of personal data as well. On top of that, as Rodolfo also started mentioning, it impairs children's development, and we're still unaware of its present or future, individual or collective impacts. How is that so?
So, children, who are going through a developmental stage, need to be able to make mistakes and learn from those mistakes, and also to experiment throughout this development in order to understand and mold their own personalities. I mean, who hasn't had a phase when they were younger that they now look back at and go, oh my God, what was I doing?
The logic behind the data-based business model and the attention economy in general, which is totally linked to surveillance and the commercial use of data, is mostly based on profiling techniques that aggregate people with similar interests into groups and then constantly target these people with the type of content that is understood to be interesting for them. By content here I mean ads, which are really problematic when we think of children being targeted, as children are vulnerable to ads in general, but I also mean content in general, as Rodolfo mentioned. After all, the idea of the data-based business model that is adopted by most digital platforms, and that unfortunately is being fed by the edtech industry, is not only to sell advertising, but also to make the platforms display content that seems interesting, and therefore keep people online for as long as possible, so that more data is collected about them and the targeted content can get even more precise.
And when it comes to children in particular, this purpose of keeping them online and on their devices for as long as possible, so that the attention economy cycle can keep on turning, relates to a problem of addiction to screens. As children are beings going through developmental stages, learning to deal with desires and instincts, they might have a tougher time, and I'm saying tougher because we adults already have a tough time sometimes, resisting those patterns and addictive platform traits. That can directly impact children's mental health, and I mean, we do know that the content of most digital platforms is not appropriate for children. It also impacts children's physical health: it can be related to eyesight problems or to a more sedentary lifestyle, for example.
The problem is still much broader than this, and the need for children to experiment with their personalities is completely undermined by the attention economy and its profiling and data aggregation techniques. In the end, what we see today is that the content that reaches children online, and that will therefore influence the shaping of their personalities, is to some extent being dictated by private and commercial interests. So, besides behavioral manipulation, this aggregation and specific content targeting can also reinforce discrimination. For example, we know that the advertising industry is still sexist and reproduces gender stereotypes. We see a lot of this in Brazil, for example, in ads targeted at children: when they're aimed at boys, it's cars and adventurous toys; when they're aimed at girls, we see, for example, little kitchens and stoves. That is also an issue related to this whole industry.
And the problem is, I mean, there are a lot of problems with that, but we're letting the edtech industry be a part of this. And if that's not contrary to what an emancipating education is, then I don't know what is. And like Han, I don't want to wrap up in a pessimistic way, so I want to raise some possible paths on how to move forward and tackle this craziness, especially from a regulatory point of view.
First of all, I understand that in order to face this problematic current scenario of the edtech industry, we need to understand that the protection of children's rights will only actually be achieved once it's shared among all of society. Much is often said about families being responsible for educating children to use digital devices and services, but we have to remember those inequalities I mentioned at first: families who don't have access to the Internet, families who haven't had digital literacy training, families devastated in the pandemic. Families, of course, should support children in the use of edtech apps as much as possible, but that cannot be all. We cannot stop the talk there, also because depriving children of education is not an option, as Han also mentioned.
So, we need to think about how states choose the edtech tools to be adopted in public education, and how schools themselves choose the tools to be adopted, especially in private sector education.
And we need to address the responsibility of the private sector as a whole, by which I mean the edtech companies themselves, but also the companies from other sectors which are buying students' data from them and profiting from student data in general.
That said, when addressing the responsibility of states, schools, and the private sector, we need to bring the concept of the best interests of the child to the table. And that's not me, Marina, saying it; it's the UN Convention on the Rights of the Child, which is the most ratified international treaty in the whole world. It says clearly that all actions that directly or potentially affect children must be undertaken in order to fulfill their best interests.
And what does that mean in practical terms? The UN Committee on the Rights of the Child tells us in General Comment 14, which explains that the best interest of the child is a threefold concept. First of all, it's a substantive right: children have the right to have their best interests fulfilled and prevail over any other interests. And if the edtech industry is favoring its profits over children's rights to an emancipating education, to information, to self-determination, and to free development, children's best interests are not being fulfilled.
Second, the concept is a principle to be invoked whenever a legal provision is open to more than one interpretation: necessarily, the interpretation that most effectively serves the child's best interests should be chosen.
Last but not least, children's best interests also unfold as a rule of procedure. Whenever a decision is to be made that will affect children, the decision-making process must include an evaluation of the possible impact of that decision on the children concerned. When it comes to the edtech industry, that can be translated into an obligation for states, schools, and the edtech industry itself to conduct impact assessments before developing or deploying technologies that will handle students' data, in order to assure that risks to children's rights will be mitigated and their best interests fulfilled. And that will only happen if the impact assessment is conducted with a strict methodology, with the participation of all of those involved, including children themselves. And if someone wants to go deeper into this conversation, please come find me, because we're developing a study on that at Data Privacy Brazil.
So, to wrap it up: this best interests framework is international, and we should also seek instruments that assure it, but that's not all. We also need local regulation to be put in place for the edtech industry, which is mostly unregulated in any specific way throughout the world. And we need regulations that impose concrete rules to assure student protection, accountability for the industry, children's best interests, of course, and regulation that directly dialogues with local peculiarities. That's it for the first round. Thank you very much.
>> JOAO COELHO: Thank you so much, Marina, for explaining so beautifully how these business models collide with children's best interests. Like you said, the best interest of the child is one of the main guiding principles of the UN Convention on the Rights of the Child, so it is absolutely imperative that we treat it as a serious, primary consideration rather than an empty abstract concept, which is what we see in practice a lot of the time when we're talking about regulation. So, thank you once again.
Last but certainly not least, we have Nidhi Ramesh, who is a youth ambassador and also the creator of the Right Angle podcast, which I strongly recommend everyone check out, and who is here representing children and teenagers on the panel. Nidhi, I would like to reinforce how important it is to have you with us. There is no way to conduct this discussion without listening to the people most affected by these issues. Thank you so, so, so much for joining us once again, and I would love to hear your perspective on the commercial exploitation of children's data and your opinions on how governments can take children's views into account when regulating those platforms and business models. So, thank you, and the floor is yours.
>> NIDHI RAMESH: Thank you so much for having me. So, I just want to reintroduce myself. Hi, I'm Nidhi, a 14-year-old student. I'm from India, and I currently live and study in Malaysia and attend school here. In fact, I've just come home from school at half past 4:00 in the afternoon. It's a great honor to represent the youth on this Forum. Thank you so much for having me. I really appreciate it.
This is such an important topic, especially from a teenager's perspective. As we've heard from all the others, it's obvious that young people extensively use technology in their day-to-day lives. Nowadays, online platforms and digital tools are available for anything and everything, whether it's for education, music, videos, socializing, playing games, or anything else that we can think of. We are a generation growing up in this digitally interconnected world, and we are becoming ever more dependent on it. That is why safety and privacy online are so important, and I'm particularly delighted that forums like these allow us children to participate and to be heard on the topics that impact us the most.
So, thank you, Alana Institute, for having me here. Earlier in the year, I had the privilege of being a youth ambassador on an online panel hosted by the 5Rights Foundation in London, where I was part of an event discussing and launching a children's online safety toolkit for governments and institutions. The opportunity to share ideas on how we could make the online world safe for kids with leaders from the African Union and the European Union was indeed eye opening, and it was also a privilege to be on the same stage with Prince Harry and listen to his views on how everyone can play a role in keeping children safe online.
This IGF Forum, and this topic in particular on children's data privacy while using educational apps, is closely aligned with the overall work on children's online safety that I was engaged with.
Now, before I begin, I just want to say that I'm definitely not a professional, and what I say is based on my own opinions. Of course, as someone who just turned 14 last month, I should admit that my friends and I use digital platforms extensively. Everywhere I go, I see children on mobiles, iPads, and other devices. The pandemic, I believe, has especially accelerated this, and online access has become not just common but an essential part of whatever we do daily, inside and outside of the classroom.
For example, I use an online platform called Anchor to host my podcast, the Right Angle, and the beauty of the online tool is that it reaches listeners in 40 countries and is easy to use. I've been using it for three years now. I have also written two books, and here again, when I wanted to publish them, the first thing that came to my mind was publishing them online. Getting them out as digital e‑books on a platform like Amazon Kindle was the easiest, fastest, and cheapest option I could think of. My books are now available in every corner of the world for people to read. I also have my own website that I use to publish my work online. We as a generation also use social media, educational websites, language apps on mobile phones, et cetera. We also use digital platforms like Google Classroom to access school lessons, attend classes, share notes, do homework, et cetera. Using online tools is a key part of who we are and how we go about our day‑to‑day lives, and this, I'm sure, is the case with most kids around the world.
Of course, all of this sounds great and very useful, but what we have increasingly realized is that these very companies who provide such wonderful solutions are providing them purely for their own commercial gain. Just imagine: of all the tools I talked about, Anchor, Kindle, Wix, Google, and social media sites, almost all are seemingly free to use and also require very little parental permission online, so there must be a tradeoff in terms of what they are getting from us children while providing us with these services for free. It is our data that is being recorded and used for further commercial gain, to an extent that has no limits and can sometimes be extremely dangerous, especially when it comes to data that is private and personal. These so‑called online tools and educational apps can capture children's locations, personal details, usage patterns, eating habits, and data on their devices that may be totally unrelated to what the tool is even being used for. Anything we click is being recorded: our pictures, conversations, texts, everything. The question really is how these organizations are utilizing our digital footprints and the data captured while we use their apps and platforms.
Technology companies do argue that they have sought permission through their elaborate terms and conditions, but practically, who reads that 10‑page‑long list of T&Cs? Most children aren't aware of the implications of what they read and accept in the terms and conditions. I can't remember the last time I read and understood any T&Cs and the consequences of accepting them. We see even adults not reading these, let alone children. In a way, forcing users to accept the T&Cs is the best excuse for companies to capture all of what they want. There are instances where data is leaked or even sold to third‑party companies to sell related products, send spam emails, or use the data for purposes totally unrelated to what was initially intended when users subscribed to those apps.
The data could haunt children many years later. It could also get into the wrong hands, allowing online abusers to reach children and hurt them. It may be used to hack and defraud linked accounts, et cetera. Trolling, shaming, and abuse are some of the examples that we've seen of late.
I am sure we are talking about this important topic because the answer to providing safety is not easy and straightforward. The ownership of companies and the Internet goes beyond national boundaries, and not all platforms can be brought under the same rule of law and held accountable. This makes it even more complicated. Of course, the first and foremost way to protect children online is for them to be aware of what data they provide and whether the apps that they use are putting their data in unwanted hands.
Check the company's reputation and reviews, take advice from parents and teachers, and check online if you're in doubt before using them. Maybe teachers should be trained in schools to help students understand how to keep their data safe. The other way is to ensure that best practices are enforced requiring all technology companies that offer solutions to children to take only data that is relevant. This has to be mandated, or companies should face severe consequences. This is where the IGF can play a role and convince governments to enforce these rules universally. Governments should come together and make laws that ensure that children stay safe online and their data is protected.
In summary, I would like to say that it's obvious that technology is not going away, and children are increasingly going to use the Internet and online apps for educational needs and other social requirements. We have to protect our privacy, or else this will go unchecked and spiral out of hand. The Internet is fast, and the people abusing the data are trying to go faster. Whatever their motive, money or anything else, we should work collectively to bring laws across national boundaries and encourage organizations, government agencies, and international institutions like the United Nations to mandate rules that will help protect us and our privacy online. Thank you very much.
>> JOAO COELHO: Thank you, Nidhi. It's so important to have the perspective of someone who is immersed in new technologies and thinks critically about them. Thank you again for this class; you have taught us so much.
So, unfortunately, our other panelist, Michael, from a nonprofit that provides resources and services to the educational community of Canada, had a technical problem with his Internet and is not able to be here today. The idea was to listen to him about other inspiring possibilities that could be used as alternatives to the edtech industry. I hope we will have other opportunities to hear from him; I know we will.
Now, we have half an hour, I think ‑‑ no. Less than this? Half an hour? Okay. We have some time to ‑‑ (Laughing), we have some time to ask questions and bring thoughts online and on site. Who wants to start here on site? Anyone here? Here, please, okay.
>> AUDIENCE MEMBER: Thank you. Hello. I'm Rodrigo, a researcher from Brazil, and I'm also a participant in the Brazilian Internet Steering Committee's youth program. My question is for Hye Jung Han, and I'm sorry if I mispronounce your name, but one of the points in your speech that intrigued me the most is how all this data collection serves the final purpose of directing personalized advertising at children. Could you briefly speak on how this data is used to deliver this personalized content? I would like to hear more about the ways companies use that data to reach children in a more precise way. Thank you.
>> Okay. If the panelists agree, we will collect the questions and then return to the panelists. Is that okay? Okay.
>> AUDIENCE MEMBER: Should I proceed? Okay. Thank you for the chance. Firstly, thank you for this presentation. This is a very important issue for the IGF, because children and kids are the next generation of global citizens.
My first question and suggestion is for, I think, Marina. Yes, you presented about rules and regulations, especially by governments and private companies and scholars, and about policy and regulation issues. It's important, actually.
But my doubt is that big tech at this point is very influential over global governments, so how can we work with such technology companies? They influence much of the global citizenry, global governments, and political structures, so how can we try to cooperate with them? The problem is that the algorithm itself is a problem. They know the algorithm and how it works, so the manipulation is obvious. So how can we manage this?
Actually, the customers and communities of this technology are the global citizenry, including the next generation, so how can we handle this reality?
As a psychologist, I try to observe this influence. When I see the younger generation, or adolescents, use social media, I see different communication problems: instant gratification by its very nature, being instantly gratified by such small things. There are also problems of social anxiety, from social media platforms especially, problems with conflict resolution, and even emotional problems, like problems with emotional intelligence, affected by such platforms, because the algorithm by itself is addictive, or has addictive features that drive tremendous use. So how can we govern these kinds of contradictions, if you have any idea or suggestion? And technologists should work with these emerging citizens, because it is very important to consider who their next customers will be. Thank you for having me.
>> We'll stop there and then open again for the next round.
>> My name is Tamba, from the Information Regulator in South Africa. My question is this: one of the speakers mentioned that it's important to also have privacy laws directed at children. Now, is the idea that privacy laws for children would be independent of existing privacy regulations, as standalone laws, or would they be an addition to the existing privacy laws? An example is South Africa, where our privacy regulation includes provisions for the processing of children's information. So, I would just need to understand whether this would be an addition, or whether it is sufficient to have it included in the main privacy regulations. Thank you.
>> Perfect. Thanks. Over there.
>> AUDIENCE MEMBER: Thank you very much. I would like to thank all the presenters in this session. My name is Caroline, from the Communications Authority of Kenya. As part of our child online protection and safety program, we really try to drive the availability of productive solutions for young people, so that there is a lot of content and information that young people can access and use in Kenya. However, the use of technology for education and learning has become a challenge, in that there may be a need for us to understand how to balance access and privacy issues vis‑a‑vis protection issues. Let me just give a scenario.
There are some solution providers that have approached us in the hope that we could endorse some of their solutions, but it becomes a challenge, because at what point do you say that certain content should not be accessed by a child, and what should they access? There are thresholds that have been put in place. Are there any metrics that a country could adopt, so that we could look at an edtech solution and see whether it is good for use by children within the learning environment or the home environment, which a regulator or government could then recommend? That is a question that would be of interest to me. Thank you very much.
>> Thank you so much. We will stop here and then open again. Han, would you like to start?
>> HYE JUNG HAN: Sure. Wow! Thank you all for all of these wonderful questions. I'm actually going to take them in order from most recent to least recent, because that's how my brain works. I'm going to start with both Caroline's and Tamba's questions, and I'm really excited to have you both in the room, because this research looked at 49 countries, including South Africa and Kenya, and in both cases the governments' education ministries built their own websites, which I found were violating kids' rights in some way or another. I would love to have follow‑up conversations with you on how to better protect kids in your respective countries.
Caroline, to answer your question: I think the question of appropriate content is absolutely country specific, and of course a lot of that work falls on your agency or other government agencies in Kenya to determine what is appropriate content for kids.
That being said, when it comes to evaluating whether a website or an app protects kids' privacy, there are absolutely universal technical standards and metrics. And to be honest, a lot of the standards that I recommended in my report, and also researched and talked with companies and governments about, don't require a lot of money. Actually, a lot of them don't require money at all. Sometimes it's just a matter of asking a company to change a line of code to make sure that they don't track kids' location data in a certain way, or don't share that data with third‑party advertisers. There are actually very easy and specific things that a regulator could require a company to do and could easily check. For that, I'm happy to make myself available for more detailed conversations.
Tamba, on your question about a standalone child data protection law: I think Marina and Marie and others will also have opinions on this, but my first sense is that it is absolutely up to you. There are two different ways to do it, right? You can either have a standalone child data protection law in addition to whatever existing data protection laws you have for the general public, or you can put together legal guidelines specific to kids that are enforceable by your regulatory agencies. But in any case, I think the key takeaway is that there needs to be a child‑specific law or guideline that is enforceable, because kids do require specific protections, even more so than adults. And I can talk a little bit more about that offline.
And to the first question, from Rodrigo, and I hope I remember your name correctly, about behavioral advertising: I'll give you two examples. I'll ask everyone in the room to remember this scary term, Cambridge Analytica. As you remember, in this huge scandal that Facebook, now Meta, found itself in, it was discovered that the company was creating shadow profiles of people who had never signed up for a Facebook account and sharing the data with a firm called Cambridge Analytica. The idea was that all of this information that Facebook collected about you was being used to make a shadow profile that would determine, or guess at, what kind of a person you were and, again, how you might easily be influenced, et cetera.
The reason I bring that case up from 2016 is that there was a specific type of tracking tool that Facebook had built to enable that. It's called the Facebook pixel, now renamed the Meta Pixel. When a website sends its users' data to Facebook using this tool, it allows two things. The first is that it allows the original website to target those people across the Internet, on any Facebook, Instagram, and WhatsApp accounts they have, and advertise to them there. The second is that it allows Facebook to use that data for whatever advertising purpose it chooses, including reselling it to other advertisers.
I found that a significant number of edtech products were using this tool, and I was able to document, in real time, transmissions of children's data being sent to Facebook through it. So that was a long‑winded explanation, but essentially that tool enables behavioral advertising by Facebook and by any advertiser on Facebook. That's the first thing.
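To make the mechanism concrete, here is a minimal sketch, under a simplified model, of the kind of request a tracking pixel fires: when a page embeds the pixel, the browser sends a GET request to Facebook's tracking endpoint with the pixel ID and page context in the query string. The parameter names below follow the publicly visible request format, but a real pixel sends far more fields than shown; the pixel ID and page URL are illustrative values.

```python
from urllib.parse import urlencode

def pixel_request_url(pixel_id: str, event: str, page_url: str) -> str:
    """Build the kind of GET request a tracking pixel fires.

    When a site embeds the pixel, every page view triggers a request
    like this to facebook.com, carrying the page URL (and thus what
    the user was doing) back to the advertiser's servers.
    """
    params = {
        "id": pixel_id,   # identifies which site sent the data
        "ev": event,      # event name, e.g. "PageView"
        "dl": page_url,   # the page the user was on
    }
    return "https://www.facebook.com/tr?" + urlencode(params)

# Hypothetical edtech lesson page firing a page-view event:
url = pixel_request_url("1234567890", "PageView",
                        "https://example-edtech.com/lesson/fractions")
print(url)
```

The point the sketch makes is that the page the child was viewing travels to a third party as a side effect of simply loading the page, before any consent interaction.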
The second example is Google, which is also a very well‑known giant company, as Professor Avelino mentioned in his session and talk. You know, I knew going into this piece of research that everyone uses Google; any Internet developer uses Google's tools in some way or another, even if it's for a benign purpose like using Google Analytics to measure web traffic on their website.
The problem here is twofold. The first is that 98% of the products sent data to Google, just generally. But of that number, a significant portion sent information to Google's advertising‑specific domains that are tailored for behavioral advertising. Again, as in the Facebook example, this allowed these edtech products to track their users after they left their platforms, to be able to advertise to them elsewhere on the Internet. It also allowed Google, according to the terms of service, to use those kids' data for whatever advertising purpose it would like, in perpetuity, and specifically for behavioral advertising. And just to wrap up, an example I'm thinking about is from Kazakhstan. No one had heard of this product before the pandemic, and then suddenly the pandemic hit and this tiny startup with 10 users ballooned overnight to half of the country's child population. We know this because the CEO and founder gave an interview with Forbes and talked very proudly about it.
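The kind of audit described here, documenting where a product sends data, can be sketched in a few lines: capture the requests a product's page makes (for example, from a browser's HAR export) and match each hostname against known ad‑tech domains. The domain list and the traffic log below are illustrative only, not data from the report; a real audit would use a maintained tracker list.

```python
from urllib.parse import urlparse

# Illustrative ad-tech domains, based on the hosts named in this
# discussion; a real audit would use a maintained tracker list.
AD_TECH_DOMAINS = {
    "facebook.com": "Meta Pixel",
    "doubleclick.net": "Google advertising",
    "googleadservices.com": "Google advertising",
    "google-analytics.com": "Google Analytics",
}

def classify_requests(request_urls):
    """Flag outgoing requests whose host matches a known tracker domain."""
    findings = []
    for url in request_urls:
        host = urlparse(url).hostname or ""
        for domain, label in AD_TECH_DOMAINS.items():
            if host == domain or host.endswith("." + domain):
                findings.append((url, label))
    return findings

# Hypothetical traffic log captured while loading an edtech page:
log = [
    "https://edtech.example.com/api/lesson/42",
    "https://www.facebook.com/tr?id=123&ev=PageView",
    "https://stats.g.doubleclick.net/j/collect?v=1",
]
for url, label in classify_requests(log):
    print(label, "->", url)
```

Anyone with browser developer tools can run this kind of check on a product their school uses; the first-party request passes cleanly while the two tracker requests are flagged.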
Interestingly, with this newfound success, this startup founder said, hmm, I suddenly have a captive audience of half of our country's children, so what can I do with this? And I noticed and documented that on the website where kids log in, he put up a price list for advertisers. The product had a website and an app version, and the list said: advertisers, if you would like to advertise to kids for one day, it will cost this amount; if you would like us to forcefully send push notifications through our app, so that the ad pops up on the kid's phone before they have a chance to decline or deny consent, it will cost you this much more over this many days. It was extremely blatant, and to top it all off, the CEO included a client list of previous advertisers that had purchased these services to advertise on his online learning platform, mostly companies and conglomerates like Nestle and McDonald's, et cetera. So, just to give you a sense, and to tie back to the earlier point: the decisions being made at this level are not in the child's best interest at all. With that, I'll stop and pass it over to other folks.
>> JOAO COELHO: Wow. Thank you, Han. I've just remembered that this important, scary, and challenging report Han presented to us is available in several languages on the Human Rights Watch site, right, Han? Everyone is invited to get to know it better.
We'll pass to Marina for your considerations.
>> MARINA MEIRA: Thank you. I'll try to address some of the questions, including the million‑dollar question, which is how we fight the huge international big tech companies. Well, I wish I knew the answer. I don't think it's simple, but this is a great place for us to strategize and think together about how to approach it.
I mean, as people working at Civil Society organizations, sometimes we feel small, because it's just some of us and they've got tons of money and the best lawyers in the world, so it is definitely a big challenge. Some of my insights: the big tech business model is completely problematic in general, but it is especially problematic when it comes to children.
So, I think that using childhood and the concept of children's best interests could be a starting point for advocacy against this business model as a whole, because it can be easier to convince regulators and the justice systems, international and local, in the first place that it is problematic for children, and once we do that, I think we can keep going from there. I think that's the first thing.
My second point is that I think we should do more international‑level advocacy. There is a document which is really important for our discussion: the UN Committee on the Rights of the Child's General Comment 25, the general comment on children's rights in relation to the digital environment. It mentions education and privacy and says that every time children are in the digital environment, their best interests should be fulfilled. So that's a first step, but it's still not a very concrete document. I think we can perhaps work on international cases built on top of it, taking concrete cases to an international scale.
I think what would also be really strategic is to choose some key countries and places where we can litigate, for example, so that we get rulings that say that this commercial use of children's data is problematic; and perhaps edtech apps are also a good strategic gateway for starting the discussion of why it's problematic. I think once we get one good ruling, we can replicate it in other countries, not only because of that ruling itself, but also because we can then evoke the principle of nondiscrimination, which is really big in the UN's children's rights instruments and in most countries' legislation. That has happened before; for example, in the U.S. there was a case a few years ago, New Mexico versus Google, where the state tried to sue Google, saying it was making commercial use of children's data. It did not work, but only because the case was dismissed on formal grounds. So, it is a problematic area, and I recommend everyone study that case; perhaps we can also learn from it.
Last but not least, on the question that was asked about children‑specific regulation, like children's data protection: I think, and that's my personal opinion, that all general data protection regulations have to address children specifically, because children are beings going through a developmental phase. But I think those general regulations will never be complete enough to actually protect children.
So, personally, I think that besides having a special section in general data protection regulations, DPAs especially should look into children's data protection in a specific way. We have been seeing that, mostly in the Global North up until now, and hopefully it will spread: DPAs, for example from the UK, from France, and from the Netherlands, are starting to issue their own regulations on children's data protection. I think that's interesting because they cover it in a really broad way, so you can issue specific design measures, measures that are not only related to rights and obligations, and also other measures, such as the obligation to conduct data protection impact assessments specifically concerning children. I also think that if that regulation is issued within the DPA, it can be a bit less subject to political interests in general; that varies by country, of course. So, I think it can be led, perhaps, with a broader view and with the participation of specialists from several fields, not only legal specialists and regulators, but also people from the educational system, from pediatrics, and from all sorts of areas of knowledge that involve children, and also children themselves. We have Nidhi here, and perhaps she can even say more about that. We have seen some DPAs involving children in the discussion to create those regulations, and I think that's very valid as well. I hope I was able to address the questions.
>> JOAO COELHO: Surely. Don't you think? I'm afraid we have to finish now, but even if we don't answer those million‑dollar questions here, I think we have more work ahead of us to ensure privacy and safety regarding educational technologies. So, I want to reinforce that we must treat children's digital rights as a priority, especially in the Global South. I would also like to finish by paying tribute to someone who has taught us in Brazil a lot about the issues we discussed here today, Professor Danilo Doneda, who is in a very difficult health situation. We would like him to hear once more how grateful we are to have him with us, building ways for children to be treated as an absolute priority on the Internet as well. I'm so thankful for this session, and I hope you have enjoyed it just as much as we did. Thank you to everyone who joined us. Keep in touch.