The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> JONATHAN ANDREW: Welcome, everyone. My name is Jonathan Andrew. We've still got a few minutes and we're waiting for one panel member to join us from Uganda. Hello. I hope you can hear me a little better. Great. We're waiting for one of our panel members to join us via Zoom. I hope you're all connected. I'm not sure if this is your first session or not, but at the IGF, hopefully everyone is able to connect via their login through the Zoom connection so that you can field questions to the panel. So, if you would like to log in and join the session as well, even though you're here in person, which is great, that will also help those joining us through the hybrid mode from outside of Ethiopia to have a better idea of who is here and who is asking questions, and it will be a little easier for them to follow through the hybrid format.
So, we'll try to get started in just a couple of minutes on time.
>> Recording in progress.
>> JONATHAN ANDREW: We'll be starting in about one minute. Hopefully you don't get the echo now and that's better.
>> JONATHAN ANDREW: Okay. Good afternoon. Okay. Good afternoon, everyone. Welcome to the session, to this workshop on Access to Remedies in Safeguarding Rights to Privacy and Data Protection. Welcome to everyone who has been able to join us here in Addis; and we are also very glad to have those joining us online through the hybrid mode via Zoom.
My name is Jonathan Andrew. I'm working with the Danish Institute for Human Rights, and I'm joined by my colleagues from the Danish Institute, Cathrine Bloch Veiberg and Jenna, who works in the international team. Together they'll be assisting with the online format, helping with the moderation and the fielding of questions both from those of you here with us in the room, the Press Briefing Room, and those of you with us online.
To run through the format, it's 60 minutes. I will be keeping my introductions as brief as possible because the main point is to hear from the panelists and to gain from their expertise. Each of the four panel members will have 10 minutes to give their presentation and insights on this issue.
We'll have 15 minutes or so at the end for questions and answers, and we'll try to stick as best as possible to that format. As I say, you can post your questions through Zoom to our moderators, who will then bring them to the panel.
So just by way of an introduction, the Danish Institute for Human Rights is a national human rights institution, and we work closely with other national human rights institutions, a number of whom have joined us here for the IGF. We are working on a number of issues, one of which is access to remedies. This is part of a broader project and initiative, the Action Coalition on Responsible Technology, an action coalition funded in part by the Danish Foreign Ministry that brings together different stakeholders from Civil Society, non‑governmental organizations, businesses, and other interested parties participating in a year‑long program of events to strengthen the responsible use of technologies on a global level. As part of that, we have a workstream on policy coherence, and this session forms part of that workstream, looking at how regulations and different legislative initiatives are aligning or not aligning in terms of coherence, and how policy is developing on a global front.
So, without more ado, I shall start the presentations by introducing our four panel members. I'll give a brief introduction now, and then before each presenter speaks, I'll say a bit more about their background and biography. We have joining us Cynthia Chepkemoi, who is on the right, from Kenya. We have Stella Alibateese joining from Uganda, Mosa Thekiso, and Maureen Mwadigme. Cynthia Chepkemoi is an advocate practicing in Kenya and working as data protection counsel, and her specialty is advising and guiding organizations in setting up data protection frameworks in compliance with data protection laws. Cynthia has extensive experience in digital rights, data protection and privacy compliance, and cybersecurity audits. She has partnered and worked with Kick, GIZ, and Unwanted Witness in Uganda as a consultant and trainer of trainers. Additionally, she is one of the founding members of the Association of Privacy Lawyers in Africa. Welcome, the floor is yours. Thank you.
>> CYNTHIA CHEPKEMOI: Thank you for the introduction. I'll go straight to the business of the day in terms of safeguarding the right to privacy and data, which is access to remedies. I'm going to make a presentation on how we approach violations of privacy rights in Kenya and give a brief legislative background on how we go about it.
So, when we look at the legal framework in Kenya, we have the Data Protection Act of 2019, and we also have the Computer Misuse and Cybercrimes Act of 2018. With the Data Protection Act, we have the regulations that give us the procedural rules on how we should conduct registration of data controllers and processors. It also provides the complaint handling procedures on how you should file a complaint with the Office of the Data Protection Commissioner. But today we're going to look at mechanisms to seek redress where there is a violation of privacy. It has been a painstaking process to bring to book some of the violators of privacy rights, and I'm sorry to say that some of these companies are big tech companies. Sometimes we find that citizens are not aware of the procedures they need to follow, and that means we need to do a lot of sensitization and capacity building for citizens on how to follow the legal procedures to seek redress.
So, I'm going to make a brief presentation on the mechanisms to seek redress, and I'm going to talk about the complaint handling mechanisms and enforcement. From time to time, we realize the violations are rampant, and at some point an individual feels like they don't really have the authority or power to bring the perpetrators of privacy violations to book. How do we go about it in such a circumstance? What I would like to say is that the Act governs the processing and storage of personal data by government and the private sector, and that means the authority established by the Act, which is the Office of the Data Protection Commissioner, is the proper authority to handle such complaints. Previously, such complaints had to be filed at the High Court, because back then we didn't have a legal basis for bringing up complaints in terms of privacy.
The Act also establishes an intricate system of rights and obligations that operationalize the right to privacy. Data protection authorities have a duty to receive and act on all complaints by individuals, and sometimes the authority can, on its own motion, investigate issues it has identified. I can give an example of a recent enforcement notice released by the Office of the Data Protection Commissioner requiring 40 digital lending companies to comply with the law. The very first stage of getting them to comply is the office conducting privacy audits to look at their compliance level in terms of data governance, including whether they have actually registered as data controllers or processors. Where they have not registered, it means they are not yet compliant at that point.
And even where data controllers and processors have registered, the question is what policies they have in place to ensure the protection of the personal information, the personal data they process belonging to the data subjects. So, the first thing that the ODPC was doing with that enforcement notice was notifying the processors that there is a new law, that you need to register, and that you need to provide the data subjects with a policy stipulating how you intend to safeguard their personal data.
>> CATHRINE BLOCH VEIBERG: I'm sorry to interrupt, but I think you are not showing your slides.
>> CYNTHIA CHEPKEMOI: I'm sorry if you are not able to view my slides. I wish there was a way we could project on the screen. Let me share the screen. I'm sorry. Apologies for that.
Thank you for bringing that to my attention.
Then the second port of call where we have a violation of privacy rights is the Office of the Data Protection Commissioner. This is the authority in Kenya tasked to set the rules and regulations on how personal data is handled, processed, and stored, and this is where all data controllers and processors are required to report any issue of data breach or data loss.
So, a person who suffers damage by reason of a contravention of this Act is entitled to compensation for that damage from the data controller or the data processor. But how do we get to the point where you are entitled to damages? It means you have to file a complaint with the Office of the Data Protection Commissioner, and how do we do this? The process is set out in the regulations on the filing of complaints. To take you through a brief walkthrough of what happens: first, you go to the ODPC portal, which is a website that's available online and can be accessed by anyone at any time. You will fill out a form that asks you to state what rights were violated, that is, your rights as a data subject, your privacy rights in terms of data. How were they violated? Who violated them? That means you have to provide the details of the data controller or processor who violated these rights, and you also need to attach evidence to show that your rights were actually violated. It could be a report or a screenshot of how the violation occurred. It could be that the data controller or processor used your personal data for marketing, which is an offense under the Data Protection Act, because where you use personal data for commercial, that is, profit‑making purposes, it would be a violation of the rights of the data subject. And to talk about consensual sharing of the data subject's information with third parties: it may be that, in the process of collecting data, the data subject knew the information was going to be used by that company but did not consent to it being used by or shared with a third party. So where that information is shared with a third party without your consent, that would amount to a violation of your rights as a data subject. This is personal data, and not any other type of data.
So, in that regard, once we've filed the complaint with the office, the office takes around 14 days to respond to the complaint, and the parties will be called in to give their evidence. We understand the issues of fair administrative action, and each party must be given an opportunity to defend their side. At this point, the data controller or processor may have had policies, or what we call agreements, where the data subject consented to the processing. Many times we consent to our data being shared with other third parties, but do we really sit down to read the terms and conditions of the agreements that we normally sign? Most of the time it's a case of "too long; didn't read." Then when your privacy rights are violated, you cry foul, but mostly it's the fault of the data subject that they failed to read the terms and conditions of the contracts they entered into.
To curb this, when advising clients, let's say a hospital or a school processing children's personal data, or maybe handling a patient who is supposed to be transferred elsewhere, it's important that you have data‑sharing agreements in place. This will protect the organization from liability, from court proceedings, and of course from complaints being filed at the Office of the Data Protection Commissioner.
Thirdly, we have the Court, which is the last resort. Before the Office of the Data Protection Commissioner was established, the Court was the only channel where we could file complaints in terms of privacy, and the basis was founded on Article 31 of the Constitution, on the right to privacy.
So, when we look at the role of regulatory and oversight bodies in terms of enforcement, the first thing that comes to mind for a regulatory body, let's say an authority like the Office of the Data Protection Commissioner, is to ensure that registration and issuance of certificates has been done and that the certificates are provided to the data controllers and processors. This is the first stage of compliance, ensuring that data controllers and processors are processing personal data legally, because at the end of the day you need to have a legal basis for processing personal data.
Secondly, there is the data subject access request. As a data controller or processor, many times you find that you process and store a lot of information, and how do you ensure that the moment a data subject walks in, they're able to access their information? Do you have a system or a procedure through which they can access this information? Remember, this information should be accessible without them paying a fee, but from time to time, and it happens in this space, you go to an institution and request your information, only to be told that to access it you need to pay a certain amount. But is it right that you need to pay a fee to access your personal data? That should not be the case.
Another important tool is the Data Protection Impact Assessment, which enables data controllers and processors, at the inception of every project, to recognize that they are processing sensitive data that might result in high risk to the rights of the data subjects in case of a data breach. The first port of call then is to conduct a Data Protection Impact Assessment, which gives you an overall view of how to secure the rights of the data subjects in conducting that project or research.
The transfer impact assessment is another tool used by the oversight bodies to ensure that the rights of the data subjects are protected in the course of any processing, for instance where there is a transfer from one country to another, or a transfer between business partners; it could be a company supplying to a hospital where you need to share some information that could be sensitive.
So, I'll stop from there and then Stella will proceed where I have left. Yeah.
>> JONATHAN ANDREW: Thank you very much, Cynthia. I'm just going to pass on to Stella Alibateese, who is joining us. Thank you, Cynthia; a great amount of progress is being made in Kenya, and it shows engagement with the citizens, informing them of their rights. Just as importantly, helping people understand what channels and avenues are available to them needs to be a main point of work for those working in the advocacy area.
So, I'm going to ask Stella Alibateese to join through Zoom. Stella is an advocate and works as Director of the Personal Data Protection Office of Uganda, an independent office set up under the National Information Technology Authority of Uganda and responsible for personal data protection in the country. Prior to this appointment, Stella worked as Director for Regulation and Legal Services at the National Information Technology Authority and has held other positions in the private and public sector. She's responsible for the management and operations of the Personal Data Protection Office, which is the national focal point for monitoring and assurance of matters relating to the implementation of Uganda's Data Protection and Privacy Act. She's a practicing advocate with 26 years of experience on policy and regulatory matters in the public sector. I'm hoping, yes, I can see Stella there with us. I'll let you take the floor, Stella. Welcome and thank you for joining us.
>> STELLA ALIBATEESE: Thank you very much.
>> JONATHAN ANDREW: Have you unmuted yourself?
>> STELLA ALIBATEESE: Okay. I hope everyone can hear me now. Jonathan, can you hear me?
>> JONATHAN ANDREW: We can't hear you yet. Just want to check you've unmuted yourself. Yes, we can hear you now. Great.
>> STELLA ALIBATEESE: Okay. Good. Good afternoon, everyone. Thank you, Jonathan for the invitation to be part of this panel, and I'm honored to be on this panel with the other great speakers.
The request was that I talk about access to remedies in safeguarding the rights to privacy and data protection with regard to the Uganda situation. First of all, the right to privacy starts all the way from our Constitution; it is enshrined in Article 27 of the Constitution of Uganda. The Data Protection and Privacy Act was a comprehensive law that was set up to further the protection of personal data. In Uganda, we had other laws that provided for privacy. For instance, the Registration of Persons Act, under which we do our national IDs and which established the organization that registers people in Uganda, had already made provision for privacy.
The other law that specifically provided for privacy was the Regulation of Interception of Communications Act, which provided that a telecom company, or even government, cannot intercept your communication unless they have a court warrant or meet certain circumstances they would have to comply with.
So, subsequently, in 2019, government saw it fit to enact a comprehensive law, which here we call the Data Protection and Privacy Act. That Act especially introduced respect for digital rights in Uganda. For instance, the Act has an entire chapter on data subject rights. I heard Cynthia talk about some of those rights: the right to access your personal information, the right to erase your personal information, the right to make corrections, the right to object to automated decision‑making, and many others provided for.
The law also provides for the Personal Data Protection Office, which I head. We know from the Malabo Convention and other regional frameworks that there is a requirement that regulators for data protection and privacy be independent bodies, so government set up this office. It is within an existing office, but the law provides for its independence; I think at an opportune time I will explain how we have that setup and why it works in Uganda. Now, this office is given the mandate to implement the Data Protection and Privacy Act. Part of our mandate includes resolving complaints from data subjects, so if you find that your rights have been infringed upon by a data controller or data processor, the law gives you a right to make a complaint to the Data Protection Office.
We also provide guidance, especially to data controllers, in regard to the interpretation of the law and issues related to compliance. The law also gives us powers to issue directions to inform some of the actions taken by data controllers. It gives us power to investigate, and we can also prosecute where we find there has been noncompliance.
Under the same law, we are required to register all data controllers and data processors, and currently that system is online, so the entire process, up to when you get your certificate and how you pay for it, is entirely online.
Currently, under that same system, we have automated the receiving of complaints. We activated that system around May 2022, and we currently have over 2,000 complaints that have been raised against various data controllers.
Now, how do data subjects access their rights under the Act? The Act is very specific: while it provides for their rights, the regulations provide for the mechanisms by which data subjects raise their complaints. Beyond that, within the regulations, there are specific provisions that require data controllers to respond to those complaints within certain timelines. The timelines range from 7 days to 14 days.
Under the guidance notes that we issued for data subjects to raise complaints, we require that data subjects first deal with the data controller or the data processor before they come to the office. That is also enabled through our system. So, if you find that you have a complaint to raise, you can use our system to generate for you the letter that you submit to the data controller; it is automatically generated from the system. This was to ease the raising of complaints, because we know that many people may have challenges sitting down to write letters, and many may think they need to go to a lawyer to be able to raise a complaint. So that process was enabled digitally through our system.
Now, once the system generates that complaint letter for you, you're required to download it, sign it off, and then deliver it to the data controller or the data processor, because when you make a complaint, we are gathering evidence against the data controller and the complaint must be in writing. But we have aided and assisted the data subjects in how to do that.
Now, the law also requires data controllers and processors to have in‑house complaint resolution mechanisms. When we train data protection officers, who are the focal points of contact in these organizations, we normally train them on how to deal with various complaints. I'll be brief, as Jonathan says I have four minutes. Once a complaint comes to us, we have the mandate under the law to investigate it. We have a mandate to invite witnesses whom we can interview, and then to make a determination. Our determination must be in writing, and we are required to communicate our decision.
If a party is unhappy with any decision that we've made, the law provides for right of appeal to the Minister of ICT and National Guidance, and then of course if they are unhappy with the decision that has been made by the Minister, then they also have a recourse to go to court.
Despite the fact that this law has been in place for a limited time, we already have cases in our courts where plaintiffs have exercised their right to privacy and succeeded, although most of those decisions were based on Article 27 of our Constitution, which guarantees the right to privacy.
In terms of enforcement, given that the regulations were only passed in 2021, we do not yet have any prosecutions in that regard; however, we have a number of investigations that we're currently undertaking. Thank you very much, Jonathan.
>> JONATHAN ANDREW: Thank you so much. You've had key legislation in place since 2021, and it's very impressive just how much you've achieved. I think one of the key points is clearly that you've done a lot of work to provide information and mechanisms that are accessible to citizens so that they can use them to bring up grievances, which is obviously incredibly important. I've been looking at your website and the information that you're providing, so a key takeaway is ensuring that information is available and accessible. And clearly, the number of complaints you mentioned, the 2,000 complaints, points to some concern that data protection provisions may be being breached, but it also shows that you have a system that's allowing a large number of citizens to raise their grievances and be heard. So, thank you very much.
Moving on, we shall now have the presentation from Mosa Thekiso from Johannesburg, who works with Vodacom on digital intelligence and platforms, and with Vodacom's international big business and data teams on regulatory and policy matters relating to data‑driven products and services across Vodacom Group markets. Mosa has a legal professional services background, having worked as an attorney for international law firms, and gained experience with the GDPR in a previous role as a data protection specialist at an advisory firm in Germany. Prior to joining Vodacom, Mosa held a senior management role at a national telecommunications company specializing in digital services and mobile financial services. Thank you once again, Mosa, for joining us online. I'll let you now take the floor. Thank you.
>> MOSA THEKISO: Thank you, Jonathan. Good afternoon. I hope you can all hear me okay.
>> JONATHAN ANDREW: Yes, we can.
>> MOSA THEKISO: Yeah. So, Jonathan, I'm just going to give the audience a different perspective, a private sector business perspective, and how we deal with these issues on a day‑to‑day basis. I think Cynthia and Stella have taken us through the legislation and the intricacies of the rights and remedies that are available to our consumers.
So, you know, coming from an entity like Vodacom Group, where we deal with a number of privacy‑related issues across the continent and across various countries, the issues that we face are quite interesting, in the sense that we want to take Africa fully into digital inclusion and financial inclusion; those are the main topics that are top of mind for us. We don't want to get to a point where African consumers are getting left behind from a digital economy perspective.
So, bearing that in mind, what's important for us is balancing being a provider of those data‑driven services, which depends on being able to roll out the relevant technology. A lot of emerging and innovative technology requires a lot of data and data processing, so these are data‑rich technologies. How do we balance actually using those technologies against looking after the rights of our consumers?
We actually undertook a study where we looked at how we achieve that in the current regulatory environments we have across Africa. Like I said, Stella has spoken to a lot of those remedies, and what should be pointed out is that these remedies differ from jurisdiction to jurisdiction, which poses a lot of challenges for us as a business. We're a big business, robust, with the relevant measures in place, and you can only imagine how difficult it is for a small entity trying to maneuver through Africa from a business perspective to grapple with those different laws as they change from country to country. The main barriers we've identified with regard to rolling out these data‑rich technologies are data localization laws. What we're seeing, for example when we're dealing with big data or AI technology and we leverage technology provided by Cloud service providers, is that they tend to take a regional approach. So, for us to use those technologies, we have to think about where we are going to centralize our technology: are we going to use the Amazon Cloud in Cape Town, or perhaps another Cloud in Kenya? However, because we want to move our businesses in those jurisdictions forward at the same time, we tend to have to use one hub, and that means data is always moving across borders.
So, the first issue is data localization. The second is that many countries across Africa have data protection laws, but others don't have data protection laws in place yet. How do we deal with that? In some instances there is a constitutional right to privacy, so we obviously take that into account. And in other instances we have strict adequacy requirements; I think the best example of that is the GDPR, which sets out a robust list of countries that are considered to have adequate laws.
And the other issue, which ties back to what the ladies have also touched on, is that you may have laws in Kenya and Uganda, and we have now seen a new bill in Tanzania and an act in South Africa, but there is inconsistent use of terminology in the relevant laws. In South Africa, for example, you'll find that companies, that is, juristic entities, fall under the data protection act, and you don't really see that in any of the other countries we come across.
So, when conducting our own study, we looked at how to develop best practices around these factors, taking into account the rights that we have to protect from a constitutional perspective and also from data‑protection‑specific laws. We looked at a number of best practices contained in policies and in digital agreements, for example, and we looked at the EU and its whole open data philosophy. We looked at the Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, or Convention 108+. We also looked at the AU digital transformation strategy and the data policy framework. From a standards perspective, which was kind of our focus, we looked at Mauritius; a good example, or shining light I would say, is that Mauritius has a robust data protection act which takes care of the rights of data subjects, and Mauritius is also a signatory to Convention 108+. From a bilateral perspective, we looked at preferential trade agreements, and we looked at Singapore, which is also a good example and has robust bilateral agreements with Australia and with New Zealand. We also looked at the African Continental Free Trade Area. Those are the policies and examples we looked at in developing this best practice. And the recommendations are that we understand that we need to protect the rights of data subjects; however, taking into account the technologies, where the technology is going, and how the Cloud service providers who are driving the technology are viewing Africa, we basically looked at taking a regional approach. So do we take policies and look at them from a regional perspective? Another option is regional cooperation through trade agreements where you make provision for the rights, and also regulatory reform.
As I mentioned, there are countries where there are no laws at all, or where the laws are too strict, going in the opposite direction, basically, and that means we can't operate, we can't roll out those new technologies, for example because of the data localization laws.
Essentially, our view is that we take a regional approach; where there aren't any data protection laws, laws should be put in place. We encourage the ratification of international conventions such as Convention 108+. I think at the bedrock of all of this is the right to privacy, which exists in most constitutions, and we do have all of those specific measures in place. Privacy by design is top of mind for us as a business whenever we're dealing with any data technology, which is basically all the time these days. We have our privacy impact assessments, which we already do internally even for jurisdictions that don't have laws in place; for us that is internal best practice. So, whenever we're dealing with any kind of data processing, we start with our privacy impact assessment and that guides us.
And like I've already said, we have to adapt to each jurisdiction accordingly and take that into account. Also, when we are given the opportunity to comment on various policies or laws that are still in draft, we do that, and we've just done that with a new bill in Tanzania. So, we take a very robust approach; we try to be very balanced. Obviously, what's very key for us is protecting the rights of consumers from a privacy perspective, and, without going too much into the AI side, it is a little broader than that from an AI perspective, because then you're coming to other constitutional rights, the freedom of expression, equality and nondiscrimination, and you deal with biases. So we're always thinking about a collection of those rights, while at the same time really trying to cater for digital inclusion and financial inclusion.
So, a bit of work for us, but definitely trying to balance those two sides as it were. I'll pause there. Thank you, Jonathan. Happy to take questions after.
>> JONATHAN ANDREW: Great. Thank you very much, Mosa, fascinating insights. Clearly there are some challenges for a company working across the region with very different data protection and privacy laws and different institutional protections. You mentioned that you're working with emerging digital technologies, including artificial intelligence, which also presents challenges, but it was great to hear that you're very proactive in ensuring that internal best practices are shared across your different divisions, even in countries without data protection laws. I think there are really some great insights here for other businesses operating in this area, to see the preemptive, proactive approach that you've been adopting. Thank you very much.
Moving on, just to keep to the schedule, our next speaker will be Maureen Mwadigme from the Kenya National Commission on Human Rights. Her practice focuses on human rights, digital rights, and economic, social and cultural rights. She's engaged in constitutional litigation, legislation review, institutional reforms, community engagement, and partnership building at the national commission in Kenya. In November 2022, so this month, Maureen was recognized as the Public Sector Lawyer of the Year; congratulations, Maureen, on that award.
And prior to joining the Commission, Maureen worked in private law firms on commercial litigation and transactions, as well as at Kenya's Refugee Affairs Secretariat. Thank you, Maureen; please take the floor. Thank you.
>> MAUREEN MWADIGME: Thank you so much, Dr. Andrew, and the Danish Institute for Human Rights in general; thank you for this session. I'm honored, indeed, to speak before this brilliant delegation, both online and here in the room. The three presenters have touched on the policy and legislative landscape, so I'll mainly focus on the role of NHRIs and access to remedies with a specific focus on digital rights; so basically, national human rights institutions and access to remedies with reference to digital rights.
In this particular case, I will restrict myself to the Kenya National Commission on Human Rights, which is my employer, and I know there are several human rights institutions here, so I will not pretend to speak for them.
So, very briefly, about the Kenya National Commission on Human Rights: we are an A-status national human rights institution as per the Paris Principles, established under Article 59 of our Constitution and operationalized by an Act of Parliament. The Commission's broad mandate cuts across promoting, protecting, and upholding human rights in Kenya.
The Commission's functions are laid out in its constitutive Act, the KNCHR Act, and these functions include research and monitoring, basically monitoring compliance with human rights standards in public and private institutions; conducting human rights education and training; carrying out campaigns, advocacy, and collaboration with stakeholders in order to safeguard human rights; as well as investigating and securing appropriate redress for victims of human rights violations. So we can see that we clearly can speak on matters of digital rights in as far as our mandate is concerned.
With reference to our discussion topic today, I think by now it has become very clear that even seemingly neutral technologies can actually replicate pre‑existing inequalities and marginalization. Technology impacts human rights positively and, at the same time, negatively, and this is where the role of oversight institutions comes in.
I have heard my colleague from the Office of the Data Protection Commission, I think from Uganda, speak, and in my case as an oversight institution, I will give the point of view of an NHRI, a national human rights institution.
So, as a national human rights institution, KNCHR is very keen on overseeing online spaces to ensure that the milestones achieved in the physical world are not lost in digital spaces. There are so many issues and human rights concerns that have been happening in online spaces, and unfortunately most of us do not tag them as human rights issues. For one, as a case study from Kenya, we have had a lot of complaints on matters of freedom of expression where activists get arrested and charged, especially with offenses under the Computer Misuse and Cybercrimes Act, and this was mainly during the COVID‑19 period, when human rights defenders really took to expressing themselves online as opposed to going on the streets, due to the limitations that we are all familiar with.
The other one is censoring and blocking. You have a public institution with a Twitter handle or a Facebook page, and unfortunately, when they zero in on you and say that, well, this person is making negative comments because of A, B, C, D, and they do not like your comments, then what do they do? They block you from receiving any messages or interacting further on that particular platform.
The other is surveillance: surveillance capitalism and government surveillance. One targets consumer decisions, like the FinTechs; this is a huge problem in Kenya, and I'm very grateful to the Central Bank of Kenya, which is really trying to regulate this sector to ensure that there is some sort of sanity with the FinTechs. The other is government surveillance targeting civil and political rights, including voting rights.
We've also had quite a number of massive data protection breaches. Actually, prior to the elections this year, Kenya had its elections in August this year, a big number of Kenyans found themselves registered as members of political parties with the Office of the Registrar of Political Parties, and they hadn't done that. So, it was very interesting to find that political parties would go the extra mile to get very, very specific information on individuals in order to meet the threshold that was required by the Office of the Registrar of Political Parties to register as a political party.
Then the other thing is that we are seeing quite a lot of movement in terms of compliance in the private space. In fact, recently, the Office of the Data Protection Commissioner set out requirements on the regulations and compliance procedures for private companies. However, the government on the other end, that's another story. So, basically, I think we need to move from that space where we think that whatever the government does is sanitized and therefore okay. To be very sincere, government is really the largest data controller; it holds the largest amount of data. So, while we are talking about the companies getting regulated, we also need governments to ensure that they comply with the laws that we put in place, particularly the regulations, compliance requirements, and specific parameters for compliance that have been put in place by the Office of the Data Protection Commissioner. So, basically, this misguided belief that the public sector cannot infringe on data rights needs to end, because that is where, actually, most of the data is drawn from.
Looking back, we have had so many instances where private companies have tried to get information from Kenyans. I remember there was a case way back, before we got our Data Protection Act, where one of the telecommunications companies tried to put surveillance on SIM cards. Unfortunately, or fortunately, the Commission as well as several human rights defenders moved to court very swiftly, and the Court declared those acts unconstitutional.
So, what am I saying? Basically, state departments, agencies, and the government in general should lead by example and implement data privacy programs within their own organizations. And lastly, on the issue of access to remedy itself, it is my humble opinion that national human rights institutions are very independent and trusted entities, so we definitely get quite a lot of complaints and a lot of feedback from the communities, or the users of these particular technologies. So, providing legal advice and holding public awareness forums is something we definitely do to ensure that citizens are actually helped to understand their rights, especially with regard to digital rights.
However, all stakeholders in the digital sector need to work very closely together, because we all know and appreciate that human rights are interdependent, and the different roles that we play here in our respective capacities all complement each other. So, working in silos won't work. It is about time that we as stakeholders, in our different capacities and as different actors in the sector, come together to be able to have a positive impact and ensure protection of the users of the technology that we develop. Thank you. Over to you, Dr. Andrew.
>> JONATHAN ANDREW: Thank you very much, Maureen. I think the key point, as you say, is one of cooperation and collaboration: that together we can work on some of these issues and prevent breaches of the right to privacy and the right to data protection. We can do work in terms of prevention, but also a lot of work collectively to facilitate access to remedies through different mechanisms for citizens and residents in the respective countries, by sharing and pooling our resources.
We have a few more minutes left, so I would just like to see if we have any questions from the audience for our panel? Yes, can I take that question in the back? We have a gentleman here. We have just a few minutes, so if you could keep your question brief. Thank you.
>> AUDIENCE MEMBER: Sure, I'll make it quick. My name is Shonia from the Mozilla Foundation, and thank you to the panel. My question to the panelists, and anyone can pick it up: how do you navigate digital accessibility issues in order to safeguard the digital rights that you're working towards?
>> JONATHAN ANDREW: Perhaps I can put that to Maureen first.
>> MAUREEN MWADIGME: Sure. Interesting that Amazon ‑‑ it's Amazon?
>> AUDIENCE MEMBER: Mozilla.
>> MAUREEN MWADIGME: Okay, interesting. Interesting, coming from a private actor, because we had a conversation with Ericsson yesterday, and what we were trying to ask ourselves is how best do we work to ensure that vulnerable and marginalized groups are not impacted by our actions when it comes to access and whatnot. Unfortunately, what is happening right now is that technology is coming to marginalize the vulnerable even further, so we were trying to think: is it a possibility for us as an NHRI to work with, for instance, the networking companies, to be able to understand the needs of specific areas that we have mapped out? And secondly, to ensure that if networking is done, then it is equitably distributed. Again, there is an angle, and it's a long conversation: there is a business angle to that, on whether they'll be able to recoup their costs when they go into, for instance, remote areas. How best can we have government incentives to ensure that such companies are able to come in and still reach these offline areas, and at the same time mitigate their costs? So, it's an issue that requires a multisectoral approach; it is not one that can be dealt with by one sector. It requires the mapping aspect, the monitoring aspect, and the reporting aspect, so that vulnerable and marginalized groups actually benefit from the networking. Thank you.
>> JONATHAN ANDREW: Thank you. While we still have a bit of time, I would also like to put that to Stella at the Uganda Data Protection Office. Stella, can you take the floor. Thank you.
>> STELLA ALIBATEESE: Thank you. Thank you for that question. Indeed, digital literacy ‑‑
>> JONATHAN ANDREW: Mute ‑‑ can you hear us, Stella?
>> STELLA ALIBATEESE: Can I proceed?
>> JONATHAN ANDREW: I think you're on mute. I'll see if I can unmute you.
>> STELLA ALIBATEESE: Hello? Can you hear me? I'm unmuted on my side.
>> JONATHAN ANDREW: Yes, we can hear you now.
>> STELLA ALIBATEESE: Okay. That is a valid concern. On our side, we're trying to address that by creating awareness in our local languages. For those who know Uganda, we have over 50 tribes speaking different dialects. Within the office, I can say we speak only about 3 or 4 of those dialects, which means those are the only languages we can communicate in, and that is a big challenge, because when you create these literacy programs, it's important that you communicate them in the language that most of our people understand. So that is a challenge that we have, and we are trying to work out ways, first of all, of translating our laws, and then of developing materials that can create that awareness of those laws.
Secondly, in terms of access, whatever technology we develop, we make sure that we provide for communication through both smartphones and feature phones. For instance, for the complaints system, because that is the one that really interfaces with the population, we have also enabled SMS and other technologies that enable an individual, as long as they have some technology, to reach us. That is how we are addressing some of those issues. Obviously, it is a journey, and I think for government, what we need to do is to continue with those efforts until we bridge those gaps. Thank you.
>> JONATHAN ANDREW: Thank you very much, Stella. I'll just pass the floor to Cynthia; would you like to make a comment based on your experience? I know you've done a lot of work with people in different communities around public services, for example hospitals, education, et cetera.
>> CYNTHIA CHEIKEMOI: So, to respond to that question, it's a problem that cuts across: digital literacy, and how do we go about it. Having worked with different institutions in creating awareness and improving digital literacy among marginalized communities, and more especially women and children, the best approach is to work through associations; that's where you can reach many people, and institutions can, from time to time, train children on digital literacy and on cybersecurity and the skills they need to stay safe online. So it goes to identifying specific groups that are actually more marginalized in the digital space. At times, one of the major challenges is the infrastructure itself: as much as we're trying to roll out services to marginalized communities, you realize they lack the infrastructure, so it becomes even more difficult to enhance digital literacy. But then, through working with associations and Civil Society organizations, and, as Maureen said earlier, through a multistakeholder, collaborative approach, you can actually attain the digital literacy levels that we need to see among our people. Yeah.
>> JONATHAN ANDREW: Thank you very much, Cynthia. We're running out of time; unfortunately, I don't think we have time to take further questions, as we've already run over a little. I would just like to put that question to Mosa, joining from Johannesburg from Vodacom: perhaps you can share how you've been reaching out to different communities and keeping them informed.
>> MOSA THEKISO: Yeah. We have a very robust social contracting program, and a big part of that comes in when rolling out various products and services. For example, we have a mom‑and‑baby app, which is essentially health care; it tracks pre‑ and post‑natal development. As Stella has already mentioned, for those kinds of services to actually go out, you need smartphone and feature phone penetration, and also the relevant digital literacy. As part of our social contracting, as we roll out various products that cut across various sectors, health care and education, we partner with Cloud service providers on education as well, for example, and those are just some of the programs we're rolling out that look at those specific issues. Hand in hand with that is obviously educating the consumers and users of those products on their rights: what we do with their data, how we look after their data, and also how they can hold us accountable when it comes to their data. If they're not comfortable with, or don't understand, what we do with their data, they have recourse through various channels to approach us so we can educate them on that.
Yeah. Very robust social contracting program that we are rolling out throughout the continent.
>> JONATHAN ANDREW: Thank you very much, Mosa. We've managed to cover a great deal of ground in this short session of one hour. I want to take the opportunity to thank all of you in the audience for joining us, both here and online. I'm extremely grateful, and the Danish Institute is grateful, to the panel members, Cynthia, Maureen, Mosa, and Stella; thank you for joining us. I think this session has really highlighted the value of collaboration and cooperation: those such as Cynthia, with data protection and privacy lawyers working on strategic litigation but also working with associations on the ground, in communities and with public authorities, to help raise awareness of challenges and educate on issues of data protection; Maureen at the Kenya National Commission on Human Rights with her work reaching out, working with different stakeholders, and also holding the government to account and speaking to public authorities, which is clearly critical work; Stella for her work as Director leading the Uganda data protection authority, where it is impressive to see how much work they've been doing and how many citizens they've reached, helping people bring concerns and seek redress through the mechanisms they facilitate; and also Mosa from Vodacom, who joined us today from South Africa, with wonderful insights into how a business such as Vodacom, working in different jurisdictions with many challenges, different data laws, different privacy concerns, and regulations relating to constitutional provisions, is able to take a proactive and preemptive approach, share best practices across its different divisions, and really engage with communities in making sure that communications are inclusive and that people are aware of their privacy and data protection rights.
Thank you once again. Thank you for joining us. One last point: the Danish Institute is also very willing to reach out and engage with different actors and stakeholders in this space. We have different fora that we facilitate across different divisions, and we have a specific international mandate to engage, so we would welcome it if you speak to us after this session or reach out to us online through the institute. Details are on our website. Please do so, and we look forward to continuing to work with you and moving forward in protecting and promoting human rights. Enjoy the conference. Thank you.