The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> HYRA BASIT: Thank you for being here. I'm waiting on some sort of signal from the moderator at the live venue so we can start. I'm not sure if the onsite moderator assigned to us is available or not, and I'm not getting any messages from here. We can start as soon as we get some sort of go‑ahead ‑‑ oh, okay, something here.

I think I'm just going to go right ahead and start off, because we are behind by ten minutes and I don't want to waste more time. Thank you for being here. I'm Hyra Basit, I am coming to you from Pakistan, and I work at the Digital Rights Foundation.

I will ask all our panelists to introduce themselves. First, I want to give you an overview of what the session is about and what themes we intend to draw out of this discussion.

We have seen that, over time, many states use internet shutdowns and laws to control which social media platforms and digital services are available in their countries, for various reasons. They sometimes decide to ban them or block them for certain periods of time. This can happen in times of crisis or not.

And on the other hand, we have seen examples where certain tech companies have blocked or restricted their operations and content in areas where they did not agree with the political actions of the state.

So you have blockages and withholding of digital services from both sides: from states and governments, and from tech companies themselves.

But in the midst of all of this are the citizens, who are made to suffer in the end. They might not be involved in the actual political decision‑making, and it's not necessarily the case that they even support those moves, but they are the ones who, at the end of the day, are deprived of access to financial systems, to social media, and especially to health care and educational systems.

Being cut off from the services that connect them to the rest of the world is an infringement of their rights ‑‑ their digital rights, their human rights. But the other consequence is that people who aren't in that country, in all parts of the world, also suffer, because we lose very important live information that we can get from citizens on the ground, whether it's through live streaming or citizen journalism ‑‑ videos of the events transpiring in that country at the time. Especially when it's a crisis, even a humanitarian crisis.

Which is why I wanted to put up this discussion, and I've invited panelists who I think will be able to give us a lot of insight into what rules or frameworks should govern the governments or states that impose these shutdowns or bans, and tech companies' decisions whenever they decide to withhold their services.

So I'm going to call out each of our panelists one by one and ask them to give a short introduction of themselves, and I hope that I pronounce your names right. If I don't, I'm very sorry, please correct me. I'll start off with Mona.

>> MONA SHTAYA: Thank you, Hyra. I'm Mona Shtaya. I'm working at the Center for the Advancement of Social Media, leading advocacy and communications. I'm glad to join you in the session today, thank you.

>> HYRA BASIT: Tanzeel.

>> TANZEEL KHAN: I'm Tanzeel. I work as a representative of the foundation and as a software engineer, and I do work related to digital rights and internet shutdowns. I look forward to joining the discussion, thank you.

>> HYRA BASIT: Thank you, Tanzeel. Eliska?

>> ELISKA PIRKOVA: I work as the Freedom of Expression Lead at Access Now, and it's a great pleasure to be here today.

>> HYRA BASIT: And Nighat Dad.

>> NIGHAT DAD: Hi, everyone. I'm from Pakistan, I run the Digital Rights Foundation, and I also sit on Meta's independent Oversight Board.

>> HYRA BASIT: I'll just start off by posing the first question to Tanzeel. Limiting digital services is a major disrupter in the lives of ordinary citizens, but what actions by a state or government can possibly justify imposing such restrictions on all the residents of a country?

>> TANZEEL KHAN: So I can't think of any reason that can justify imposing restrictions on all the residents of a country. Indeed, there could be some cases where there's a chance of violence or a threat to human life, but then some restrictions could be imposed in that specific area. I don't think the internet should be shut down as a whole; maybe some services could be restricted. But I have not seen any situation where the real reason for shutting down the internet was a threat to human life or a conflict between two communities. In the multiple cases I have seen and experienced, when the people are against the government's decisions ‑‑ when people are living under occupation, as in the case of Kashmir ‑‑ the government shuts down the internet to stop the spread of information and to stop people from sharing whatever is happening to them on the ground. So yeah, I can't think of any other reason. I don't think it should be shut down as a whole, thank you.

>> HYRA BASIT: Can you speak a little more about the situation that you are facing in Kashmir right now? What is your personal experience of governments and internet shutdowns, and specifically of platforms?

>> TANZEEL KHAN: So one experience that I remember is from 2019, when the internet was shut for three months ‑‑ and not just the internet, but other communication services too, including phone and SMS, and the post office was shut.

The reason for that was the government's decision to remove one article from the Indian Constitution and integrate Kashmir with India. Before they shut down the internet, there was no violence happening; there was nothing that could justify shutting down the internet. But the government shut it so that people couldn't assemble, and so that people not living in Kashmir wouldn't get to know what was happening to the people in Kashmir. I was not in Kashmir then and was not able to communicate with my family for at least three months.

And not just me ‑‑ thousands of people who were not in Kashmir didn't know what was happening. No media persons were allowed; analysts were not allowed in Kashmir. All the communication services were shut. I think that is one example of the government suppressing the voice of people who don't have authority, who don't have power in the government or in any organization. The only way they have to speak, to share their experience, is the internet, and the government shuts that down as well. That's the experience that I've had.

>> HYRA BASIT: Mona, I think I'll pose the same question to you. Are there any situations where you feel states or governments are justified, or do you agree with Tanzeel on this? What has your experience been?

>> MONA SHTAYA: So in Palestine, the context is different. We are not talking about internet shutdown, but rather about restrictions on freedom of expression ‑‑ how people express themselves on social media platforms and access specific digital services; for example, we can't access Venmo, which is owned by PayPal. Social media platforms are where people can express themselves and share documentation of the human rights violations they are exposed to, and in cases where people are living under colonization, in a colonial context like Palestine, it's tricky to give justifications or excuses for why governments ‑‑ whether military occupation governments or local authorities ‑‑ deny people their right to freedom of expression. We can't give excuses for that. People should be able to share their opinions and the documentation of the human rights violations they are exposed to on the ground, on social media, freely and openly, and also with quality, affordable internet.

Here we can talk about the colonial context in Palestine, where we are not talking about cutting off the internet, but about a context where Palestinians have no control over the internet or the ICT infrastructure in Palestine. This means that if the Israeli government decides to cut off the internet, or to prevent us from having access to the internet, they can do that easily, because the Israeli government controls the whole ICT infrastructure in Palestine ‑‑ which denies us as Palestinians access to affordable, quality internet in the occupied Palestinian territory.

That said, when we do have connectivity to the internet, that doesn't mean we are sharing our opinions or the documentation of human rights violations freely. Certain governments ‑‑ in our context, the Israeli government ‑‑ practice their power to put pressure on different tech companies to oppress Palestinian voices: as they oppress us on the ground, they also oppress our voices on the social media platforms. In 2015, the Israeli authorities established what they call the Israeli Cyber Unit, which works systematically to monitor Palestinian content, send requests to the social media platforms to take down Palestinian content, and oppress the Palestinian narrative on online platforms.

Keep in mind that when you are living in a colonial context, you are the weaker community and you don't have access to international media coverage, which means social media platforms are an open platform for you, where you can share your narrative and your stories freely and openly. But this is not the case when we have the Israeli Cyber Unit. In numbers: in 2016 they sent around 2,400 requests to the social media companies, and they have increased their work, so in 2019 ‑‑ the most recent numbers we have ‑‑ they sent around 20,000 requests. And when we say request, we mean a communication message, not a single piece of content; each communication message might contain thousands of posts.

Based on what the Israeli Cyber Unit itself acknowledged, the social media companies are approving 90 percent of those requests, which is basically censoring Palestinian voices. That means the narrative on the internet is not fairly representing Israelis and Palestinians; we have a mono‑narrative, where the Israeli narrative is online and the Palestinian narrative is oppressed and taken down. The latest report by Business for Social Responsibility (BSR), which investigated how Facebook moderates Palestinian and Israeli content, confirms there is bias in how Facebook treats Palestinian content: they are over‑moderating and over‑enforcing that content while under‑enforcing Israeli Hebrew content. So whether you have the internet cut off or censorship of your voices, at the end of the day you have limited access, or you are denied the ability to tell your narrative, your story, on the internet ‑‑ one party has more space than the other, and this is a reflection of the human rights violations we have on the ground.

I think you are muted.

>> HYRA BASIT: Thank you. Yeah, that's an interesting point you picked up about infrastructure ‑‑ about having access to and owning infrastructure, and what benefits that can have, not just for access to the internet but for affordability as well.

Nighat, would you like to add anything? We have had internet shutdowns here in Pakistan as well. In some situations, do you think they would be justified? There are various reasons given by the Pakistani government ‑‑ what are your thoughts?

>> NIGHAT DAD: I agree with what Tanzeel said and what Mona already mentioned, and I think we really need to first see the context ‑‑ who is more powerful. Governments and states are more powerful than users, especially in crisis situations or in conflict zones. They are the ones who set the rules; they are the ones who make the rules, regulations and laws, and who justify these shutdowns. And, as Mona mentioned with the Cyber Unit, we are not just talking about one jurisdiction ‑‑ law enforcement around the world has this power and access to the companies, asking not only for users' data but also for content to be taken down. Thank you, Mona, for mentioning the decision that we at the Oversight Board gave in the Al‑Jazeera case; it's basically based on all the work you all have been doing for so long, holding companies accountable ‑‑ pushing them to be more transparent and accountable to users in terms of what they are taking down and how they are responding to governments' and law enforcement's requests.

So governments can come up with their own ideas around rules and regulations ‑‑ we have seen this in our own jurisdiction, and we are seeing it all over the world ‑‑ and they justify these shutdowns in the name of national security or public order, saying there could be political turmoil, or a threat to the public at large, or a threat to other actors, and that's why they are shutting down the internet. But the thing is that there is a framework available, which many countries have signed onto: the international human rights framework, the different treaties and conventions ‑‑ the ICCPR is one. So‑called democracies are the ones that have not only signed onto it but ratified it.

Basically, it tells governments and states that if there are certain things they want to do, they can do so, but there is a threshold. The threshold is this: there must be a legitimate aim, in the sense that there should be an appropriate legal framework that authorizes such internet shutdowns for certain purposes, and those purposes should be very, very specific. And this principle of legality is a fundamental principle of international human rights law that requires any interference with human rights to be prescribed by law.

In addition to that, any interference with the right to access or the right to free speech must be proportionate. We have been seeing disproportionate measures ‑‑ what Tanzeel was describing: shutting down the entire internet, blanket bans and blanket blocking ‑‑ and that doesn't justify governments' actions. Any shutdown by the state which has an impact on its citizens' fundamental rights must demonstrate, in a specific and individualized fashion, the precise nature of the threat. What threat are they trying to address ‑‑ public order, national security? We have all been seeing how governments interpret these ambiguous terms.

So of course proportionality and necessity ‑‑ all these frameworks are there. It's actually the political will we are talking about. It's not that the governments don't know; they have ratified all these things, they know there is this threshold. But how much willingness there is to follow it, I think, is the question.

>> HYRA BASIT: All right, thank you. Just because this discussion is headed that way ‑‑ Mona, you've already touched on this before ‑‑ do you have an idea of where the fine line is? To what extent should companies, social media companies or other tech companies, comply with the laws or requests of a government that wants to restrict their activities? And when should those companies say: we don't agree with the laws you're trying to implement, we don't agree with how you're trying to suppress people's rights, and we are not going to comply with what you're saying anymore ‑‑ it might be better at this moment to tap out? For Mona, yes.

>> MONA SHTAYA: Thanks for the question. Actually, I think the baseline we start our discussions and our advocacy with the social media companies from is usually international human rights law, as well as the business and human rights principles. Those are the starting lines. The reality, however, is a bit different, because how social media companies operate and how governments send those requests reflects something else: it reflects the political and economic interests of the social media companies as well as of the governments.

I can give you a few examples of this. In our region, in 2010 and 2011 when our revolutions started, the young people in the Arab region thought they were escaping the authoritarianism of the regimes they had been living under for decades. When they escaped that and started organizing and mobilizing themselves online and reflecting that on the streets, they were shocked by a new authoritarianism from the social media companies, which have their own community standards that they apply to content.

The fine line we are mentioning ‑‑ where social media companies should take responsibility for moderating their content ‑‑ is whether this content could be reflected in, or extended into, real‑world harm. In the case of Palestine and Israel, they over‑moderate the documentation of human rights violations that Palestinians are exposed to, while under‑moderating Israeli hate speech, violent speech and incitement against Palestinians and Arabs, which is reflected and extended into the offline space, into our real life, as a form of violence. Then we can see there is a problem: they are taking down evidence of war crimes ‑‑ documentation that the ICC or any human rights entity investigating war crimes would go back to ‑‑ while the incitement, hate speech and violent speech against Arabs and Palestinians, which creates labels and stereotypes against certain communities, stays online, because it was a powerful government that sent those requests to the social media companies to further oppress marginalized communities.

Because, as marginalized communities, as colonized communities, we don't have the power. If we think about power: Israel's ad purchasing power is equivalent to that of Palestine, Jordan and Egypt combined. This is based on 2019 numbers.

And when we are talking about ad purchasing power, we are talking about one of the major things that keeps the social media platforms running ‑‑ selling ads is basically the core of their business model. So here we are talking about the political and economic interests of the companies as well as the governments.

If we take another case, the Russia‑Ukraine case ‑‑ I think Eliska is much more expert in this than me ‑‑ the power dynamics there, and how social media companies took a side, stood with Ukraine against Russia, and took strict action to moderate content during the war, reflect how they followed the direction of big, powerful governments like the U.S. government in taking a side, the side of Ukraine, rather than staying neutral, quote unquote. And those measures also came late ‑‑ we are not debating now whether they were good or bad, but they came late, because for a long time civil society in Ukraine had been demanding that tech companies take action to protect people there. The companies were never serious about that until the governments took strict actions; then they took strict actions.

So it's how they operate ‑‑ it doesn't reflect how they should have worked with certain communities and certain countries.

>> HYRA BASIT: Eliska, you've also co‑authored a declaration on principles for content and platform governance. So I'd like to hear what your experience or opinion is on how tech platforms should behave in times like this.

>> ELISKA PIRKOVA: Thank you. First I would like to respond to your initial question about whether measures such as internet shutdowns could be justified in the light of international human rights law. The very short answer to that is no, precisely because such arbitrary measures can under no circumstances meet the proportionality and legality requirements that were already mentioned by previous speakers. This is not only the opinion of Access Now; it is the opinion of international human rights monitoring bodies, including the Special Rapporteurs and the OSCE Representative on Freedom of the Media, and of the European Court of Human Rights, which has elaborated what criteria have to be met if a state resorts to blocking websites ‑‑ and it's questionable on what basis it ever should. There were recent judgments, issued by the court in 2020, that condemned arbitrary blocking of websites.

So, very quickly, back to your question ‑‑ and Mona has already raised a couple of very important issues. Indeed, Access Now launched, during the IGF here in Ethiopia, the declaration on principles for platform responsibility in times of crisis. I also would like to remind everyone that it's not only about the responsibility of platforms or private actors. Of course they have responsibility, especially under the United Nations Guiding Principles, but it's also about the state's positive obligations to protect individuals against interferences that unjustifiably violate basic human rights.

I will touch upon the responsibilities of platforms in a minute, but I really want to get this across, because we often see, especially in cases of internet shutdowns, that grounds such as national security ‑‑ or even efforts to protect individuals against the spread of disinformation and online hate speech ‑‑ are invoked as justification for ordering these arbitrary measures. Of course, this is simply abusive practice by states that cannot meet any requirements under international human rights law. A similar trend can be observed when states arbitrarily criminalize different acts of speech, such as disinformation, using terminology such as "fake news", which enables them to silence the voices of vulnerable groups, activists and others in the online environment.

Going back to your original question of what the platforms can do: they can do much more than they are doing right now.

We often see very knee‑jerk, ad hoc responses to crises, usually at the moment when the crisis escalates and when there is ongoing public and political pressure on platforms to finally start taking steps to create a safer online environment, especially for those impacted by the crisis ‑‑ and that's pretty much everyone.

In our declaration, we took the approach of dividing up what platforms have to do before a crisis, what they are required to do during a crisis, and what steps to take once the crisis de‑escalates. We are not trying to create a hard end of crisis; our recommendations are based on the due diligence platforms should comply with. That's precisely because we want platforms to mitigate the risks that stem directly from their systems and processes, such as algorithmic content moderation and curation, but also from the resources they invest in content moderation and their content moderators: not enough languages are represented, and moderators don't always have a proper understanding of the social, political and historical context of the country. These are issues civil society has flagged for a very long time, but we still see very little progress in the field.

For us, one of the main tools that platforms absolutely have to deploy is risk assessment measures that can evaluate the human rights impact of the systems they deploy, but also meaningful transparency criteria around changes to their content governance policies ‑‑ which we saw in Ukraine, when they started adopting different carve‑outs in order to create a safer environment while being very non‑transparent about those changes and how they were implemented. Indeed, we often see platforms responding much faster where there is regulatory pressure and lots of political attention, especially from the Western world, in comparison to other parts of the world ‑‑ especially the global south, where we unfortunately observe a lot of negligence from platforms when it comes to crisis response mechanisms.

The final point I would like to raise on what platforms can do is to elaborate on meaningful engagement with civil society organizations and trusted partners that operate in regions suffering crisis or other challenging circumstances.

We often see platforms engaging at the moment when it's too late ‑‑ without having developed any proper system of, for instance, quarterly consultations, or follow‑up mechanisms for the recommendations that trusted partners and organizations with relevant expertise keep feeding into platforms long ahead of the escalation of a crisis. There are no mechanisms in place to check whether the recommendations delivered to platforms are properly addressed and translated into the content governance policies applicable during a time of crisis. Our declaration puts forward a number of recommendations for how this system could actually get better and become truly effective.

Again, these consultations cannot start at the moment the crisis escalates; they should be put in place before the escalation occurs, and also in the post‑crisis phase.

I'm looking forward to your questions, thank you.

>> HYRA BASIT: So some of the things highlighted in this declaration you co‑authored are that there should be an equitable, fair and consistent approach to engagement; that there should be greater transparency whenever tech companies or social media platforms receive requests from governments ‑‑ transparency is paramount, and they should disclose any government requests; and that there should be a fair and impartial way of handling content moderation. And I think you mentioned something about language ‑‑ not just focusing on English, but giving higher priority to non‑English languages. Tanzeel, my question to you: from your experience in Kashmir, what would you have liked tech companies to do whenever you felt governments or states were imposing too many restrictions? How should tech companies respond instead? Have you seen that gap?

>> TANZEEL KHAN: So I think tech companies need to have a set of global guidelines and principles to follow, not just blindly implement government mandates. As we are seeing in regions under occupation, people do not have authority or decision‑making power, and ruling governments make every effort to suppress people, both online and offline. So I think tech companies need to understand ‑‑ they already know, but they don't care ‑‑ that they need to make exceptions in the case of regions under conflict, like they did, as Mona mentioned, in the case of Ukraine.

I think that exception has to be there in cases of conflict like Kashmir, Myanmar and many other regions where the ruling governments are occupiers rather than representatives of the people living in that region. In 2018 or 2019, around 1 million tweets were taken down because the Indian government dictated it, and there was no option to appeal or to remove the ban they had imposed.

Also, more recently, in 2022, we found that at least 160 accounts critical of India's actions in Kashmir were suspended by Twitter. Most of them you cannot access inside India.

We found that earlier, if you changed your location from India to another country, you could access those accounts. Now, even if you change your location to another country, you still won't be able to access them; they are withheld in India.

So yeah, as mentioned, understanding the context is important when moderating content online. We believe the internet should be open and accessible to all, and I think it is the responsibility of the tech companies dominating the internet to keep it that way. Collaboration with civil society organizations could be one way for them to get help when government authorities ask them to censor certain kinds of content online. I think civil society and the people living in the region need to be involved, not just the government authorities. That way, content online can be moderated better.

>> NIGHAT DAD: Can I add to what Tanzeel said? We have been talking about how much transparency and accountability we need from both tech companies and states ‑‑ from states because they are more powerful than the tech companies, and from tech companies because they are not only powerful but rich. In an ideal world, users should be at the center of all the decisions they make, but in the actual world that's not happening.

And I think that's why several civil society and digital rights organizations ‑‑ I've been working on this, you are working at DRF, and it's been a decade that we have been saying the same thing over and over again ‑‑ and now other organizations around the world are also joining in, raising basically the same concerns. But I think we need real solutions. What we are seeing from governments is that, in the name of holding tech companies accountable, they are coming up with regulations and laws that are also really problematic for users in those jurisdictions, in the sense of how those states interpret those regulations. What they say is: we are holding tech companies accountable because they are not really responding to us, not really responding to our jurisdiction as opposed to the global one. Which is true.

At the same time, the impact of those regulations on users in more authoritarian regimes is multifold. I think users in the global majority, in conflict zones and in authoritarian regimes are more vulnerable than users in other jurisdictions. They are dependent on the platforms and at the same time dependent on their governments. So they are really in a very tough spot.

I would like to mention a case we have just decided at the Oversight Board; it's basically an example of how much accountability we need from these companies. And it is just one example, for Meta ‑‑ we do not address everything that is wrong with the company, it's only content moderation decisions that we are making.

But at the same time, how many other companies are actually holding themselves accountable ‑‑ do they have such self‑regulatory models? I think that's something we need to push for, besides regulations and the other things coming up to hold tech giants accountable. We need models which are independent and which have actual power to hold companies accountable and tell them that what they're doing is wrong ‑‑ the report around the Palestine‑Israel conflict is one example. Just two weeks ago we released a decision around UK drill music, where the company turned to the Oversight Board after UK law enforcement ‑‑ the Met ‑‑ had asked it to remove the video of a rapper. It was originally removed from Instagram, but then they came to us asking how to deal with such situations where law enforcement reaches out to them.

We decided the case and raised our concerns about Meta's relationship with law enforcement, which has the potential to amplify bias.

We also made recommendations in that case with respect to due process and transparency around these relationships. It's not only with regard to that particular case; it applies to all the jurisdictions Mona and Tanzeel have mentioned. What are these relationships with law enforcement, with governments, and how do companies deal with them?

You and I know at DRF how this happens in our jurisdiction. I think it's very important for us to keep holding these companies accountable: are they really transparent about those relationships, and what requests are they accepting from law enforcement? In that case, we also found that no reasonable threat had been raised by the UK police that would justify taking that content down. Please go to the website and read the decision, because we have given multiple recommendations to the company and raised our own concerns around these relationships with law enforcement.

>> HYRA BASIT: Nighat just mentioned how some citizens are more heavily dependent on social media platforms in order to raise their voice, to protest, to seek support from other countries and governments, because the situation in their state is such that they're under attack. So they're very reliant on these platforms. But tech companies, as we have seen in the Russia‑Ukraine situation, have decided to pull out as a form of protest against a state's actions.

Eliska, what do you think ‑‑ what framework can define that balance, when companies are deciding whether their presence is helping the citizens there, or whether pulling out or restricting their operations would be more beneficial as a form of protest?

>> ELISKA PIRKOVA: I definitely agree that, especially when countries find themselves in a situation of crisis, whether that's armed conflict or another type of social unrest, platforms are often a last resort for any potential access to effective remedy, for the right to appeal, or just for making one's voice heard. That also covers the documentation of human rights abuses and violations ‑‑ there are often no other proper channels for accessing this information, and we gradually see more and more international bodies relying on this type of evidence in order to establish proper accountability.

At the same time, I also agree that the state is often the main perpetrator of violence, with direct access to the means of violence, and the consequences of that are really severe.

One thing platforms and companies can do, in order to manage the political and public pressure on them to leave a country, or to avoid that last resort of exiting the country, is very much what the declaration proposes: to put in place certain measures that will help them understand the nature of the crisis. One measure specifically mentioned in the declaration is crisis protocols, which platforms can develop, again, before the crisis escalates. These crisis protocols should then be deployed across all levels and likelihoods of risk, and be designed to prevent and mitigate the potential harm that will keep growing once the escalation of the crisis is fully blown.

And there are a number of other human rights due diligence measures that can be put in place.

When it comes to entering or leaving a market where platforms operate, where users really rely on them and they become the live source of information, platforms can also conduct risk assessments of those markets, to understand what kind of countries, regions and territories they are entering before they start operations there. The same risk assessment measures should be in place once they decide to leave a market, because if they do so without any sensitivity or proper understanding of the situation, that too can cause significant harm to individual users.

Then there are a number of recommendations that touch upon the already mentioned business models of companies ‑‑ the data‑harvesting business models that are of course translated into content governance policies and content moderation systems, or that simply lie at the core of these companies. Of course, they are private companies; by default they prioritize profit over the protection of human rights.

Specifically, the set of human rights due diligence safeguards they should put in place can also help mitigate the negative impact of surveillance‑based advertising, especially if they operate in areas impacted by crisis. We saw a couple of examples of this during the illegal invasion of Ukraine, when Russia specifically banned some advertising activities of these platforms on its territory.

That is, of course, a very strong lever for states to use ‑‑ going after the profitable activities these platforms have. So there are a number of other measures that can be put in place, especially user‑empowering measures, to mitigate the negative impact of business models. Transparency is one way to go about it, but giving users more proactive tools ‑‑ and more tools that protect especially at‑risk users in times of crisis ‑‑ is also essential. Many of those are reflected in our declaration too.

>> HYRA BASIT: Thank you.  We are very short on time, but Mona and Tanzeel, I want to see if you have any closing remarks before we take questions from our audience.

>> MONA SHTAYA: I couldn't agree more with what Nighat and Eliska said, especially about times of crisis. I strongly agree with what Eliska said about how social media companies operate ‑‑ knowing about certain conflicts and certain contexts and not taking action until there is, like, an invasion. Unfortunately, this is also a global south/global north issue: it matters only when it comes to the global north; when it comes to the global south, it never matters to them. Take Russia and Syria ‑‑ Syrians have been in such a situation for years, and the platforms never allowed the people of Syria to speak out in that way against the political leaders and president of Russia; that only happened for Ukraine, which is really heartbreaking. I don't think the way social media companies operate is fair to all their users, even though they claim to be an open space for everyone.

I do think that to have an open space for everyone, we should continue our advocacy efforts, because social media companies are not going to stand by international law or the business and human rights principles ‑‑ which we as human rights defenders have been calling on them for years to take as a baseline in their work ‑‑ without real advocacy: standing shoulder to shoulder with our allies in the global south, but also with progressive allies in the global north who are calling for decolonizing digital spaces, decolonizing how tech companies work and operate.

I think decolonizing is not a cliche; it could be a real and really useful frame for how social media companies operate. But we should implement it in a very careful way, so as not to repeat how the system works now ‑‑ where regulations from the global north are set and then affect people in the global south without their participation, without even thinking about people in the global south or taking their context into consideration. I'll stop here, but decolonization is a whole topic that I think we could keep talking about.

>> HYRA BASIT: Tanzeel, really quick, do you have anything to add?

>> TANZEEL KHAN: Yeah, I think everyone has already said what I was going to say, but I would like to add that we are seeing some changes. If you go back four or five years, people were not aware of their rights online ‑‑ content moderation was something people just didn't care about. I think that perspective has changed, and people are starting to question it when something unusual happens to them online. So I think we need more evidence, and we need real action to hold governments accountable, because, as I have seen from these areas, many organizations like Access Now have been trying to collect data and look at how internet shutdowns, censorship and other such measures impact people online. But there hasn't been much impact ‑‑ governments aren't doing much in this area. I think we need to focus more on that, so that people who don't have the space to share their experience and information offline can have some space to speak. Thank you.

>> HYRA BASIT: Thank you, Tanzeel. Bruna, who is with us at the venue, can you please see if there are any questions for our panelists?

>> BRUNA TOSO de ALCANTARA: Just checking to see if anyone would like to ask any questions ‑‑ please raise your hands.

So far, no questions in the room, but yeah.

>> HYRA BASIT: Okay. Since we are exactly on time now and I don't want to overextend, especially if there's a next meeting in this room ‑‑ if anyone has anything to add very quickly at the end, please do so, or we can end the meeting right here. Thank you all so much for your contributions; I think it was a very fruitful discussion, a very important discussion to be having. Eliska, Mona, Tanzeel, and Nighat, thank you so much for being here.