The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.
>> MODERATOR: Indeed, we can start. Hello, everyone. Welcome to this session on Security Meets Responsibility, Connected Everything's Resilience. As you know, I am Lucien Castex, the representative for public policy at AFNIC, the French ccTLD. I will be moderating with Samih Souissi from the French agency. I invite participants to ask questions directly in the Zoom chat; we will also have a Q&A session after the panel discussion. It is important for this session to be interactive.
Our topic today is the IoT, Internet of Things, security and resilience, which is, let's be honest, quite a broad topic.
IoT has expanded substantially, and the diversity of connected objects is now both a technological challenge and a political one, from the smart home to the smart city. The COVID‑19 pandemic has shown us the essential role of the Internet in our daily lives.
Despite the noticeable increase in traffic during the first lockdown last year, networks have held up and the Internet has proved to be resilient. Nevertheless, digital products and services continue to evolve and expand toward a connected everything. Building trust is as important as building new infrastructures. With the emergence of new technologies, there is a shift in priorities and a need to understand resilience, security, and sovereignty.
>> Samih Souissi: There is also legislation impacting the Internet worldwide. Think of the European Union: we have had more legislative packages introduced, like the Digital Services Act, the Digital Markets Act, and the Cyber Resilience Act. Take for example the United States: we have IoT-specific guidance and the Internet of Things Cybersecurity Act of 2020, with requirements bearing on IoT manufacturers. Securing this ecosystem has been of paramount importance. There is a drive to foster a trusted IoT ecosystem, from cybersecurity by design to human rights by default.
In order to identify the issues and challenges facing us, we are pleased to have in our panel today three renowned experts who will, I am sure, provide us with interesting insights. We are happy to have these VIPs with us. We have Luca Belli, PhD, Professor of Internet governance at the law school where he directs the Center for Technology and the cyber project. He is a researcher at (speaking non‑English language) and editor of a law journal published by Oxford University Press, and a member of the Latin American edition of the Computers, Privacy and Data Protection conference.
We have Rayna Stamboliyska, the founder and CEO of RS Strategy, the advisory firm that nurtures certainty. She focuses on EU diplomacy and resilience through cybersecurity, strategic autonomy, and data protection. She is an award-winning author whose book (?) was published in 2017, and she staunchly supports open source, open data, and open science. She has consulted for international organizations, private companies, and nonprofits, and is recognized as an international speaker. We are happy to have her with us today.
Last, we will have Jean‑Jacques Sahel. Hopefully, he is with us now; we will have to check. He has been the head of information policy for Asia‑Pacific at Google since 2019, overseeing Google's public policy approach in the region on issues of misinformation and intermediary liability. Before joining Google, Jean‑Jacques Sahel headed ICANN's Europe office and led strategy across the European region.
Let us start with our presentations, beginning with Luca Belli. The name of our Town Hall is Security Meets Responsibility, Connected Everything's Resilience. You recently published a paper on the responsible IoT. So my question for you is: how can IoT impact fundamental rights, and how can tech businesses do better in fulfilling their responsibility to respect human rights?
>> Luca Belli: Thank you to everyone and to all the friends who organized this timely discussion. As mentioned, my presentation is based on a paper I published a couple of years ago, but it is still very relevant — sadly, I would say, because not much has evolved over the past couple of years. The name of the paper was "The Need for a RIoT", where RIoT was the acronym for Responsible Internet of Things. This is the framework, and I will provide examples from the group of countries I analyze, and from Brazil, where we have developed a lot of recent research.
There are three points I would like to highlight today. First, we need to focus on what the doctrine defines as complex IoT environments: systems that integrate many connected devices through platforms that automate those devices and orchestrate them to provide smart applications. These are powered by AI systems and feed into big data analytics. We can have smart homes, smart buildings, smart cities, smart grids — systems that can be enormous in size.
This leads to the second point: with this expansion in size comes an expansion in the attack surface of these systems. And this means there are both risks and responsibilities for the actors behind the systems, and duties for the States, which have a duty to protect human rights.
The final point will be a couple of suggestions on how public and private actors can fulfill, respectively, their duty to protect fundamental rights, in the case of public actors, and their responsibility to respect human rights, in the case of the business sector — with regard to privacy and security, but also with regard to a host of other rights that may be compromised by these systems when they are not properly designed and implemented.
So when we speak about IoT, what is particularly relevant is the system of connected devices — maybe millions of connected devices — with embedded sensors. These are based on increasingly fine-grained, ubiquitous, and voluminous data that can be collected and processed. This enhances efficiency in your home, your grid, your city.
On the one hand, the big data and AI systems that power these environments are essential enablers of their value. On the other hand, the relationship is complementary: the IoT is essential and instrumental to realizing the potential of AI and big data, feeding those systems with data that may be personal or nonpersonal.
This bidirectional relation is essential, because we typically focus on the things, the connected objects. But we should actually see IoT as a collection of technologies that expands the reach of the Internet, and the reach of AI, into the physical world. Here we enter the second consideration, which is not only about privacy and surveillance: the IoT blurs the distinction between online and offline, and this enormously expands the potential for risks. The Oxford philosopher Luciano Floridi would call this the "onlife" paradigm, merging online and offline. From a cybersecurity perspective, this demands that we rethink security completely.
The past eight years of Russian attacks on Ukraine have provided us with many examples of how IoT systems can be hacked and weaponized to wreak havoc in a country. We have large-scale examples, from the sabotage of the Ukrainian grid in 2015 and 2016 to the NotPetya malware attack of 2017, which paralyzed the banking system and disabled some radiation monitoring systems at Chernobyl. That makes us understand the risks we are facing.
It is essential that we embed cybersecurity as a core item when we design and implement these systems; otherwise, we are not building smart systems, we are building time bombs.
I am sorry to be so blunt and to sound pessimistic, but the past decade corroborates what I am saying.
The last consideration is that the State and public bodies have a duty to provide for security, privacy, freedom of association, freedom of movement. All of these rights can be impacted when IoT systems do not function properly and are not secure. Corporations have a responsibility to respect these rights, but they need guidance to do so. This is why multistakeholder partnerships are not only desirable, but essential for the implementation of proper cybersecurity.
Well-designed normative frameworks are really the cornerstone that States should provide in order to achieve system resilience: to avoid attacks, to define clear obligations, and to require periodic audits checking whether systems are still secure. What is secure today may be vulnerable tomorrow or in one month.
It is essential to assess and provide the confidentiality, integrity, and availability of information, and to provide data privacy and security.
Legal frameworks are essential, but as a lawyer myself I can tell you very honestly: legal frameworks are only the basis. They are necessary but not sufficient.
Among the many challenges is achieving cooperation and implementation. I will give you a couple of examples to conclude. One is from Brazil, where there is a nice data protection law, adopted in 2018 and in force since 2020, that mirrors the EU GDPR and includes an obligation of data security by design in Articles 46 and 47.
One might say: fantastic, Brazil has data security obligations, a new authority, a new law. But the authority has never said how they are to be implemented. Most systems remain insecure because there is no guidance; every entity is basically in the dark, because they do not know how to comply with the law.
Here is a counterexample to conclude, about China. China is criticizable for many things, but in my view it is the only country that has always considered cybersecurity seriously and strategically. They have a system of public entities that coordinate to strategize on how to implement information security and cybersecurity.
So digitalization and cybersecurity are two sides of the same coin; this is how it should be seen. Another very important point to take from the Chinese system is that they do not consider normative frameworks as the key, the solution, but only as a basis on which the other vectors of regulation have to be built. It is something very visible for those who work with regulation: law is a very imperfect instrument of regulation.
You have to couple it with investments, which direct how the technology will be built.
If you want technology to be secure, you have to invest — billions, if need be — in that technology. And you have to translate the normative frameworks into technical specifications for developers. That is a lot of what they do in China: translate norms into specifications, technical standards that can be understood by developers. Otherwise, compliance is impossible.
Thank you very much. Sorry if I spoke too much; I hope I stayed within the 10 minutes I had. I look forward to the discussion.
>> Lucien Castex: Thank you, Luca Belli, you were perfect. That is quite a topic indeed, when you speak about human rights by design, privacy by design, security by design. You gave us quite a number of interesting reflections on that, drawing on Brazil. Law is quite an imperfect instrument indeed.
I would like to give the floor to Rayna Stamboliyska. Now, legislation is present in the realm of the Internet of Things: a wide range of guidelines, best practices, and legislative proposals exist to help improve the security of connected products and services. What are your reflections on that, Rayna Stamboliyska?
>> Rayna Stamboliyska: Hello everyone. Thank you for having me; I am glad to be back at IGF again. There are different aspects to consider when we talk about norms in general — legislation is one kind of norm. One word about why I am interested in this, building on what Luca Belli was just saying: we need rules to live together as a society. But if a big part of the people do not understand those rules, we cannot live normally and healthily as a society. That has been the motivation for my work: to bridge that gap, translating into human language what lawyers, legislators, and policymakers say on one side, and what technical people say on the other.
Because with technical people it is the other way around: we are so drowning in technical slang that people cannot understand what the whole thing is about. So without repeating Luca Belli's definition of IoT, I would like to offer a complementary reframing, which ties directly to what I just said: IoT is an ecosystem. It is basically two types of ecosystems. One is an ecosystem of technical and technological items. You have the object itself, and that object connects through different protocols and connections to services. The data that travels through those connections is stored and processed to produce different services, whether through machine learning or simple algorithms. Ordering or timestamping messages in a chat is not AI, but it is still an algorithm.
That is one ecosystem: technological items that interact together.
For those items — we will get back to this afterwards — we are very far from making it clear to everyone, to 100% of the population, what the items are, how they interact, and what is being done to guarantee that they, and their interactions, can be trusted.
The other type of ecosystem in IoT is the users' ecosystem. Say we have five people on stage here: if each and every one of us is asked "what is IoT for you?", there will be at least five different IoTs cited. Consumer electronics, industrial IoT, home appliances, that sort of thing.
And there is something that is neither industrial IoT nor a consumer electronics object per se: the vastly increasing number of wearables, things with sensors that we wear.
These can fall into the health-monitoring realm, if you like, without necessarily being medical devices per se. And for them, so far, there is very little legal framework and guidance.
Because they are not medical devices per se, they are not certified as such; on the other hand, they are not quite ordinary consumer electronics either, yet they deal with sensitive data, because health data is sensitive data. As a reminder, the GDPR defines sensitive data as personal data that relates to political opinions, religious beliefs, and also to the self: my biometric data, my DNA, and so on.
So we are left with a situation where those two very complex ecosystems collide. And at some point, we are having discussions about how we can make those ecosystems trustworthy before they become trusted. (Chuckling).
And about how they can be resilient. We are asking that question quite often through the angle of, let us say, legislation and audits, things like this. The blind spot I am seeing is that, on one side, we have great trouble communicating all these efforts to the 99.99% of the population that is not you and me.
Meaning we do not have labels; we do not have any meaningful, accessible, simple — not simplistic — ways of conveying the efforts we are making. On the other side, we are still struggling with specific aspects of the technology and organization of security. The one that for me is essential, and that crystallizes everything that is not done right, is vulnerability management. This is a very big problem, and a very big remit for IoT, in the sense that vulnerabilities concern not just the software components and all the intermediaries there. While it is relatively simple to ship updates, patches, and fixes for software components, it is not at all that simple to do so for hardware components, for microcode, that sort of thing.
This becomes a particular pain when we look at medical devices, for example, or industrial IoT — connected systems and ecosystems that are vital infrastructure, or that provide critical services to their users.
So we have those issues, for which legislation is necessary, but it is a baseline, a starting point. For us to have responsibility in these ecosystems, responsibility alone is not enough: responsibility means little without accountability. The law says I will get sanctioned, and so on and so forth — but how do I actually implement the law, and hold manufacturers, vendors, and so on to account? This, to me, is a very different question, one that the current legislative pieces keep dancing around but that we have yet to grasp, especially when it boils down to vulnerability management and to enabling users to exercise their fundamental rights and freedoms. The GDPR is an embodiment of how to exercise your fundamental right to privacy.
However, we do not have the equivalent in terms of, for example, security.
So where I am going with this is: how can we, as a global society and community, move forward not just with responsibility but also with accountability? Especially when it comes to non-State actors. Because States need, and have, procedures to make rules that are at least theoretically participative, and it is their obligation to protect fundamental rights and freedoms.
But more and more, we also see the trend where, for innovation to happen, it happens outside the rule of law — I am not talking about constitutions or things like that, although that sometimes happens too; I am talking about fundamentals such as the GDPR. How do we reach those people? This is an open question, because I am too young, or too old, depending on your point of view, to have an answer to that fundamental question. It is a question for me and for all of us, because we need to tackle it as a global community. Thank you. I hope I did not go beyond my time.
>> Samih Souissi: Welcome, Jean‑Jacques Sahel, it is interesting to get feedback from you. Luca Belli said that a legal basis is necessary but not sufficient. With the growing number of connected devices, the technology combinations available today, and the growing worry of many users about data governance and privacy, how do you perceive the need for this increased security and for best-practice implementation, both for Google's infrastructure in your case and, more globally, for the Internet ecosystem? The floor is yours.
>> Jean‑Jacques Sahel: Thank you. Good morning, good afternoon, good evening, depending on where you are. It is nice to see many people I know on this screen. Let me take the question head on: we are in an interconnected age, where our personal lives and our everyday work are surrounded by connections and, increasingly, by connected devices. We really need to make this work. Obviously, it is all happening pretty fast. We need to address malicious actors, but also innocent lack of preparation and insufficient assessment of risk in these areas. How do we do this? To continue from what Luca Belli was saying: yes, we do need basic legislation, but it is not enough. We need a holistic approach to the challenge. The way we look at it from the perspective of our business — but it works quite well as a wider policy approach too — is a multipronged approach with four or five components. I will give you five: products, the wider ecosystem, the role of government, businesses, and then people — people being employees as well.
If you start with products: the way we look at it is that every single product we make should be secure by default. We build in security as part of the development process and try to make each product secure by default. We hope that, as we see more connected devices, others across the value and supply chains will take the same approach to security by default. I am summarizing, of course; I could speak for a while on that point.
Products first, then the wider ecosystem. If we want to create a safer Internet, we need responsible companies, and when other components are needed to make a product work for the end user — including the connectivity part, the telecom routing to the end user — we need to consider that value chain as a whole and make sure it is secure the whole way through. Speaking of connectivity, there have been some interesting cases over the years; I think Luca was referring to the weaponization of connected devices, potential or real.
One of the things we need to think about is the routing of the data. We need resilient systems for the transport of data, and infrastructure that is itself secure and also diverse. We need to think about different types of technology for connectivity and networks, to ensure diversity in the routing of data, and to have good competition at the level of networks and routing, so that we do not end up with too small a number of critical points of failure in how data moves — and therefore an easier path, if you will, for malicious actors to succeed.
We do our bit. We invest as a company in infrastructure that supports the transport of data, from submarine cables to data centers and caches. That is what we would like to see in the wider ecosystem, across the journey of a product or service from conception to use by the end user: continuity of security.
If we think about businesses generally: every business nowadays is increasingly a digital business in some part, whether just for its digital operations or because it is getting online in some shape or form. That is fine; everybody should embrace this and should not be afraid. But we should build knowledge and understanding of cybersecurity risk across the organization, from employees to the leadership. The leadership of an organization today has to be digitally minded: in any leadership role, there is an element of this that needs to be taken into account. Part of that can be done by adopting good practices.
It could be the adoption of security and data protection standards, such as the ISO standards — a good tool for businesses to build the capacity to be secure. And let me be clear: this is not just about companies that manufacture connected devices, but about any business out there that uses connected devices and connected services generally. It means those businesses adopting good practices, thinking about their supply chain and providers, making sure those have strong, secure capabilities, so that we have security throughout.
As for the role of government: they need their infrastructure protected, and to secure how they interact with the public — that is obvious, and it goes through the whole range of elements. Here it is a good moment to note that it is not just cybersecurity in the hardware sense. We also need to think about people and safety in the broader sense: malicious actors can sometimes combine attacks on hardware and software with things like propaganda, a continuum of attacks, and we need to be prepared across that broader understanding of what safety means.
The final one is people — individuals, whether employees or decision-makers. Everyone increasingly needs a culture of cybersecurity. There are steps to protect oneself, one's devices, and one's accounts that do not require much time and cost little: take free security assessments online (you can find them; we offer some), use multifactor authentication, do not let your data be extracted by bad actors, be vigilant and aware about phishing and social engineering attacks, and do not click on unknown links. Many on this call know all this, but for the wider public there is a massive gap, and we as a society should invest there. As Luca Belli was saying, cybersecurity costs hundreds of billions a year; let us invest some of that in awareness raising.
To me, empowering users is key to mitigating these problems long-term. At the end of the day, a user who is aware is a major part of mitigating the problem. That is where I would like to pick up on what Luca Belli was saying: yes, we do legislation, but that is only part of the solution. We need a holistic approach with the other measures in mind, whether technology or awareness raising.
When we think about legislation: whether it is hate speech, poison-pen letters, or propaganda, these are not new — today we might call them trolling or disinformation. We have had product safety laws for a long time, but perhaps we need to reinterpret and reapply them in this new context to make them effective, and combine them with other initiatives and tools to ensure an overall effective approach to the problem.
To sum it up, I think it is really about embedding that culture of online security and safety in our everyday lives and work. Thank you, and thanks for listening. Of course, there is plenty more to discuss, and I am happy to take questions.
>> Samih Souissi: Thank you for the presentations and the different insights. Before starting the Q&A session, I want to go back to the panelists: do you have remarks on what the others have presented? The floor is yours.
>> Lucien Castex: Go ahead.
>> Luca Belli: A quick comment on what was said about the question of incentives. A very important point to bear in mind is that most connected devices are not developed by corporations that traditionally invest in, or know about, software security, cybersecurity, or digitalization at all. Most consumer electronics are developed by producers of toys, household appliances, or cars — companies that were traditionally producers of those goods, not of software. They are unprepared. It is not a question of bad faith; it is a question of lack of means. I do not want to disparage producers of consumer goods: it is a matter of lack of know-how and in-house expertise. The people working there are not software engineers; maybe they hired a couple over the past months or years. On the other hand, consumers do not care much about security; they care about cheap prices and nice design. Cybersecurity is the last concern of consumers. So it is essential to have laws that create an obligation to secure connected devices, but it is also essential to explain how to properly implement the law and to give financial incentives: implement cybersecurity and pay less tax, or create bonuses for those who excel at cybersecurity. Otherwise, I am not convinced the market itself will regulate the Internet of Things. Of course, the enterprises producing software know how to do it securely, or at least understand how, and have an incentive to do so. But the large majority of connected devices are not produced by those companies. Sorry again if I am speaking too much.
>> Samih Souissi: Rayna Stamboliyska, you wanted to intervene? Go ahead.
>> Rayna Stamboliyska: A reaction to the reaction from Luca Belli.
Of course, many manufacturers of now-connected things are not traditionally professionals in this. The Internet has been here for what, 30 years? But that is not the point. The point is we are nearly in 2023. Hiding behind "oh, I didn't know" is not okay either, right?
When everyone was experimenting in the late '90s and early 2000s, it was okay to "move fast and break things", or whatever motto big corporations came up with. In 2023 — we are a month away, right? — it is starting to be largely unacceptable to hear people who want to enter new markets, who want to engage in the digital transformation of their products to diversify and augment revenue, say "well, we will do it quick and dirty because we need money — too bad if people don't care about this." Incentives go both ways. One way for incentives to go, especially if we think in terms of policy, is to say: look, for those of you who accept that time to market can be slower because you want to protect yourselves and your users, we will give value back — by decreasing taxes for a given period of time, or whatever. There must be — I say must, not should — there must be incentives and sanctions. Because, again, it is not acceptable for people like — not to name them — Mattel, with the connected Barbie doll that listened in on kids and allowed anyone to speak to kids from afar.
Those people, even though technology is not their core expertise, largely have the budget to hire for, or outsource, the production of those tools; it is not acceptable for them simply not to embed security into them. And again, users are starting to be increasingly concerned about security, simply because we hear about attacks and breaches every day.
An increasing number of people have been targeted by, or been victims of, such incidents, so they are concerned about this. The problem is that they do not have the means to impose this on the people they buy things from. That is where we have this imbalance: people do want and expect security. When you go to the supermarket, you do not expect to buy food that is past its use-by date; you go without checking the date, because even though you have never read that rule anywhere, you are used to the food in the supermarket being good to buy and eat. We do not have the equivalent of this for technology.
That is where, for me, the problem lies: it is not just that you can have startups that decide to test things out, and so on (audio skipping); the problem is (audio skipping) a way for everyone to see what they are doing. I am not talking about proprietary stuff, not intellectual property; I am talking about saying what you do and how you do it. We have those rules for food. When do we get that for technology?
That is where I kind of disagree with you when you say people don't care. They do; they just don't know how to make their voices heard. And again, we as policymakers also struggle to give value to people who do good. How many cybersecurity marketing efforts have you seen recently? How many companies that make the effort to do cybersecurity right have an actual communication campaign saying "here is how we protect you"?
I could count two or three, which is a shame, right? I mean, why aren't you turning those efforts on privacy and security into a unique selling point?
I see Jean‑Jacques Sahel has a comment.
>> Lucien Castex: We will give the floor to Jean‑Jacques Sahel, but note that in the chat a colleague is asking us about best practices worldwide; we will address that here. Jean‑Jacques Sahel, you have the floor.
>> Jean‑Jacques Sahel: We have ongoing campaigns to help our users with safety and security on our platforms and in the physical world, if you will. Just a couple of points on this. 150,000 scholarships have been given by Google for the Google career certificates, for things like IT support and data analytics, and we run in-person and online safety seminars — not just for students, but for educators, companies, et cetera — and help with the propagation of standards. It goes to the points I was making about supply chains: for the products we are involved with, we look at the entire supply chain, not just what we make ourselves but also the people who work with us. And there is this concept we have in Cloud, which is zero trust: we cannot assume that we should trust any supplier or any particular piece of equipment or service. We have to assume there could be a flaw anywhere, so we have to be doubly careful.
By trying to do that, we try to do it with the wider ecosystem and to spread our practices. There are a number of efforts — sponsoring various communities, including the open source community — that try to do more research into this, et cetera.
There are efforts, but I agree there are not enough. There should be much more out there talking to the general public and helping them. It varies from country to country; some are very good. I sometimes see countries with ads about basic cyber hygiene on buses — it is wonderful. You see that in Singapore, from the local regulator: a lot of safety advice everywhere, very visible. I do not know if it works, but at least they try.
I know there is a lot of phishing targeting Singapore; there is government advertising about that, and I would love to see more of it.
There is a discussion to be had with governments about how we look at product safety. On the point about the laws we already have: perhaps it is about reinterpreting them and thinking about them in a more practical, effective way that works today. There has been a lot of work in the UK on consumer IoT, and it is continuing: they came out with a code of practice in 2018 and another in 2021, and a lot of points came out of that. I think the jury is still out; there is more to think about. There is a bit of both of what Rayna Stamboliyska and Luca Belli are saying — a push and a pull. We need to spread the good practices.
And we need to check that we have product safety laws and are applying them to the right extent when products are defective; there should already be routes for consumers. What it takes is the sort of approach the Council of Europe has taken, where they do not automatically write a new law, but provide guidance in a new setting.
>> Lucien Castex: Thank you, Jean‑Jacques Sahel. I was thinking of a new convention on IoT. Well, time flies; we asked for four hours but sadly could not get them. We want to thank Rayna Stamboliyska, Luca Belli, and Jean‑Jacques Sahel for the great discussion. I was checking the chat and the Q&A to see if anything came in, but no — I guess the statements were very clear.
I would like to thank you all, give the floor to Samih Souissi for a quick word.
>> Samih Souissi: Thank you for the discussion, for the insights, and for the (audio skipping) human rights policy. We had a broad overview of the issue, and the discussion will carry on. I hope we meet at the next IGF or at other gatherings. Thank you very much.