IGF 2023 – Day 1 – Lightning Talk #141 The new European toolbox for cybersecurity regulation

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> So, let's start and dive right into it. I would like to give you a brief overview of the regulatory efforts of the EU. There are two pieces of legislation already in force and two that are right now more or less work in progress. The first was the Cybersecurity Act of 2019. I think all of these regulations have a kind of headline issue: the title doesn't quite match the content. It's called the Cybersecurity Act, but in a way all it does is formalize the mandate of the major European cybersecurity authority, ENISA. It was established, I think, in 2011? Don't pin me down on the date, but it had been around for a few years, and the Cybersecurity Act formalized its mandate. It also introduced a formal process to establish European cybersecurity certification schemes. With those schemes, as an ICT operator or manufacturer, you can target specific use cases and choose a certification-based approach to cybersecurity. To be honest, though: the Cybersecurity Act was adopted in 2019, but so far I think the only major certification scheme that is a work in progress is the European certification scheme for cloud services, and otherwise there is not much going on. Then, some of you may have heard of the NIS2 Directive from 2022.

This directive generally requires operators of critical infrastructure to implement certain security measures to secure their services, so the aim of that directive was mainly critical infrastructure and critical services. Then there is one major regulation that is still work in progress: the proposal for the Cyber Resilience Act. Here you can see the headline problem again: the long title of the Cyber Resilience Act is basically a regulation on horizontal cybersecurity requirements for products with digital elements. I don't know why it's called the Cyber Resilience Act; in the end it's about cybersecurity and product safety, to go all the way. I just guess the name Cybersecurity Act was already taken, and cyber resilience was kind of the new buzzword to use in that sphere, so why not take it as a short title. As I said, it's still a work in progress; I think it's in the trilogue negotiations right now. It regulates the safety and security of digital products, or products with digital elements, and the main approach of the Cyber Resilience Act is product safety: when you enter the market and place a product on the market within the European Union, then you have to comply with certain product safety and security requirements, which include the needs of cybersecurity. So at its core it is not mainly a cybersecurity regulation, but it is worth mentioning here. There is also the proposal for the AI Act, which takes a risk-based approach to regulating certain high-risk AI use cases. It doesn't regulate AI as a technique or the technical implementation itself.

But certain use cases which are seen as high risk have to comply with certain security requirements, not only cybersecurity but basically security and risk management requirements all in all. And this, as I said, is the contact point for the regulation: it starts from the use case, and therefore it is also a product safety or product security regulation at its heart and its core. So that gives you an idea of why it's actually so complicated to approach cybersecurity from a regulatory perspective. The latest European efforts I talked about are quite recent, but it's not as if there was no cybersecurity regulation before, in the European Union and especially in the member states. Take how it's handled in Germany: there was already cross-sectoral law. As I said, the NIS2 Directive has a predecessor, and this area was already regulated before; the main new thing about the new directive is that the scope of regulated entities has broadened significantly. There was the GDPR, and the BSI law, which is actually the German implementation of the NIS Directive. But there also existed a lot of sector-specific IT security law, which had grown historically: telecommunications law, a law for medical products, energy law. Those sector-specific laws existed long before anybody thought about cybersecurity, but at some point, sometime in the early 2000s, when there was a regulatory update of those regulations, somebody thought: oh yeah, this cybersecurity stuff is coming and getting more important, people are starting to use computers to actually operate a power plant, and therefore they crammed in something, just a paragraph saying that something like that has to be implemented. So it is still a big, historically grown, eclectic body of law, even though there have been recent efforts, for example within the NIS2 Directive, to make this jungle of laws a bit more approachable.

But of course regulation changes slowly, and therefore it might take some time. I think you can compare it to a legacy system; it's no different from an IT system in a way. You have a lot of legacy systems that are also historically grown, at the end of the day nobody wants to touch them, and you never get to start over on a blank sheet of paper. So it might take some time to get a bit more of an overview. So, what are the basic methods used in those regulations? To give you a very brief overview: of course there are the public law measures that require member states to, first of all, set up institutions and authorities to enforce obligations, because when you create obligations for the private sector you need an authority to check whether all those obligations are actually met. And there was an effort to create things like computer security incident response teams (CSIRTs), so that member states have a public authority that, if a large-scale cyber incident occurs, can help direct, organize and mitigate the impact of those large-scale incidents. But I think the core and major point of cybersecurity regulation in Europe in general is the obligations for the private sector. As I said, it grew from this idea: if you are in a regulated sector, then you basically have to implement appropriate technical and organizational measures to ensure the security of your service or your product.

And the way this is usually done is through risk management. So it's not one-size-fits-all, here is your compliance checklist of measures that you have to implement; in general, the private institutions have to conduct a risk assessment themselves, and the result of this risk management is the actual set of measures that have to be implemented. For the most part this risk-based approach is a new approach; historically it was more common to state explicit technical requirements directly in the law. That is still partly the case in the NIS2 Directive: there is a catalog of actual measures you have to implement, some very abstract, like the risk management itself, and some a bit more explicit, like backup management, or that you have to think about encryption and all that stuff. And, for example, in the field of medical products it was a certification-based approach: there was not really risk management involved, but in order for your medical product to get certified there were certain standards which contained the actual things you had to implement. That is the way it was done back in the day. I think this certification-based approach is still valid for very specific use cases, but it's not a good method for cross-sectoral law.

Yeah, because the situation in each sector is just too different to require explicit technical measures that those manufacturers or providers have to implement. So, this was the very high-level overview of the actual tools and measures that the regulations contain. Now to some selected issues. One selected issue I would like to use as an example is digital identities. One reason is that this is the project I'm working on, but on the other hand, even though this is a very rough sketch, I think it illustrates very well why it's actually so hard to come up with a working cybersecurity regulation. Because digital identities, at least in Germany (we are not Estonia), are still in their infancy in a way. There's not really an infrastructure set up, and there are a lot of different stakeholders that have interests and that are also interested in proper cybersecurity measures. For digital identities, for example, you have two different pieces of regulation: there is the NIS2 Directive, because trust service providers are regulated as critical infrastructure, and then of course there is the eIDAS regulation, which is the regulation on digital identities and which is more sector-specific. In practice you always have to check whether those two pieces of regulation really work together or whether they contradict each other, whether there are two separate paragraphs that regulate the same thing.

And then you have to come up with a solution of which one actually applies, which is an ongoing legal discussion. There are also established standards for digital identities, and most of those standards were written with the eIDAS regulation in mind. With the NIS2 Directive, the question again is: okay, I have certified and implemented standards for the cybersecurity of digital identities under the eIDAS regulation, but can I copy-paste everything in order to prove my compliance with NIS2? And then of course there are the technical requirements.

That is, first, the technical reality, as I would call it: what can you actually, technically, do. But there are also the market implications. For example, for digital identities one part of it is the secure element: when you have your wallet on your phone, the phone actually needs to have a secure element in it, and therefore, in a way, the phone manufacturers dictate what the secure element looks like. So market interests come into play as well. And then there are the use cases, as I would call them: the actual situations where you need digital identities, not only the cases where you would otherwise show your passport, but for example hotel reservations, or going to the library, and things like that, which are also affected by other regulations. And of course there are always the users, who are not only the people that implement the actual infrastructure and use cases; they have an interest in how those use cases are designed, but also in how the general infrastructure of digital identities is designed. All of this, at least in Germany, is new; it's an evolving ecosystem. And therefore it is very, very complex to recognize every new requirement and the interests of every stakeholder. Then, briefly, another selected issue, concerning the risk management itself. Risk management is now the fancy method to come up with the technical measures. There are established standards for how to actually conduct risk management, and they typically contain steps like: get to know your system; risk identification, what could actually go wrong; risk estimation, how likely is it that things go wrong; risk evaluation, with risk defined as the probability and the possible damage to be expected; and then risk treatment, which is where you come up with the actual technical and organizational measures to be implemented. So risk management is nothing new, but at the end of the day, who has to conduct the risk management? Most of the time there is no conflict of interest. For example, the provider of a critical infrastructure, say a power plant, is also interested that it doesn't get hacked, because there is an economic interest in it.
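To make those steps a bit more tangible, here is a minimal sketch in Python of such a risk register. It is purely illustrative and not taken from any standard or regulation: the asset names, scales, score thresholds and suggested treatments are all made-up assumptions, and it formalizes the evaluation step in one common way, as likelihood combined with expected damage.

```python
# Purely illustrative risk-register sketch (not from any standard or regulation):
# identify risks, estimate likelihood and impact, evaluate, then pick a treatment.
# Scales, thresholds and the example entries below are made-up assumptions.

from dataclasses import dataclass


@dataclass
class Risk:
    asset: str        # "get to know your system": which part is affected
    scenario: str     # risk identification: what could actually go wrong
    likelihood: int   # risk estimation: 1 (rare) .. 5 (almost certain)
    impact: int       # expected damage: 1 (negligible) .. 5 (severe)

    def score(self) -> int:
        # Risk evaluation: probability combined with possible damage.
        return self.likelihood * self.impact


def treatment(risk: Risk) -> str:
    # Risk treatment: derive a (hypothetical) measure from the score.
    s = risk.score()
    if s >= 15:
        return "mitigate now (e.g. encryption, backups, access control)"
    if s >= 8:
        return "mitigate in the next cycle, or transfer (e.g. insurance)"
    return "accept and monitor"


register = [
    Risk("customer database", "ransomware encrypts data", likelihood=3, impact=5),
    Risk("public website", "defacement", likelihood=2, impact=2),
]
for r in sorted(register, key=Risk.score, reverse=True):
    print(f"{r.asset}: {r.scenario} -> score {r.score()}, {treatment(r)}")
```

The point of the sketch is only the shape of the process: the measures fall out of the assessment rather than out of a fixed checklist, which is exactly why the result depends on who conducts it.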

But that is the advantage for cybersecurity regulation: the interests generally align. At the end of the day, though, take for example the Cyber Resilience Act: as the product manufacturer that has to conduct this risk assessment, I have an interest in getting my product onto the market. There are standards to make the risk management as objective as possible, but at the end of the day you can never take the perspective of the entity that actually has to conduct the risk assessment out of the equation; there is always a slight subjectivity to it. And this actually becomes hard when it comes to third-party assessments; in practice it is not easy to always change perspective and consider risks not for yourself but for other parties. Yeah, this was it as far as the substance goes, and now I would like to ask whether you have any points or questions, so we can have a short but fruitful discussion.
>> So you mentioned, you used medical product regulation as an example, and you also mentioned critical infrastructure. I'm just curious how you think about this: instead of focusing so much on the downstream regulation of the products (you mentioned product security a few times), how do you think about how the European Union currently secures critical hardware components like chips? And do you see any sort of regulation or export controls around those critical hardware components that run our infrastructure, especially as they host more, you know, advanced IP and stuff like that?
>> NILS BRINKER: Yeah, I think in the NIS2 Directive there is a part about considering supply-chain risk. So the private entity has to consider the manufacturers it gets its chips and technical components from, not only from a cybersecurity perspective but also so that it doesn't get too reliant on one manufacturer; I think the 5G issue was the most prominent example. On export controls I actually don't know, but especially with semiconductors there have been strategic regulations and considerations about becoming less reliant on Chinese manufacturers.
>> Hi. It's been interesting to listen to you. I work in the European Parliament and work on these pieces of legislation.

So we cover semiconductors, so they should become more secure once it's enforced. But one element that I am missing, also on the European level, when we talk about these pieces of legislation is really: how are we bringing in more of the human element? How do we bring in that third element that's missing? You can secure your services, and then the supply chains through product regulation, but we still need more awareness, bringing it to every one of us to understand what cybersecurity is. So I would be interested in hearing how you see this fitting into this whole package of legislation that we have on the table: the more human element of it, because it is an important one for cybersecurity.
>> NILS BRINKER: I think, yeah, the human element is important, but it is also important to consider whom we address. The end user? My opinion is that it is always good to have end users who are aware, but we have limited resources. A typical example: you roll out a fake phishing e-mail campaign as awareness training for employees, but at the end of the day this is not where things go wrong to begin with. If your organization not getting hacked relies on Thomas from accounting, 60 years old, not clicking on a cat picture, then it is not Thomas's fault; the system security was shitty to begin with. So I think the addressees of this human element, of awareness and education, should rather be the manufacturers and the programmers: consider cybersecurity when setting up the architecture of your product, and for the coders, know the common mistakes that cause cybersecurity issues. So the main point is: don't focus too much on the end user, but on the technical administrators and the teams that actually implement and build those products.
>> Thank you. I have one question. Any policy must have some tools to investigate how it operates in practice. The European Union provides MECSA, which is a tool to check the security of mail servers, so I think it's a very good tool to investigate the state of each server. My question is: after that tool was introduced in the European Union, what was its effect? Did it increase the level of security of servers in the European Union? For instance,

I found that almost none of the companies use DNS security (DNSSEC). Why? It protects your server from spoofing, I think, but almost nobody uses it. So, most importantly, why is there this tendency not to use such an important tool? It is very effective, but almost none of the companies are using it right now.
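For context, part of what an assessment tool of this kind checks is whether a domain publishes anti-spoofing DNS records such as SPF and DMARC; MECSA-style assessments also look at things like DKIM, DANE and StartTLS. A minimal sketch of such a check, assuming the third-party dnspython library and using example.com purely as a placeholder domain, could look like this:

```python
# Minimal sketch of an anti-spoofing check in the spirit of mail-security
# assessment tools: does a domain publish SPF and DMARC records in DNS?
# Requires the third-party "dnspython" package; "example.com" is a placeholder.

import dns.resolver


def txt_records(name: str) -> list[str]:
    """Return the TXT records published at `name`, or an empty list."""
    try:
        return [r.to_text().strip('"') for r in dns.resolver.resolve(name, "TXT")]
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []


def check_domain(domain: str) -> None:
    spf = [t for t in txt_records(domain) if t.startswith("v=spf1")]
    dmarc = [t for t in txt_records(f"_dmarc.{domain}") if t.startswith("v=DMARC1")]
    print(f"{domain}: SPF {'present' if spf else 'missing'}, "
          f"DMARC {'present' if dmarc else 'missing'}")


check_domain("example.com")  # placeholder; replace with the domain to assess
```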
>> NILS BRINKER: Yes, that's a great question, and I have to admit I have no clue why they don't implement it. I think the general problem of cybersecurity in organizations (it shouldn't be this way, but especially in non-tech industries) is that cybersecurity was always this abstract thing that cost money; it didn't generate money, it didn't generate profit, and therefore people were not keen to invest in it. Maybe, to be a bit cynical, ransomware did a good job of bringing that abstract danger into the minds of executives, because there you have a very vivid picture of the actual damage that can be caused by cybersecurity incidents.
>> May I? Thank you. I think that's a very important and interesting question, and it connects slightly with the CRA, even though that particular part is not in its scope. What both questions have in common, or what the regulation addresses beyond your particular example, is that there are economic counter-incentives to deploying these systems in the mass market. You mentioned it: first and foremost, the CRA is a product safety regulation, which is why it makes use of a whole set of pre-existing tools that are available on the regulatory side. The Radio Equipment Directive is its predecessor in the regulatory toolbox. And the threat model is the cheap cameras coming from somewhere that are thrown on the market without any support for software updates, security updates and so on. And why is that? Because people don't want to pay extra for security. That is kind of comparable to your example, right? First of all, it's not something you add to a single product; it's a system that needs to be deployed in various places, and it needs cooperation and so on. But economic counter-incentives in the mass market, I guess, is what both have in common, and that might be something to inform the discussion. And of course the CRA also responds to very prominent threats. Some of you might remember the Log4j incident, and that is where open source comes into play.

That's not the main point, and Log4j maybe did not trigger this, but it would be a prime example of something that triggers this kind of regulation. Everybody was crying that this was a very important piece of software that is deployed all over the place, and it was maybe unmaintained, or working on a shoestring budget, because a single individual was "responsible" for it, and yet it is deployed everywhere. The economics are very important in this part and should be looked at. Thank you.
>> NILS BRINKER: Yeah, thank you, I agree. And I think we have come to the end of this session.
>> So I have a quick question. 
>> NILS BRINKER: Yeah, sorry.
>> Basically, that was a great presentation. I am from Nepal. When we talk about regulation on cybersecurity, what I understood is that we have to go through things like government partnerships, workforce development and public capacity. So in the EU, my quick question is: does it cover all these aspects in a single act, or is it very separate?
>> NILS BRINKER: Very separate. NIS2 on critical infrastructure is a different instrument than the regulation on products. As for the aspects you mentioned, governance and the human aspect: the human aspect, for example, is considered a bit in these instruments, but in general you could say there is no overarching IT security law in the European Union. Why is that? As I said in the beginning, everything is a bit historically grown, and some sectors are already very far along, for example.

The financial sector is historically a highly regulated sector, and there have been efforts to consolidate things a bit, but in general it is still a bit scattered. Okay. Then I think we have come to the end of this session. Thank you very much for your questions, and see you around the venue.
>> AUDIENCE: (Applause).