Session
IEEE
Karen McCabe, IEEE, Technical Community, Western European and Others Group (WEOG)
Kristin Little, IEEE, Technical Community, Western European and Others Group (WEOG)
Constance Weise, IEEE, Technical Community, Western European and Others Group (WEOG)
Nishan Chelvachandran; Iron Lakes, Private Sector; IEEE Standards Association - Trustworthy Technical Implementations of Children’s Online/Offline Experiences Industry Connections Activity, Technical Community; Western European and Others Group (WEOG)
Armchair-style question and answer interview session.
English
With the rapid development of new technologies and the proliferation of data-driven ecosystems, governments are finding it difficult to keep pace. It takes time to establish governance frameworks and legislation that ensure the safe deployment of new technology. In the meantime, governments are pressed to find a balance between wholesale technology adoption and the enforcement of ethical guidelines.
If technology is inextricably woven into the fabric of society to tackle and solve the challenges we face, then we must reshape the ecosystem. Instead of looking through a binary lens that separates stakeholders from users, users should themselves be counted among the stakeholders. Additionally, as we move into the future, should cybersecurity practices and mechanisms be restricted strictly to the cyber “space”? Or should we reimagine the entire intersectional landscape through legislation, standards, accountability, market drivers, practice, and human agency? This armchair-style talk will explore these questions and more.
This session links to the IGF 2021 cross-cutting issue area “Trust, security, stability,” and covers the following IGF 2021 policy question: “1. Cybersecurity practices and mechanisms: What are the good cybersecurity practices and international mechanisms that already exist? Where do those mechanisms fall short and what can be done to strengthen the security and to reinforce the trust?”
We plan to have a very interactive question and answer session with Mr. Chelvachandran and will incorporate questions that come up in the chat. As a former UK Police Officer and a high-level cybersecurity advisor to the UK Government, Mr. Chelvachandran will draw on his years of experience with bespoke cybersecurity operational activity in the UK public sector to inform the discussion.
Report
Trust and transparency can form the base of a more holistic approach to cybersecurity. Nishan Chelvachandran suggested that we bring a much wider variety of stakeholders to the table to discuss cybersecurity--anthropologists, teachers, engineers, policymakers and others--even if they don’t agree. This can be a first step toward more effective cybersecurity measures.
A next step calls for redesigning the design process so we begin to design with stakeholders, not for them. He stressed that the involvement of a wide variety of stakeholders who have a say in the design and security of technology could form a basis for trust from the very beginning.
The talk began with a basic question: “What is cybersecurity?” At one point, cybersecurity was thought of as IT security--antivirus software or combatting hacking, remarked the speaker, Nishan Chelvachandran, a former high-level cybersecurity advisor to the UK Government (and Founder and CEO of Iron Lakes, Chair of the IEEE Industry Connections program on Trustworthy Technical Implementations of Children’s Online/Offline Experiences, and Co-Chair of the IEEE AI-Driven Innovations for Cities and People Industry Connections program). This is an important part of cybersecurity, he said, but the term is much broader than that: cybersecurity covers the intersectionality between technology and humanity. The cybersecurity element is to secure those interactions--through a technological security implementation such as encryption protocols, for instance, or through legal and governance frameworks defining how processes are used. In this sense, Chelvachandran noted, cybersecurity does not just secure the technology; it also considers accountability, asks how the technology is being used, why it is being used, and what it is actually doing (its effects), and then secures the individual’s experience.
With the rapid development of new technologies and devices, the scope of cybersecurity has mushroomed--more people are using more devices, and more governments, companies, and others are digitizing. This in turn means that more and more personal data is being collected and used for decision making.
When asked what keeps him up at night about the current state of cybersecurity (especially in relation to children, given that he chairs the IEEE standards working group on “trustworthy tech for kids”), Chelvachandran replied, “Running before we can walk”--introducing things into the market before stress-testing them, and perpetually searching for cybersecurity solutions rather than trying to get things right during the design phase. He is all for “lightning levels of progress,” especially where it benefits the global south, underrepresented groups, and the UN SDGs, but warns that if we run before we can walk, we risk creating even bigger problems.
In the past, considering cybersecurity only after producing a product or service may not have caused a problem that could not be fixed, noted Chelvachandran. Now that we are on the precipice of a full virtual presence, with conversations in the Metaverse and governments maintaining full datasets of personal biological data, “the train could run away from the station,” he said, “and at the moment, we don't necessarily have adequate brakes.” Right now, he argued, we need to bridge that gap and bring governance, standardization, and trustworthiness in line with the design and deployment of technology, rather than deploying the technology, seeing where it fails, and trying to patch it up afterwards.
He acknowledged that doing things better will not be easy, which is why we need to think about cybersecurity in a transdisciplinary way and strive to bring in all stakeholders. Cybersecurity stems from technology, and engineers and technologists like Chelvachandran know the technology, but if we are thinking about human intersectionality, then we also need anthropologists, psychologists, teachers, and policymakers involved in the development of standards and technology, he emphasized. We need to figure out what the problem is and look at it from different perspectives, especially because the technologies we are deploying will be used by everyone, not just a small subset of society.
The next step, he argued, is to redesign the design process so that we design with stakeholders rather than for them, giving a wide variety of stakeholders a say in the design and security of technology and thereby building in a basis for trust from the very beginning. As an example, many technologies that children currently use were not designed for children, and those that were--children’s products, services, and even games--have for the most part been designed by adults, explained Chelvachandran. “We should redesign the design process,” he said. “This is key to really rethinking how to create something not just fit for purpose, but something that is more future-proof, and that ‘bakes in’ security principles and processes at the design stage rather than coming up with a bandaid solution after something is built.”
Involve everyone, even the naysayers. Cybersecurity must be a truly collaborative effort involving government, NGOs, academics, and companies, noted Chelvachandran, adding that even stakeholders with objections should be included in the process. The IEEE Industry Connections program brings together a wide variety of stakeholders to ask tough questions and work out what something could look like. Many of the issues discussed are conceptual and abstract, and the program looks at how to take those abstractions, whether from research or case studies, and build something. This work can then lead to standardization work, which in turn can help govern and steer industry.
Cybersecurity, transparency, and trust. An attendee asked: “Do you think that there is a shift in how we think about distinguishing and connecting security and safety concerns?” Yes, there has been a shift, replied Chelvachandran. We used to talk about privacy as protecting our data and our anonymity--not letting people see what we are doing. Now that our data is already in the hands of agencies, governments, public entities, companies, and others, the focus is shifting from privacy to agency: governing who uses our data and why. It might be that, later on, a user decides they no longer like a company or service and withdraws their consent for the company to use their data.
Chelvachandran advocates going back to “plain, straightforward, transparent ways of communicating to the user,” rather than serving them countless pages of terms and conditions. We should make things “understandable” rather than “explainable,” because many technologies are actually quite difficult to explain, and we could do much more in terms of helping people understand who is using their data and why.
Chelvachandran ended by emphasizing that we should not operate in echo chambers where everyone simply agrees to design something. “We really need to have these conversations of not just whether we can design it, but should we, and if we should, which a lot of the time I believe we should, then how do we do it in a way that's fair and transparent and representative and secure and safe? This is really where we should be at and I don't think enough of that is done.”