IGF 2024 - Day 4 - Workshop Room 4 - WS125 Balancing Acts - Encryption, Privacy, and Public Safety

The following are the outputs of the captioning taken during an IGF intervention. Although it is largely accurate, in some cases it may be incomplete or inaccurate due to inaudible passages or transcription errors. It is posted as an aid, but should not be treated as an authoritative record.

***

 

>> MODERATOR: Okay, it's now 1:45 here. We'll make a start, if that's okay with everybody. A very warm welcome to this workshop, workshop Number 125. And here we are looking at balancing acts in terms of encryption, privacy, and public safety.

My name is David Wright, I'm director of the charity SWGFL. And I'm delighted to introduce you to the panel that we have here. This is the last workshop of this IGF and it's a privilege to close out the IGF ahead of the ceremony with this particular subject.

We're going to be looking at this complex balance between encryption, privacy rights, and the need for public safety, as well as the need for multi-stakeholder discussion.

We have an hour with you, so we'll have to keep comments and questions brief, but it's such an important subject.

I'm going to introduce the panel to you, and for the panelists, I'm going to have to condense some of the bios just because of time. I'll run through them in the sequence of the questions as well.

So I'd like to welcome Andrew Campling, who is here with us. Andrew is director of 419 Consulting, a public affairs consultancy focused on the tech and telecom sectors. He has 40 years of experience in a wide range of increasingly senior roles, mainly in a business-to-business technology context. He has been engaging in initiatives linked to encrypted DNS, DSI, and Internet standards, primarily to understand their impact in real-world deployments.

Andrew is a trustee of the Internet Watch Foundation, a global charity and one of our partners within the UK Safer Internet Centre. He holds an MSc in strategic marketing management and an MBA. In his spare time, Andrew is studying law and completes his LLM in the next couple of years.

Online, joining us is Arnaud Taddei. If we might be able to bring Arnaud up on to the screen, please. Arnaud is a global security strategist at Symantec by Broadcom. He's an executive adviser on security strategy and transformation for the top 150 Symantec customers. Part of his mission is participating in standards-defining organizations; in particular, and this warrants congratulations, he was elected chair of ITU-T Study Group 17, and he works at the IETF. He started his career at CERN in Geneva, which created the World Wide Web, where he led the team responsible for communication, authentication, and authorization.

In 2000, he joined Sun, and then joined Symantec, where he held chief architect roles up to director of research, reporting to Dr. Hugh Thompson, the RSA Conference chairman.

Can we bring Honey on to the screen, please. Honey Makola is a manager at the Independent Communications Authority of South Africa, where she guides the regulator in navigating its evolving roles in cybersecurity. She serves as a vice-chairperson of ITU-T Study Group 17, which focuses on setting international standards for cybersecurity.

Within Study Group 17, she leads the group on child online protection, working to identify gaps in technical protections and to create safer online environments for children.

If I can come back into the room and introduce Afnan Alromi. Afnan Alromi is a cybersecurity leader with 12 years of experience in complex projects and in shaping cybersecurity strategies, and a vice-chair of Study Group 17. She holds an advanced degree and several industry professional certifications, and is known for her strategic vulnerability management and for fostering international collaborations.

Afnan, welcome.

Finally, if I can come to Boris. He is the head of engagement and partnerships at SWGFL, the UK-based charity. He works with the UK Safer Internet Centre, which is part of the European Insafe network. He has worked extensively with various European countries' Safer Internet Centres, and he's been involved with numerous groups presenting online safety strategies to government officials and NGOs. He focuses on protecting children from online threats such as cyberbullying, child sexual exploitation, and scams, as well as educating professionals through workshops and keynote speeches. I'd like to welcome the panel. Also this afternoon, forgive me, I'm joined by my colleague, who will moderate the online conversation when we get to the chat.

If I can invite the panelists to give a quick opening. Andrew, we'll start with you, please.

>> ANDREW CAMPLING: Good afternoon, everyone, and hello to everyone online. As David said, my name's Andrew Campling; I'm a trustee of the Internet Watch Foundation, amongst other things. I think this is an incredibly important issue which we're going to go over. Although we've focused specifically on privacy, in a short while I really want to get into the debate about privacy versus other human rights.

Because I think we overinflate the importance of privacy and ignore the other fundamental rights, forgetting that privacy is a qualified right. That's something we'll come on to when David asks us questions, I'm sure.

>> DAVID WRIGHT: Thank you, Andrew. Arnaud, I'll throw it to you, please.

>> ARNAUD TADDEI: Yes, can you hear me correctly? Thank you for the chance to be in this workshop.

This really is heartbreaking when you start to understand what it takes. It's concerning to see the level of harm, which is increasing, perhaps accelerating, and we need to recognize how difficult this is for a number of humans. Serving this from the ITU is an interesting journey. Hopefully we're putting in place the conditions where we can have a meaningful discussion.

>> DAVID WRIGHT: Arnaud, thank you. If I can now turn to Honey.

>> HONEY MAKOLA: Yes, thank you. I just wanted to draw attention to part of the work that I do: we are focusing on identifying and addressing gaps in child online protection standardization within Study Group 17.

We have done a lot of work reviewing the regulations and the standards that are currently in place, and the work has progressed well; we are on our way.

We did find gaps. But I'd like to take this opportunity, please, to invite the people in the room, as well as online, to please join the correspondence group on child online protection. This can be done through subscription on MyWorkspace.

For the purpose of today's meeting, for me, I think encryption is a very important and powerful tool that can help us safeguard communications and information, but it also creates significant challenges in protecting children online.

So for the purpose of today's meeting, I just want us to try to find a balance between privacy on the one hand and other issues, such as the protection of children online, on the other. It's a challenging balance, but one I believe is essential for the effectiveness of child online protection.

And I look forward to the engagements this afternoon. Thank you.

>> DAVID WRIGHT: Honey, thank you very much. Now turning back to the room: Afnan, if I can hand it to you, please.

>> AFNAN ALROMI: Good afternoon, everyone. I'm looking forward to this wonderful discussion today and to engaging in this topic, to discussing the challenges that my colleague Honey just mentioned, and also to seeing how we can succeed in balancing encryption and privacy rights with public safety at the same time.

So, looking forward to this discussion. Thank you.

>> DAVID WRIGHT: Thank you, Afnan. And finally, Boris, if you can introduce.

>> BORIS RADANOVIC: Thank you very much. I appreciate the invitation and the ability to contribute, especially from a diverse set of viewpoints. I love the title; it says encryption, privacy, and public safety. That is the framework of conversation that I think we should all have and support.

And one of the questions in my mind, which we can hopefully answer later, is how do we create meaningful and impactful discussions on these topics that take into account a wide array of different perspectives, needs, abilities, and representation, while equally respecting the direction that we all want to take toward a better and safer world, which includes the protection of children.

So really proud to be here.

>> DAVID WRIGHT: Thank you, Boris. Moving on to the particular questions that we're going to pose, the panelists will then share and discuss with everybody, after which point we will open the floor and the virtual floor to questions. So please do hold on to your questions; there will be time.

Because of timing as well, I'm going to keep panelists to perhaps four minutes each, so please keep contributions succinct.

So Andrew, without any further ado, I'm going to turn to you first. I wonder if you could elaborate a little: how should governments and tech companies approach the creation of lawful access mechanisms without infringing on privacy rights? Straight into the point.

>> ANDREW CAMPLING: Fantastic. Thank you, David, for the question, and hopefully to provoke a response from some of the participants in the room and online.

In my view, the weaponization of privacy has been and continues to be used to override all of the other human rights of children and other vulnerable groups. I think that's a fundamental problem.

As I said earlier, remember that privacy is a qualified right, and we need to think about all of the human rights.

And also, again to provoke a response: encryption, let's remember, is not the same as security. They're fundamentally different things, but they're often conflated. For example, if, in Internet standards, you begin to encrypt and compromise metadata, you end up with weakened security. And if you weaken security, you have no privacy, but you think you have. That's also a big problem.

So very briefly, let's put scale to the problem. I'm going to focus on child sexual abuse, because that's what the Internet Watch Foundation does. We're seeing roughly 150 million victims of child sexual violence every year around the world, and we're recording in the order of 100 million reports of CSAM images and videos every year. That's roughly three new images or videos being found every second.

So even in the course of this workshop, that's a scary number of images and videos being found.

The Internet has magnified the scale of the problem of CSAM significantly. It happened pre-Internet, but the ability to publish and share the images globally, remembering that every time an image is shared that's a crime and there's a victim, means the scale of the problem is huge compared to what it was pre-Internet.

We know from research that encrypted messaging platforms are widely used to find and share CSAM, and there's a very large sample size behind that research. But to get directly to the problem, in terms of lawful interception, you don't need to backdoor encryption to help solve this problem.

Client-side scanning for known CSAM images would immediately reduce the size of the problem. It doesn't break encryption, and it doesn't break privacy, so that's an easy way to make an impact, as would be the use of tools like age estimation to keep adults off platforms intended for children and children off platforms intended for adults, to try to keep the victims away from the criminals.

So those would be my suggestions, places to start, and hopefully that will provoke a response from people in a few minutes.

Thank you.

>> DAVID WRIGHT: I'm pretty sure it will, Andrew. Thank you.

Just can I ask quickly, a follow-up question, just to prime the microphone.

Are developments in Internet standards helping or harming human rights?

>> ANDREW CAMPLING: In my view, the increased use of, and requirement to use, encryption in some of those standards (I could give examples, but that's probably too much detail for here) is making the problem worse, not better. It's making it harder to find where the crimes are happening, and it's making it easier for the criminals to hide.

So some of the developments in the standards bodies, an area where we're active, are actually problematic. And they also, coincidentally, weaken security as well.

So I think that's why we need civil society to engage in places like the standards bodies. Currently it's mainly technologists making these decisions, and we need, dare I say it, multi-stakeholder engagement to shine a light on things causing huge societal problems.

>> DAVID WRIGHT: Thank you. Thank you very much.

Okay, let's move on now to Arnaud. If we can bring him up on to the screen, please.

So Arnaud, the question that we would pose to you: what technical innovations or solutions do you see as viable for achieving a balance between privacy and public safety?

>> ARNAUD TADDEI: I like the Mission Impossible framing; it's a very difficult problem.

On the one hand, one of the issues is that we have only one model for the Internet as we know it today, so anything we do is going to impact various communities. So far so good. Until some versions of DNS and other considerations came along, we sort of managed to keep the community together, and everybody could find what they needed from it [?]

But for various reasons, actions were taken, for good reasons, bad reasons, or no reason; it's not a judgment, it's just that they were going in a certain direction, and now it is pushing the solution toward one specific part of the spectrum of the Internet. And now we have situations where some are going to get [?] and some are going to have a problem with what is happening.

So this is very difficult to move from there because we have only one design and it's difficult to get out of this.

So now, that doesn't mean we cannot be creative. In other areas we realized that in the background of this was the anthropological assumption that there was one model for all humans. That means a very narrow model for all humans, to make sure it fits the [?]

But when you do that, you lose the fact that there are subcontexts with very specific needs. Education is one of those subcontexts; I would add [?] people as one of those subcontexts. And all these subcontexts have different requirements and needs. When we have to take a step back and, let's say, make design choices, the issue of design is the tradeoffs: which tradeoffs are you going to make for a specific set of use cases and requirements? That's what the engineering is missing.

So when you take this approach, perhaps the action we could consider is subcontexts. It's not related here to child protection, but I see, for example, subcontexts that may already be happening. I have seen the emergence of what we call enterprise browsers, that is, browsers with specific requirements for enterprise use cases.

Equally, you see more and more family solutions from some hyperscalers. So the question I have is: isn't the premise of a solution to consider that we should, perhaps, re-highlight the concept of subcontexts? From there we could start to envision some practical solutions.

I will stop there. Thank you.

>> DAVID WRIGHT: Thank you very much for trying to take on Mission Impossible.

Moving on to Honey, whom we can't see at the moment. Honey, there we go. If I can pose you the question: how can international collaboration improve or complicate the encryption debate, especially when balancing privacy with cross-border safety issues?

>> HONEY MAKOLA: Thank you for the question. I have a lot of experience in international collaboration, having also recently helped shape the cybersecurity resolutions at WTSA-24, and I want to draw on that experience, right?

And I would like to start with what complicates the debate and the collaboration. First of all, the participants come to the collaboration table, if I may, with different legal and regulatory frameworks. They could be divided into the ones that are pro data protection and privacy; then you get the ones that are for government control, for national security purposes; and then you get the ones that want balance.

So those differences make it very difficult to agree on a unified approach to encryption across borders. You can even imagine the frustration for the companies operating across borders regarding encryption.

Then there's also the imbalance in cybersecurity capabilities across the different nations. You get nations with advanced cybersecurity capabilities that may argue for stronger encryption to protect critical infrastructure, which is fair. And then, on the other hand, at the moment a priority for Africa could be the protection of children online versus individual privacy. That is understandable, because Africa currently has the highest demographic dividend when it comes to young people, so you would understand why protecting children online now is actually protecting the future economy and protecting everything.

And then you get the MOUs and the multilateral treaties and so forth. But from the readings, you see that these can be quite slow and cumbersome when issues need to be dealt with right away, and governments end up seeking exits. And there's the geopolitical tension and distrust as well, which we witness.

And from our work with the ITU so far, there's also the question of how you develop global standards that are balanced in addressing the needs of the different regions, taking into account the imbalances in technical capabilities, the skills the different countries have, the level of those skills, and the culture of privacy itself.

But the complications are just the negatives. Moving on to how international collaboration can improve things, with what complicates it in the background, I think the starting point in the encryption debate is that there are two sides, but how do we develop common ground?

And that is what international collaboration can improve: the ability to facilitate dialogue between nations with different stances on encryption. Then, moving from the established common ground, countries can establish harmonized legal and technical standards.

There will be compromises to be made by different groups, but remember that the common ground is important for facilitating the debate. And I personally believe, from my experience, that it's during these international [?] that innovative solutions can be brought forward. If you cannot compromise, what you do is look for solutions.

That can be in the form of research into privacy-preserving technologies, which is what the group on child online protection at the ITU is doing. We're looking for the solution that balances the two.

Public-private partnerships; the development work done by the ITU Development Sector in sharing information; and also capacity building.

But the most important thing to remember in all of this improvement that international collaboration can bring about is that it requires active participation and contribution from the private sector as well, not just governments and regulators.

So what makes the collaboration work is everybody finding that common ground first and then starting to move on and see how they can compromise.

>> DAVID WRIGHT: Honey, thank you very much. I very much share the hope that we will find solutions at events like this, despite Arnaud calling it Mission Impossible.

Okay, I'm next going back to the room and to Afnan. Afnan, if I can pose this to you: from your perspective, what are the most critical challenges in balancing encryption with public safety and privacy rights?

>> AFNAN ALROMI: Thank you, David.

First of all, before I go into answering your question, I'd like to thank my colleagues here on this panel for their great insights and the points they've brought in. They've actually pointed out a couple of challenges that I was hoping and planning to discuss in this talk.

Although, as we've said, encryption is an important tool that helps us secure our sensitive data and keep it safe, it poses a couple of challenges.

In today's session, although we have limited time, I'm going to discuss a couple that those of us here and the attendees are most familiar with.

The first challenge that we can sense and see is global inconsistency, and I think Arnaud and Honey mentioned a couple of points on that aspect, along with conflicting international laws, which make this more challenging. Countries have varying legal standards, laws, and regulations, and international companies working in those different countries have to comply with, and fulfill, each of those standards as well.

So this poses a hard challenge in that respect.

Another challenge comes with evolving technologies; we've discussed this as well in SG17, where we have a specific question on emerging and evolving technologies. Evolving technologies and the rapid pace of technological invention mean we have to keep up with them and strengthen the encryption standards accordingly. One of the evolving technologies that we are considering and discussing in SG17 is quantum computing, which in the future will break the encryption standards we have today. This poses a challenge: we need to consider, or start, the transition to quantum-safe infrastructure.

Another challenge, and it's very important, and we have a couple of initiatives looking at this aspect, is the challenge of protecting youth against abuse online. As we see, some of the tools or applications that we have today, even the chat applications that we use in day-to-day life, have encryption implemented to secure our communication.

Although this is vital and important, it has created a challenge for law enforcement in keeping the public safe in that environment, and most importantly children, where we cannot see what type of content is being communicated and the abuse that could be happening in those conversations.

So these are just some of the challenges that we need to consider when we balance encryption with public safety and privacy rights.

This is something we need to consider, and we need government, the private sector, civil society, and all parties to collaborate in that discussion and try to find a way to reach a balance.

Thank you.

>> DAVID WRIGHT: Thank you very much. That leads to the ensuing conversation afterward, so I'm going to encourage everybody to think about questions for when we open the floor; questions from online are fine too.

I say that just before I finally throw it over to my colleague Boris. My question here, Boris, is: what role should the public play in this discourse, and how can awareness be effectively raised on the impact of encryption policies on privacy and security?

>> BORIS RADANOVIC: Thank you. Short answer: education. But how to do it properly is a much longer discussion. I think we should start with defining frameworks for meaningful discussion that give us communication goals and structure, allowing exactly these kinds of discussions at multiple levels of representation and diversity, including people of all abilities and disabilities who could contribute to these conversations but might not be able to now.

But on a broader point, I think we should all be aware that vision can only pull as hard as reality can follow. And the current reality, sitting in these chairs around us, is that 150 million children across the world are being sexually abused every year, and that number is rising. That is the reality we need to face.

And while the vision we can all agree on is magnificent, the reality is something we need to take into account. Since this is the perfect space, I'm going to raise more questions than answers: how do we not allow the discussion to be dominated by a certain area, agenda, stakeholder, or interest, and how do we have a meaningful level playing field for anybody contributing to this discussion?

How do we make sure that we develop technological solutions that take into account the benefit of the user, or the benefit of the child, first and foremost, and then continue developing those solutions?

And I think all of that builds up into what I personally consider our principal duty as adults: to create a better and safer world for the people and young children following in our footsteps.

I'll come back to Arnaud and say, I love the movies, Tom Cruise in Mission Impossible, but I don't know if you remember that in each of those movies a great team of people, each working to their own abilities and capabilities, together make the mission, in the end, quite possible. I know that doesn't make a great marketing title, but it should be a good notion.

And just to comment on something that Honey said which I think is important: what we are trying to do is not easy, but we have to ask ourselves what is easy and what is right, and lean, I would suggest, on the side of what is right and find solutions for that. I'll just come back to that point for my final remark.

We need to find a way to develop global standards with local sensitivities that respect many of the things that I mentioned. I wholeheartedly ask all of you listening to us today, online and here, to ask us questions at a level that we all need to understand. All of us, I don't know if you remember being a child, needed somebody to stand up for us and for what was best for us. I ask you to look for that as we're discussing this topic.

Thank you very much.

>> DAVID WRIGHT: Okay. That concludes the contributions from the panelists to set the scene for everybody.

And if somebody puts a hand up behind me, help me out. I haven't got eyes in the back of my head.

I did like this theme of mission possible, or Mission Impossible, depending on which one. Perhaps we should be reaching out to the producers, or Tom Cruise, to find his way over here.

So I open the floor to any particular questions that anyone has. I can't see behind me. Okay. If I can ask you to introduce yourselves first, that would be helpful too. Thank you.

>> Thank you for giving me the floor. I'm Cynthia; I come from South Africa, and I work with most of the panelists in this session at the ITU. It's quite refreshing to listen to the diverse views of different stakeholders on this important topic, which is dear not only to South Africa but to many of the countries that participate in the ITU's work.

And specifically in Study Group 17, which is the technical study group of the ITU when it comes to issues of standards and security.

For South Africa specifically, we believe that we actually stand a good chance. And why are we saying this? Because we are looking at the upcoming WSIS+20 review process, which is bringing in the Global Digital Compact. And we believe that, as a community concerned with this particular issue, we will find ways to address some of these issues.

Because, you know, what I'm actually picking up here today is that we're all concerned. But as we've said, the issue is how we deal with this.

So I'm also hearing that we need this continuous discussion. And for us to continue with this, we need to take advantage of the processes that are currently happening to make sure that this issue is not pushed behind other priorities.

Because different stakeholders will fight for their priorities. So all I'm asking of all the stakeholders in this room is: let us take advantage of the processes that are happening and make sure that the issue of child online protection also takes the forefront in all of these positions, especially at the UN level.

Thank you.

>> DAVID WRIGHT: Okay, yeah. What we'll do is we'll take three questions, and then we'll come to the panelists. Thank you.

>> Thank you, I'm Catherine. I'm an infectious disease physician at Harvard Medical School. I'd like to add another layer to this: I wonder if the pandemic response might help here. There are lessons we can learn from the COVID-19 pandemic in terms of public data, health, and safety. That's perhaps a little simplified, but when we did contact tracing during the COVID pandemic, people could give up their right to privacy; they could volunteer that information. And certainly how much is surveilled does not necessarily dictate how much data is kept, how it's kept, and where it's kept.

I think that is important, too, for other pandemics and syndemics, which can carry a lot of stigma. When that information is kept, there's surveillance related to it, but in the United States it's kept in a secure, encrypted facility. And the amount you surveil is not proportional to the amount you keep.

My question is really about how these lessons might apply to this discussion and to other areas as well.

>> Online I'm seeing a lot of interaction and compliments for the speakers as well. I've got a question here from Cheryl: does balancing necessarily mean we need to rank rights and risks to properly weigh them against each other? If not, how do we begin an objective, comprehensive review? If so, how do we do it on a global level? There was another question which basically comes down to: can somebody please clear the air, because there's a lot of disinformation in this discussion, and a lot of fake news in there.

For example, when we are talking about privacy versus child protection: is it true that if we want child protection, we're giving up on privacy? I think there are a lot of questions to unpack in that one.

>> DAVID WRIGHT: Thank you. Those are the three questions addressed to the panel. I'm going to go to Andrew first.

>> ANDREW CAMPLING: So let me have a go at two of those, but briefly.

Firstly, on the weakening-of-encryption question, I would argue, and I'll be as precise as I can without hopefully getting too detailed, that specifically detecting known child sexual abuse material needs to have no impact whatsoever on encryption.

To expand that ever so slightly: if messaging applications agreed to scan any images for known CSAM before they were uploaded, and then encrypt, there would be no privacy implications, because you don't learn what the image is. You simply learn whether it matches known CSAM, by so-called hash-matching, for those of you with knowledge in that area. You don't need to look at the content of the message either. You're simply asking: does this image, in a mathematical sense, match a database of known CSAM? That doesn't, in my opinion, have any privacy implications unless there's a match. And if there's a match, then you've committed a crime, and your qualified right to privacy is surrendered anyway. So that's fine.
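A minimal sketch of the hash-matching idea described here, assuming an exact cryptographic hash over the image bytes (real deployments such as PhotoDNA use perceptual hashes that tolerate resizing and re-encoding; the hash set and function below are hypothetical stand-ins):

```python
# Hypothetical sketch of client-side hash-matching before encryption/upload.
# The client holds only opaque digests; nothing about a non-matching image
# is revealed, which is the privacy argument made above.
import hashlib

# Hypothetical database of digests of known illegal images (here, the
# SHA-256 of b"test" serves as a harmless stand-in entry).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True only if the image's digest is in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# Only a boolean leaves the check; the image content stays private.
print(matches_known_image(b"test"))         # True: digest is in the set
print(matches_known_image(b"holiday.jpg"))  # False: nothing else is learned
```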

And then, briefly, the other point: I think the question was about ranking or trading off different rights. Yes, and I would always say that if you have to trade rights, you ought to bias towards the most vulnerable in society. At the moment, in my opinion, the weaponization of privacy is largely benefiting privileged adults at the expense of lots of different vulnerable groups. That's an unacceptable tradeoff. If we have to make tradeoffs, we should advantage the vulnerable, not the privileged. The current situation is the wrong way around, in my view.

>> DAVID WRIGHT: Andrew, thank you. Thank you very much. Arnaud, if I can just bring you in here.

>> ARNAUD TADDEI: Yeah, thank you to Boris for the Mission Impossible and mission possible point; I like it. But I have to be [?]

I think, to come back to the issue, it's a real design problem in the sense of design theory. I liked the intervention; I could not catch the name of the person who made the COVID-19 point, but that's what we should do. We should learn from others, and from other areas where they resolved the problem.

Sometimes there's a lot of hype that the data is not secure, that people are losing their rights, and so on. In some areas, no. The reason this is Mission Impossible is something else: it's the fact that very few people realize that security cannot be proven.

At the low level, on encryption, yes, you can perhaps prove some cryptography mathematically, and other things. But the moment you raise the altitude, you lose the possibility of proving that your system is secure. And if you ask anybody, "Is whatever control I put in place secure? Can I trust it?", the answer is fundamentally no, you cannot.

And that's the problem of who [?] behind the scenes. I could not find a way that we can resolve that problem. In fact, it's not even the problem; it's not impossible, per se. The problem is that too many people miss that point. So when security is discussed alongside privacy, it's an unequal battle, because security has little to offer to the privacy side.

So we are going around in circles, with some people trying to split us for dogmatic reasons, versus thinking of things as they are and being pragmatic: it's a design problem.

If we get to the design problem, we can bring back in the anthropology, the experience, the law; we could do something about it. That's where the mission becomes possible.

So one part is breaking the problem into pieces and getting to subcontexts. Back to the question of whether we should weight some people differently: of course not. That would be terrible. If we end up in a place where we have to prioritize some humans over others, I don't think we are doing the job.

It has to be equal; all humans should be respected in this. So if we could take a step back and perhaps learn from others, I like the example of COVID-19, we could group all the possibilities with a divide-and-conquer strategy, so we can split the technical design and open up the options.

So I believe the risk of doing this is more that it is going to significantly impact the way we have built and developed our entire Internet at the moment, from the browsers to the CDNs to the servers to everything, because now we would need to [?] human. I think the underlying problem today is that the human model behind all this is very, very narrow. We are locked in; we can't do anything, because if we help one, we lose [?] for the other one.

So once we enrich the model behind the scenes, think how many possibilities it could create. That is what I would propose.

>> DAVID WRIGHT: Arnaud, thank you very much.

Boris.

>> BORIS RADANOVIC: Thank you. I'll try my best to cover it all briefly. Thank you so much for the questions, and I'll come to the first one, on bringing child protection to the highest levels.

I wholeheartedly support you and SG17, and if I can do more to support keeping this going, this is one of the places to do it, absolutely, yes. I love the idea of using the COVID learnings, especially around volunteering rights; I would be interested to see how that works.

And on the question of how we balance that risk globally, I think that is the challenge, the biggest challenge we face. I have to agree with Arnaud: we cannot treat these as conflicting things and have to choose one over the other.

Which brings me to the point that I want to make, with no disrespect to the person asking that question: we should utterly reject the framing of the conversation as privacy versus security. To illustrate, I'll just remind everybody that most of us flew to this wonderful country. What if our flights had only a 90% chance of landing? None of us would take those odds. So let's reject the framing of privacy versus security and focus on the title: it's privacy and security. There are solutions; there are ways we can achieve that. They might be difficult, they might be hard. I'll come back to: what is easy and what is right.

To answer the question from the online speaker: I absolutely reject the notion that that's the discussion we are having. None of us wants to give up our privacy, but none of us wants to ignore the fact, which Arnaud mentioned as well, that we cannot fully trust that any system is secure.

But what I can tell you, and we have references, research, and evidence for this, is that there are currently unintended consequences doing harm to often unheard, unseen, and unsupported people and young children across the world.

So I go back and ask the question: what is right and what is easy? Let's start doing the right thing. I still hope, Arnaud, that we will find the mission quite possible in the end, and maybe laugh at this one day. But I am worried about quantum computing making this whole discussion basically pointless. But yes.

>> DAVID WRIGHT: I feel that's an entirely different workshop. Okay.

Afnan, do you want to go first? Arnaud, I'll come to you next.

>> AFNAN ALROMI: I want to thank the floor for the great questions. To go back to the pandemic and COVID, one of the lessons learned from it is the value of public awareness. I think a big part of it is knowing that you have rights, and that online safety is something you should be granted, especially since the pandemic made most of us spend much of our time remote. So part of the lesson learned here is raising the awareness of the public community, so people can do what is right for them and know what they can subscribe or agree to.

So this is just a comment. Thank you.

>> DAVID WRIGHT: Thank you, Afnan. Arnaud.

>> ARNAUD TADDEI: Yes, very quickly, to come back on something I perhaps need to requalify.

When I point out the fact that we cannot trust security, I totally agree with you both; it's exactly where I want us to go. We need to stop this debate about privacy versus security. The fact that security cannot be trusted and has little to offer to the privacy side, I see as the problem.

Now, to come back to the person with the epidemiology point, it is exactly the same thing. We forget that in the real world your immune system has defenses. We are survivors; you will meet new diseases. Can I trust my security by design? No, and that's why we created the health system. But can I trust the health system? Absolutely not either: if the surgeon makes a mistake, I die; if I take too many medicines, I die. The point is, it is a positive product: if we could precisely bring our security and privacy people together, we could do something about it. Then we can re-establish, perhaps, a new approach that could be fruitful, not prioritizing humans against each other but, on the contrary, having the right design for each of our different contexts, designs that can evolve over time from when people are children up until when they are elderly. That's it, thank you.

>> DAVID WRIGHT: Arnaud, thank you very much.

Honey.

>> HONEY MAKOLA: Thank you. I want to echo what my colleagues are saying about the purpose of this workshop: we're here to find balance, not necessarily to weigh one against the other. But on the last part of the question, asking how we begin an objective, comprehensive review and how we do that on a global scale: in my opinion, there is a need for a global body responsible for the global framework.

At the moment we have regional bodies such as the African Telecommunications Union, and internationally we have the International Telecommunication Union. I think those international bodies have a responsibility to ensure that they become neutral conveners of the differing stakeholders with their differing viewpoints, so that there can be an unrestricted dialogue about finding the balance and the solution.

Because ignoring any of the views, whether on the extreme side or on the other, without really looking at the matter and discussing the situation, can oversimplify the issue of encryption. That's not what we want.

So I think that, to answer the question, bodies such as the international regulatory bodies are very important for creating that space for dialogue.

>> DAVID WRIGHT: Okay, thank you very much. Hopefully those questions received suitable and adequate responses.

We have just a little over five minutes left. Are there any other questions that anybody has?

Any more questions online? Okay. Perhaps this question could be the one to close with, given we have just a few minutes left.

It's one about public awareness. Public understanding of encryption is often limited; I think we've heard about that. How can stakeholders better educate citizens, everybody, about the impacts of encryption on privacy and public safety?

Who wants to take that?

>> BORIS RADANOVIC: I'll try to keep it short. The word I'd use again is education, but more specifically adaptable education, because there are different levels and different capabilities, and people need to understand this topic in different ways. Someone said it much better than me: if you can't explain your topic in five minutes to a 5-year-old, you might not be an expert on the topic.

We need to explain the topics that are way too complex for people around the world; either we don't understand them well enough or we don't have the right people to explain them. It's important that stakeholders genuinely make the effort to educate people the right way, without an agenda leaning left or right, and that we assess who is doing it better or worse and draw inspiration from them.

We have been doing this for child safety work in general; the IGF is 20 years old next year. We know the principles that can build that, but all of them rest on education. Sometimes it has taken us a decade to educate a whole nation about the importance of why we do one thing or another. It will take us time.

The short answer is: educate the general public in the right places, and allow time to pass so we can do that on a global, or at least larger, scale. Does that answer the question enough?

>> DAVID WRIGHT: Good job, Boris. Andrew.

>> ANDREW CAMPLING: I feel I would start by being less ambitious and, dare I say, repeat a point I made earlier: a lot of decisions about encryption are made not here but in some of the standards development organizations, such as the IETF, which make design choices about the underlying Internet standards.

What I think we actually need to do is get people from the groups that are here, at least some of them, to engage over there. So civil society groups, government regulators, and others who have sufficient technical knowledge to engage in the standards bodies need to attend and pay attention to what is happening there and to the implications of some of the decisions being taken.

Because otherwise, I think what we risk is developing Internet standards which create societal problems, not because the people behind the standards are bad or evil, but because they don't have the necessary knowledge. Dare I say it's appropriate to finish on this point: the multi-stakeholder approach is the way forward, and then through our communities we can spread the message back to the others we engage with.

I'd like to get some element of the multi-stakeholder approach into the technical bodies first and work backwards.

>> DAVID WRIGHT: Just before you put the microphone down, Andrew, can you give us an example of one of those underlying technology changes that may have an impact, and what that would look like, assuming that not everybody has a technical understanding of it? A real-life example.

>> ANDREW CAMPLING: I'll keep this, hopefully, at a high level. Some of the current changes being made in the underlying standards, something called Encrypted Client Hello, for example, will make it increasingly difficult for parental controls to work.

So for those of you that rely on parental controls to stop your children accessing adult-type content, or indeed schools that similarly use the same sort of controls, potentially those systems will stop working, not because you've stopped using them, but because the underlying technology has changed.
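A minimal sketch of why this happens, assuming a network filter that classifies TLS connections by the plaintext Server Name Indication (SNI) in the ClientHello (the blocklist entries and function below are hypothetical; Encrypted Client Hello moves the real hostname into an encrypted part of the handshake that the network path cannot read):

```python
from typing import Optional

# Hypothetical blocklist a parental-control or school filter might enforce.
BLOCKLIST = {"adult-example.com", "gambling-example.net"}

def filter_decision(sni_hostname: Optional[str]) -> str:
    """Classify a new TLS connection from the hostname visible on the wire."""
    if sni_hostname is None:
        # With ECH, the outer ClientHello shows only a shared client-facing
        # name, not the real destination, so the filter cannot classify it.
        return "unknown"
    return "block" if sni_hostname in BLOCKLIST else "allow"

# Pre-ECH: the hostname is visible on the wire and filtering works.
print(filter_decision("adult-example.com"))  # -> block
# With ECH: the real hostname is hidden from the network path.
print(filter_decision(None))                 # -> unknown
```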

So that's an example of something being overlooked because there's not a multi-stakeholder discussion. That's why we need the multi-stakeholder approach.

>> DAVID WRIGHT: So I guess there's a point for everybody here, both in the room and online: that sounds like a call to action. If you weren't aware of it, or indeed if you have the opportunity to engage with the IETF at that technical level, please do. Please go and understand how your browser and the Internet are being designed in terms of some of those standards, and the unintended consequences that you may well see in the areas that brought you here.

Bringing those views into this multi-stakeholder approach is the call to action. We have literally a couple of minutes left, and I am going to look around for any particular closing remarks that anyone may have. I would go around, but we don't have time to do that. If anyone has any concluding remark.

Okay, thank you very much. So, in 60 seconds: we clearly covered a lot of subjects here, and I've already got pages of notes. Definitely, Andrew, the term that really opened up the responses, the weaponization of privacy to override children's safety, was a bold statement to open with.

I think we very much heard as well that this is not privacy against security; this is privacy and security. That is one thing I think we've all come across as well.

This does require a multi-stakeholder approach; it is not one-dimensional, and it does require all of us to get involved so that the output reflects the multi-stakeholder contribution as well.

But I will finish with this: this is not Mission Impossible. As Boris said, this is mission possible; I think that's what we've concluded, and we've seen a way through. In that regard, Honey, we have found a particular solution at this particular workshop.

So thank you very much, everybody, and for those questions as well. It's been a real pleasure to moderate this panel of such amazing, esteemed, and really world-leading experts.

So I would like you to join me in thanking them for their contribution just as we close out. Thank you very much.

(Applause)