Today we have Gaurav Kirthi, Pete Cooper, and Lily Newman talking about global challenges and global approaches in cybersecurity policy. Let's give them a warm welcome. Thank you. Oh. Booming voice. Well, I'm so happy to see all of you here today. I'm Lily Hay Newman. I'm a senior writer at WIRED. And this is Global Challenges, Global Approaches in Cyber Policy. I'm just going to introduce my other two panelists here, and then I'm going to let them talk a little bit about what they do, because if I give too much of a preamble, we'll just go straight into it, and I want you all to get a chance to meet them. So Pete Cooper is the deputy director of cyber defense in the UK Cabinet Office, and he's to my immediate right. And Gaurav Kirthi is the deputy chief executive officer of the Cyber Security Agency of Singapore. So Gaurav, maybe we can start with you. Tell us what you do. Thanks, Lily. So hi, everyone. My first time at DEF CON, super excited, and super excited for the policy track. Quick self-introduction: I'm Gaurav. I'm currently the deputy chief executive of this organization called the Cyber Security Agency of Singapore. It's a government agency, and it straddles a number of roles. We have an operational responsibility: Singapore's CERT, the emergency response team, comes under us. We also have a team that does development and infrastructure protection, so CISA-type roles come under us. We also have an economic function: we make sure that talent is well trained for cybersecurity, and we look at developing the industry and the ecosystem around cybersecurity. So it straddles what would be a number of different agencies, but it's all kind of squeezed into one, and I'm the deputy chief executive for that. My personal background: I studied computer science way back in the 90s, tripped and fell, became an Air Force pilot for 20 years, loved it.
And then in the last part of my career I ended up doing this thing called network security, which then led to cybersecurity, which is how I ended up here. And I love it. Specifically, my role in the organization is to bring tech and policy together. So I am in charge of development; I have engineering teams that work for me. But a large part of what I end up doing is trying to translate the technical work, the technical challenges, and the technical opportunities of emerging technology, and the risks thereof, into policy. And I'll talk a little bit more about that as we go along. That's me. Pete, go for it. So hi, I'm Pete Cooper. I'm currently deputy director of cyber defense within the UK Cabinet Office, within the organization that looks after security for the whole of the UK government. So I look over basically all of the government departments and all the bits of the public sector, on everything from cyber operations, so how we respond to Log4j, for example, through to strategy, policy, red teaming, standards, assurance, and everything else in between. I also founded the Aerospace Village, so I know the DEF CON community. And actually, my journey with DEF CON started back at DEF CON 8 in 2000. Anybody in the room at DEF CON 8 in 2000? OK. Oh, I saw some hands. OK, so in my previous life before I got into cyber, I was also a pilot, but I enjoyed all the technology. And I was actually doing a Flag exercise up at Nellis, and DEF CON 8 rocked into the same hotel where we were staying. So for three days I basically got drunk with all the hackers, going, this is amazing. They want to talk about flying, and I want to talk about hacking. And that's where my DEF CON journey started. It's just been a fantastic journey all the way through, and then getting involved with the village. But again, it's fantastic to be here talking about how we do this better and pull the community together across policy, technology, and security.
And Lily, before you continue, one random trivia fact which you might not know: the previous speaker, Mr. Chris Inglis, was also a pilot. So if you ever want to get into cyber policy, now you know the way in. And actually, I did say right at the start of this that whenever people talk about doing anything in cybersecurity, it always starts with a pilot study. So that's why you find so many pilots. That's my only joke, by the way. Earlier, he said it was his worst joke. Yeah, it is, yeah. Thank you both. And I would love to hear more stories about DEF CON 8, at some point. As we think about the daily grind of information security, of defense, meaning network defense, institution defense, defense of individuals in cyberspace, you all out there are very familiar with and involved daily in that. And I'm wondering if we could start by just talking about, and this is a kind of big question, how do we marry policy with that? How do we approach policies that resonate with that daily work? Yeah, let's try. So how do we do policy for cyber defense broadly, I guess? Policy that aids and works together with that daily work, rather than imposing something or making things harder. How do we make things easier? So, I mean, it's an important question, because I think part of the history of cybersecurity agencies is that they came from regulatory authorities, audit, inspection. And so there's always this fear that when a cybersecurity agency thinks about technology, they think of it from an audit perspective, from a regulatory governance perspective: how do I clamp down on your great idea and make it as cumbersome as possible? That is not ideal for anybody. Like I said at the start, we also have an economic function in my agency, so I have to make that balance internally as well. I want to support innovation. I want to support emerging technologies. I want 5G. I want AI. I want all those cool, fun things that are happening outside.
I just want them to be secure. And that's kind of a shared mission that we have at DEF CON as well. We're excited by autonomous vehicles; we just want them to be secure. How do you marry it together? Part of it involves having conversations like this. Part of it involves having people with not just a technology background but a technology passion coming into policy, and people with a policy background coming into tech. We see that more and more. And especially in smaller countries, Singapore being the obvious example, we don't have the depth of talent to have multiple organizations focusing on different parts of the conversation. So we just kind of squash them all into one place. And what was actually a limitation now becomes an opportunity. You're too small to have 11 agencies looking at cyber, so that one agency now has that centralized focus and makes those trade-offs internally. Two quick points on that. The first is, how do we do policy? It starts by recognizing that defending a country is not the same as defending an enterprise. A lot of CISOs come in, or we have conversations with CISOs, and they're like, well, I'm really good at defending a multinational corporation, a big enterprise. It's not the same. You don't control the assets. You don't control the network. Apologies, you don't control anything, actually, as a country. You just inherit whatever's around you, and it's connected to whatever it's connected to, and you have to grapple with that. And you don't get to boss people around either. I can't disconnect my citizens. I have to deal with the citizens that I have. And as much as I'd love them to use strong passwords, if they don't want to, that is still a choice that they make. And so there's a whole bunch of differences, and you have to understand the ecosystem, the people, the individuals, and the technology well in order to do policy well. Yeah, I mean, I'd agree with that.
I mean, you can have the best tech idea in the world, but if you want to roll it out at scale, you can only do that if you marry it with amazing policy. And that's where there has got to be a good dialogue, so that we can actually put the right things in place for the rollout. And on your point, I completely agree: enterprise-scale change is different to sector-scale change, which is what we look at from a government perspective. So if you're trying to change an entire sector... I mean, when you sit in the center and you look around, you think, well, the levers of change are amazing. It's like, they're really not. They're really challenging, because it's about behavior change. It's about helping people do the right thing. And that's one of the mantras that we had in the Cabinet Office doing a lot of the change around government, which was: if we build the right thing, people will come. If we have really poor policies, if we put really poor standards out there, nobody's going to be interested; everyone's going to fight back. Therefore, what we put in place has to work for everybody. But the way that we get there is by having a really good, diverse community of people feeding into it. So not just from the government space, but also from the hacker community, and also from industry, getting everyone to come in and talk about what good looks like. So we can actually lift our heads to the horizon instead of running while looking at our feet when it comes to technology. So for the development of the UK government cyber strategy that we launched back in January, signed off by the Prime Minister, we had a challenge panel that basically brought a whole bunch of people in, including people from the hacker community. And I've never quite heard so many F-bombs on a government call before, but it was great to have those perspectives in there.
And we ended up getting some really good dialogue because we broadened out those perspectives. Because your worldview, your horizon, the more you can expand that by talking to amazing people and having amazing conversations, the better solutions you have, be they technical or policy. And actually, Singapore shamelessly copies from other countries, because other countries do so much thinking and we just don't want to reinvent the wheel. So a lot of what the UK has put into the thinking for their strategy, we'll look at it and we'll say, OK, a multi-stakeholder approach makes so much sense, we should do the same thing. The US is doing the same thing. And I want to add one dimension to it that has challenged government a little bit. In most domains of governance, aviation, for example, government owns the airspace above the country. As a government, I can dictate what happens in the airspace, what flies, what doesn't fly, because we the government own that airspace. The land, the sea, whatever you want, all the natural domains, they existed before companies existed and they will exist long after companies cease to exist. That's not true for the internet. On the internet, the government doesn't own most things. Most of the infrastructure, most of the hardware, most of the software is owned by private companies. So it's unlike soil, it's unlike sea. The second part that's different is that it's not even in your country. The infrastructure that you want to govern is international; that is the nature of the internet. And so when you want to govern it within your geographical boundary, the network topology of the internet doesn't fit neatly with that. So you have to unlearn many of the traditional mental models that governments have had about, I am going to govern my geographical boundary, because you can't do that with the internet. You have to work with other people. And those other people are companies, they're hackers, they're other countries.
There's so much more depth in that conversation. Yeah, no, thank you for bringing that up, because I was going to say I want to make this a little more concrete by asking you both to talk about some examples of triumphs or fails, things that you learned from in policy, or things that are working really well. Pete already started giving one example. But I think it's great to frame it in the way you were saying, Gaurav: how do you make the triumphs so that everyone is being served and everyone is able to come together on that? So yeah, maybe give us some examples, just because this can all be very abstract, I think. Yeah, I mean, I think one of the examples that I'm most proud of, that I did before I joined government, was I got involved in how do we make aviation and aerospace a lot more secure. And one of the questions that we got into was that the relationship between a lot of the aerospace industry world and the research and hacking community was actually quite an adversarial one. So that's one of the reasons that we built the village and started building bridges and communities and things like that. But for me, whenever I'm trying to look at how we make things better, it's: what's the biggest scale we can go to to have the most impact? Because we can fiddle around at lower levels, but the more we can get impact and lay out what good looks like at a senior level, the better. So at one point I found myself at the keyboard writing for ICAO, which is like the UN body for aviation, and they were writing their first-ever global aviation cybersecurity strategy. So that's 193 different nations, signed off by the Secretary General, with industry bodies and everything like that in there as well. And they wanted to get a view of what good looks like for states when it comes to aviation, aerospace, and cybersecurity.
And I managed to get a line in there which was: states are encouraged to set up appropriate mechanisms for cooperation with good-faith security research. Because we've got to start driving this top-down, at scale, internationally, to start normalizing the fact that if you're in your organization, you don't have all of the answers. You've got to be working collaboratively and positively with the security research and hacking community. So we managed to get that line in, get it signed off by a UN body and 193 different nations, and then start cascading it out. So when these states are thinking about their strategies, they've got this at the forefront of their minds. But having the right people in the room when those sorts of discussions are happening, so the policies and strategies have the right people in the room to get these perspectives, is really critical. And we can only really do that by building bridges, being really collaborative, and being really open with our conversations. Yeah, no, I love that. Again, the aviation industry is a much more mature industry in terms of thinking about safety and security, because the consequences of not taking those seriously are pretty disastrous. And the conversations that he mentioned at ICAO, that kind of UN body, are very similar to the conversations that we're having at the UN now as well. So there is an Open-Ended Working Group that discusses the norms of responsible behavior on the internet. It has a long, complicated name, but roughly it's about cybersecurity. Singapore happens to chair it for this cycle, and it's a fascinating conversation because you have 193 different countries. Some of these countries are just getting on the internet; they are trying to figure out what the internet is and what it means to them. Some of these countries are extremely sophisticated, very advanced users of the internet. How do you define baseline norms of security? How do you encourage good behavior on the internet?
For 193 vastly different countries. And so one of the successes and failures is that this is a really complicated process, made even more complicated by current geopolitical realities; the world is not all wonderful and peaceful. So that conversation is really difficult, but it's a really important one, because if you can get these lines in, if you can start shaping subtle behaviors on the ground at each country's, each region's level, you've got a huge opportunity to move forward. And so Singapore does sometimes put up ideas. Like, we have a massive government bug bounty program, very proud of it. It took a lot of effort from my colleagues over at the Government Technology Agency to get it through, because there's this natural aversion of, well, if we're opening ourselves up to the hackers, then it's terrible, they will find all these vulnerabilities. It's better that the good guys find them before the bad guys find them. We all know that; policymakers take some time to understand it. I'm just going to give two examples of, I think, successes. The first one is that we had a huge challenge with passwords, and I think every country does. So what we did, in a typical civil-servant response, was we put up these billboards all over the country: use strong passwords, use long passwords. And like, we told everybody. And so when we did our surveys, we asked people, do you know what a strong password looks like? And like 90% of them were like, yeah, we know what a good password looks like. So do you use them? Yeah. Sometimes I add an exclamation mark to password123 if I really want it to be very strong. And we had this problem every year: people knew what a strong password was, they just refused to use it. Eventually we gave up. We decided, look, as a government, on all of the government websites, we just won't use passwords anymore.
We would just build a big-ass app that does national biometric authentication for the whole country and use that. So it's no longer use strong passwords, use long passwords. It's just: look at your phone, that's it. Just look at your phone and we'll log you into your taxes, we'll log you into your vehicle records, we'll log you into your education records, whatever you need to do, just look at your phone. And this solved so many of our problems. Because if we had tackled the policy problem purely with a policy approach, without thinking of the opportunities that technology could bring to the conversation, we'd keep banging our heads against the wall, like, oh, these users are terrible, they're all using weak passwords, they're all getting their accounts compromised. But when you think about how technology can fix that, there's so much potential in there. Yeah, there's a bunch of things I want to jump off of from what both of you said, but first I'm thinking about how you were both talking about incorporating different perspectives: regional perspectives, different levels of digitization, or a state whose population is predominantly on mobile, all these differences. Can you talk a little more about how we get the right people in the right rooms? How do we do the bridge building? Let's dive into that a little more. It's hard. You've sort of got to get the right person in the room first who says, there need to be way more people here, and this needs to be a much more open discussion.
I mean, you look at some of the work that Beau and Josh have done around I Am The Cavalry: creating amazing communities that talk about and push on, like, medical device security, and then bringing in companies to say, look, we need to talk about this, and creating that safe space to go, this is not going to be us criticizing you, this is about talking about these issues, or honestly, realistically, OK, where are we in the building of the bridge? But a lot of that will come down to getting the terminology right, so that people have the right perspectives when they're going into that conversation. And you and I have spoken about it: I have a real love-hate relationship with the word hacker, because I see it used so many times as a term for bad. I mean, how many headlines do you see saying, hackers have done this? We don't turn around and look at a headline saying all drivers cause crashes, or all drivers are drunk drivers. Why on earth do we let people use the word hacker as a bad thing most of the time? So we've got to get the terminology right, and we've got to have the discussion framed in such a way that it's a positive one, not seen as a bad one. Because then it starts breaking down the barriers a little bit more, and it turns into a productive conversation where you can say to organizations: the worldview you have isn't quite the worldview you think you have. And actually, here are some people who can talk about what reality really looks like and what the adversary sees of your network. It can be as simple as that. Yeah. So we live in an unusual neighborhood. In ASEAN, the Association of Southeast Asian Nations, there are 10 countries. Every different type of political system that you can think of, we have that. Every stage of economic development, we've got that. And each country actually has a different language. So we are a very diverse region.
And yet, we're also the only region to subscribe to the norms of responsible behavior from the UN. We're also the only region that has, for 17 years, run exercises across the region's CERTs, CERT to CERT. And it has now become an operational habit. If I see Log4j IOCs, I will immediately tell my CERT to pick up the phone and call all of the other regional CERTs and let them know, this is what we're seeing. It takes 17 years. It takes regular practice. So despite all the differences, it is possible to bring diverse countries together. But let me share a little bit about how and why that happens and how it works. You have to start by recognizing that countries are coming at the problem from a very different lens, a very different angle each time. Take Costa Rica as a perfect example. Six months ago, I think, they wouldn't have been interested in cybersecurity. They wouldn't be in this room. They wouldn't be at all part of the conversation. And to some extent, when we talk about cyber threats, going back to terminology, the language that we used in the previous era of cybersecurity was state-sponsored threats. Big bad guys going after your country. And if you're Costa Rica, you're relatively harmless; no big bad guys are going after you, so you don't really care about cybersecurity. But ransomware isn't like that. Ransomware just goes after anybody, anybody who's willing to pay. And so in some perverse, ironic way, ransomware has actually democratized the conversation about cyber defenses, about protecting your country. Even more so because smaller countries have less depth. A big country like the US has, I don't know, 100 power grids, 100 different water stations, 100 of everything else. Singapore is tiny. We're like 40 kilometers by 20 kilometers. I don't have 100 power grids. I don't have the buffer and the redundancies to allow seven or eight of my power grids to get hacked.
And so each station, each essential service, becomes so much more critical. And you put those two together, ransomware plus that concentration, and small countries take cybersecurity much more seriously after an incident. And understanding that, framing that for them, making them realize that Costa Rica wasn't targeted, it was just a target: that changes the way that developing countries think about it, because then they realize that they don't have a choice. You don't get to pick if you're a target. And you don't get to pick how, when, why, or where. But you have to be ready for it and you have to be prepared. And then they're a bit more receptive to having that conversation. So I think there are enough crises in the world. We should use them and bring those stories out to the countries that might not realize how important this is, use the right terminology, use the right language, and get them engaged in the conversation. Yeah, I think ransomware is a great example. Recently it's something that, in addition to spurring countries to rethink or reconsider some of their posture, has also spurred a lot of global collaboration, law enforcement collaboration, in cybersecurity and in policy. But how do we reconcile this? Gaurav, you were giving some great examples about the pros and cons in Singapore: on the one hand, you don't have a massive landmass of interlocking power grids to draw on, you don't necessarily have the breadth and depth. But on the other hand, there's a nimbleness that comes from that; everything is so concentrated, and there's an ability to come together and act. There are both poles of that, obviously. So how do we either learn from both things or reconcile both things as everyone is trying to work together on some of these global issues?
I mean, I'm not going to pretend that the experience or the lessons that Singapore has are always right. They may not be applicable for most countries. They work for us, more often than not. We're a small country, and that, as you pointed out, has its disadvantages. But one of the strengths is exactly that we are a small country. Every country has bureaucracy; government bureaucracy exists everywhere. And government bureaucracy is proportionate to the scale of the country: the bigger the country, the bigger the bureaucracy, and the more levels and layers of friction and red tape there are. Being a tiny country actually is an advantage. We don't have multiple cybersecurity agencies, because we were too small to create multiple cybersecurity agencies. But that smallness now becomes an advantage: I don't need to deal with the friction that comes with dealing with multiple layers. And that allows us to move a little bit more quickly, a little bit more nimbly. In some senses, we like to try to be the world's policy lab. If you have a great idea for a good policy, and we regularly do this with the UK and the US, they have a great idea in their strategy and we're like, we'll do that. And we do. And because we're small enough to pull it off, we sometimes get it through a little bit earlier, a little bit faster. We don't claim credit for the idea, because many of the best ideas came from that conversation across different countries and different organizations. But we do have the ability to implement faster. And that implementation sometimes gives us great lessons, which we then share back. Sometimes the policy ideas work out. Sometimes they're a catastrophic failure, and then we just pivot. We have a Cybersecurity Act that's four years old, and we're reviewing it already because we realized, oh wait, we forgot the cloud. We should put that in there somewhere. And now we're thinking, wow, how do you put cloud into a cybersecurity act?
So we look around. What's Australia doing? What's the UK doing? What's the US doing? All of this allows us to experiment a little bit faster. I would like to talk about one other specific thing that we did, which is the labeling scheme. That, again, was a very bright idea from the UK, which had published principles for IoT devices: what are the principles of security that you'd want or expect to have in an IoT device? We took that basic idea and we said, look, I need to change two types of behavior. The first is consumer behavior. Today, people are buying baby cameras because they're cheap; they're looking for the cheapest possible baby camera they can find. Nobody cares about security, because they don't know what it is. The second type of behavior we wanted to change was manufacturers'. Manufacturers are building the cheapest kind of baby cameras they can build, because consumers aren't demanding anything more. So we realized that we could put a sticker on the side of the box with a one-, two-, three-, or four-star rating. No one knows how the rating works; the Cyber Security Agency of Singapore just kind of magically does it. We have a system for it. But we just put that sticker on the side of the box, and suddenly now there's a premium device that's a four-star baby camera, and parents are like, oh, well, I want to buy the four-star baby camera for my princess, because I don't want her pictures to end up on the dark web. And they're prepared to pay four or five bucks more. Once a consumer is prepared to pay $5 more for a four-star-rated baby camera, guess what the manufacturers do? We should build more four-star devices, because you can get a bit more money from the consumers just by not having a default password. And so all of this is behavioral nudges. Again, the UK has an entire nudge unit that thinks about nudging consumer and manufacturer behavior. We just put it into practice and launched the policy.
So we're kind of the world's policy lab. We'd love to have more ideas from DEF CON's policy group if you have any. We're happy to try them out and see if they work. And if they don't, we'll also give you feedback. Pete, I want to turn to you, but I like the idea of having the nudge agency, and it's like a series of nudges towards getting all of this where it needs to go. Anyway, go ahead. Well, I mean, that does touch on it, because this is about people. Right? We're all here because we love tech, but we're also here because it's an amazing community. So if we're talking about people and diversity across multiple organizations, really that's about building trust. How do we start building trust across those diverse communities so we can have productive, proactive conversations instead of transactional, reactive conversations when bad things happen? And it takes a long time to build up trust, especially when, I mean, some of the most challenging collaborations I've worked on are where the two parties are a long way apart right at the very start. And it's hard work to get people in the room. And it's hard work to start bringing people closer together and build trust to the point where they can actually talk to each other. But because they are so diverse, when they get to the point of being able to talk to each other and trust each other, the value of that partnership is huge, because they are now speaking from completely different perspectives: they're able to talk, they're able to learn from each other, they're able to collaborate. And one of the things that you'll see in a lot of the work across the international community, both government and private sector, is large-scale adversaries. You see what's happening in Ukraine at the moment. The international community galvanizing public and private sectors to give as much support as possible, and the innovation, that activity has been fantastic.
It's been building for a long time, with the conversations that have been happening up to that point. But the more we actually have the exercises where we collaborate, and have policies that have some semblance of parity across different organizations, the better place we're going to be in in the future. And part of the challenge, going back to your point about the cloud, is that we all talk about getting to this utopian vision in the future where we've solved everything. We're never getting there. Because whenever you think, in the policy world, hey, we'll get to this point, we'll plant a flag and it'll be great and that's what we're aiming for, the world changes, the adversaries change, and now we've got to plant a new flag. So it's a continual evolution. The quicker and better we can do this evolution collaboratively across all the different stakeholders, the better off we are and the quicker we get to that good position: to reduce the risk to our countries, to reduce the risk to all of our stakeholders, and to build this global commons. So I think we would like to take some questions. We'd really love to hear what you all are thinking and what you'd like to know from Pete and Gaurav. I do not think there are mics. So you might need to... oh, I think I see a mic off in the distance coming towards us. We have a dancing goon with two mics. Yeah, so OK, find a mic if you want to ask a question. I see some hands. Thank you very much for a very insightful session. I particularly enjoyed the global aspect. So, on the last portion of your conversation about the ever-changing landscape and the challenge of changing policy on an ongoing basis, because it's not a one-time thing: what type of techniques did you use to make that communication effective?
Because every time you're gonna change direction, every time you're gonna have to implement a new policy, it's wonderful to write it up and put it together, but how do you get it out there in time so people, whether in the private sector or the public sector, could actually act on it and use it as guidance? Yeah, I mean, so to take a slightly different theme, I think the best way to answer it is that if you're making incremental change, you're putting the flag slightly further ahead of where everyone is at the moment, and then, yeah, okay, we've got a good likelihood of getting there, but it's a small change and we know we're gonna have to come back to it. And there's a massive decision to make about how far ahead you put the flag. How visionary are we gonna be here? The further ahead you plant it, the more it's a proper long-term vision. It's gotta be a big change. It's gonna make a massive difference. So to give an example, the cyber strategy we pushed out in January has a timeline out to 2030, because what we're looking at is long-term change to actually get all of the key foundational pieces that we need at scale. Because as soon as you bring out a policy at huge scale, we're trying to do this huge amount of work in breadth and depth across a huge number of organizations, and it's gonna be a massive change for them. Okay, we've just got to accept this is gonna take a long time. And the way that I always think about trying to get that policy to work is: make sure it's not shit. If it's a good policy and people want to do it because it's common sense, they're gonna come to you and say, I really want to do this. So, for example, we're looking at how we do assurance at scale across the whole of the government. We've come up with what we think is an assurance process that will make life easier for people and give us much better data at both organization and department level, and also at a sector scale.
And now we've got organizations across the country saying, we really wanna get involved in this. Can we do a pilot? Can we get in? So it's: build good stuff, and then the policy's way easier. But you're only gonna do that if you have great dialogue beforehand. Can I add two points? The first is, it sometimes requires you to have philosophical clarity about what the principles are. So for example, in Singapore, we treat access to secure, trusted internet as a public utility, like drinking water. That is a big difference from treating it as a private service. It means that every average individual who lives in a rented flat deserves to have secure internet, regardless of whether they can afford a firewall or antivirus or everything else. That is a principle, and that changes how your policies then evolve over time. The second is, and I wanna echo Pete, the conversation cannot be a post-implementation conversation. It cannot be like, oh, this is the new regulation or the new policy, now I will communicate it to you. As you're designing it, you should be having those conversations with the companies, with the stakeholders, so that by the time it goes public, they're already like, yeah, we were involved with like 90% of this. The wording's a little bit different, but that's fine, we roughly know where it came from. And if you're doing it only at the end, it's too late. So on that, if you're imposing a policy on people, it's not gonna work unless it's really simple. The more that they can collaborate and be involved and feel a stakeholder in it, the easier it is. And one of the other things is that sometimes a bit of vagueness helps, especially in the long term. So in the strategy that we've got, we've got two pillars. One is really good resilience. We all know that's a really good thing, and we can get into the detail of what resilience looks like. And the other one is defend as one.
So how do we better bring all of these different stakeholders and organizations, all these departments, industry, how do we bring everyone together so we better defend as one? Because if we're defending in stovepipes, we will fail, because adversaries don't attack in stovepipes. If we try and defend in them, we're gonna fall over. So how do we defend as one? And people kept on asking me, well, can you scope defend as one? Give me a list of what defend as one looks like. And it's like, well, no, because I want people to start using this within their organizations, within their teams. How do we better defend as one within our team? How do we better defend as one within our department? And then if I'm having a conversation with industry, it's like, okay, so how do we better defend as one together? Because we're all on the same platforms. So trying to get that sort of vision and that culture and that ethos of, right, we're in this together and working together, can be a really nice way of helping people come on that policy journey. The English have an unfair advantage because they invented English. So communication is their strength, unfortunately. So we do learn a lot from them. Not all the time. So there's one more. Yeah, another question. Yeah, thank you. I'm interested in your thoughts on how you scale information sharing, in particular between national governments. I mean, Gaurav, you mentioned picking up the phone and saying, hey, share this Log4j information with these other countries. But that doesn't scale, right? Traditional government methods of information sharing, especially in intelligence, don't scale very well, and they don't move at the speed that's required. ISACs sort of sometimes work; sometimes they're something that companies can throw on their website.
So do you have examples, have you guys seen examples of information sharing arrangements that scale at speed, where you'd say, that works, that's working better than these others? And what's the secret sauce? I'm gonna give you an answer that you're probably not gonna like. And the answer is that it takes hard work and there is no shortcut. Confidence building is incredibly difficult because it takes time. So even within our region, getting Vietnam, the Philippines, Indonesia, and Singapore to come to a point where we're prepared to exercise together and discuss our operational processes with each other, that takes decades, and you can't just shortcut that. Unfortunately, information sharing arrangements will always exist as a function of the level of trust that you have in the other party. If you don't trust them, you just won't share. And no matter what policies you put in place, no matter what structures and processes you put in place, it will fail because the trust isn't there. And if the trust is there to start with, then you can layer on processes informally and they work. So even within ASEAN, we have processes. I wouldn't say that they're super formal, it's not the nature of our region to be super formal about the processes, Chris knows this very well. But it is there because we trust each other, and to reiterate the point, that takes time. All of the things that you mentioned, about ISACs and so on, they're all the right components of the answer, but you have to start small and slowly build up. And I'll give a slightly different slant on that. I mean, we've all got communities around us, contacts and people we know and trust in different countries or different organizations. Whenever something kicks off, you can generally pick up the phone and speak to a friend and go, what's going on? Have you got any really good information? So that's information sharing, but point to point.
And we've got quite a lot of that across different people and friends and things like that. But when you talk about institutional information sharing, formal information sharing, the first people that are gonna get involved are lawyers. So when's the last time that we had a lawyer really coming in to talk about policy and things like that? You're having to diversify who's in that conversation: how do we share information properly, and what does that look like? And also, if I go right back to what's going on in Ukraine, look at the information sharing that's flying around with all of the stuff that's going on there. I mean, there have been some fantastic presentations both here and at Black Hat about, here's what we did through open source, here's what we crowdsourced, here's what we published straightaway as soon as we saw it, to make sure that people got the right information and we could verify it and actually then use it and work on it together, be it across the ISACs or across the CERTs, and actually collaborate for good by using that information. Because we've got to share the information in a way that is actionable, reduces risk, reduces harm. There are ways to do that through the crowdsourcing side of it, but there's also the signal-from-noise challenge. You can do that signal from noise when you've got a friend who goes, you really need to know this. So on the one hand, we've got that and we want to scale it, and then you go to internet scale and a whole bunch of stuff going off. There's so much noise. How do you then distill that down? And that goes back to your point: the more that you have exercised with them in the past, the more that you've done stuff like this in the past. I mean, all of the new connections that have sprung up because of what's been happening in Ukraine, and all of those collaborations, mean that if anything else of that scale happens again, they exist already.
So there is never a good answer apart from just galvanizing and getting behind something where you see the need. And you'll see an amazing number of blockers in place for really good information sharing until something happens, and all of a sudden they magically disappear. We need to get to the point where we can disappear them way earlier, because we can articulate the need. There's a whole conversation buried inside that about liabilities and responsibilities that we could have another two-hour conversation on, because there are huge corporate challenges in sharing information, for good reason, which is why the lawyers are the first ones to come out sometimes. Those challenges deserve to be unpacked, and I think it'll take years for us to unpack them, but those are important conversations for us to have as well. But like you said, in a crisis, fortunately, many of the companies know that they have to do the right thing, and so those impediments do kind of step away. I wanted to take one more question, but just watching the time, I think instead I'm just gonna ask you this: we brought up the word resilience, and both of you mentioned it, and we talked about the word scale. That's the real purpose and the goal in all of this, right, to develop policies that promote resilience. And I just wonder if either of you have thoughts from our conversation today, or from your work: is that always what needs to be driving these conversations, and do you think it is right now? Is that what's in everyone's minds? I think it's in both of your minds. I could try. So, the morning speaker, Chris Inglis, used the word confidence rather than resilience. I like that word. And the reason why that word drives me is because confidence in technology, confidence in digitalization, in progress, is not a given.
Humans have invented and uninvented technologies when they lost confidence in them. I flew here from Singapore. It is a 20-something-hour flight. It is a long flight. In the 1970s, we invented the Concorde. We invented supersonic travel as a human species, and then we couldn't find a way to make it safe. We lost confidence in it, and we uninvented supersonic travel. So now I have to fly 20 hours. It could have been a three-hour flight. We have the capacity to uninvent really useful things if we can't find a way to make them safe and secure, if we can't find a way to give people confidence and trust. If autonomous vehicles come out on the streets and end up killing a whole bunch of families every week, you're not gonna buy an autonomous vehicle. Autonomous vehicles are gonna disappear from the market. So the idea of giving people confidence in technology is what drives us. And all of the vulnerabilities that we find, all of the risks, the issues, the threats that we see in emerging technologies, we want to address them as early as possible so that people can grow up having faith, having confidence, having trust that technology is here for good. There are parts of it that have challenges, but it's here for good, and we should embrace it securely. So that's kind of what drives us. Well, do you have something quick, Pete? Well, I was just gonna say, for me, resilience is about people. We talk about technology and resilience and all that sort of stuff, but look how many people are actually really tired after the last two years that we've had. And we need more people coming into the sector from lots of different areas. And that's why things like DEF CON are fantastic, because we are hopefully inspiring more people to get really excited playing with the villages and things like that. That's the thing that gives us resilience. Technology will always give us challenge and vulnerability. It's the people that give us the resilience.
And I think we've always got to keep an eye on that, because I'd much rather take an amazing team than amazing technology. The team will find a way. The team will give you resilience. We hang way too much off the technology. Yeah. Well, I think that's a great place to end. Thank you both so much for doing this. And if there's one thing I learned today, it's make sure your policy is not shit. Fair. Thank you.