[Garbled welcome remarks.] First of all, we have Camille Stewart Gloster, who is the Deputy National Cyber Director for the Technology and Ecosystem Security Division at the Office of the National Cyber Director at the White House in the United States. That's the shortest job title I've ever heard. Then we have Charlie Gladstone, who is the Head of Software Resilience at the Department for Science, Innovation and Technology in the UK. Then we... Who's this guy? Does anybody know this guy? Raise your hands. Anyone? No, nobody. Dr Allan Friedman, who's the Senior Advisor and Strategist at the Cybersecurity and Infrastructure Security Agency, CISA, in the United States. Welcome, Allan. My name is David Rogers, and I advise the UK government on some security stuff, and I'm also the chair of the Fraud and Security Group for the mobile industry association, the GSMA. So I'm just here to facilitate, and I'm also standing in for Jen Ellis, who unfortunately can't be here. We all send her our best wishes, and hopefully she's watching live now. Hi, Jen. So we're going to split this up, and we would like to have quite a lot of audience participation, so please... somebody's fallen asleep already. So, yeah, you, get off your phone. I woke you up, didn't I? It's going to get better. So we're going to talk about a global approach to tackling software resilience, because software resilience is, if not the biggest problem we've got in cyber, one of them. Networks, yeah; accounts; phones. We'll all be in a job for a few years yet anyway. So what we're going to do is flip through the risks. Well, we'll spend a little bit of time on the risks. Then we'll talk about the challenges.
Then we'll talk about solutions. And throughout this, you can ask any question you like to the panellists. I can't promise that they'll answer them, or that they won't pass them to the other panellists. And yeah, we'll see how we get on. So start thinking about some of the questions that you'd like to ask the panellists. You know, you don't get to meet these people every day, thankfully. But please do, now's your opportunity. So I'm going to start with Camille. Welcome, Camille. Thanks for having me. Hi, everyone. So, what does software resilience mean to you? Yeah, I'm going to start with what my office does. How many of you are familiar with the Office of the... Okay, I'll have to pick this thing up. Can you hear me now? How many of you are familiar with the Office of the National Cyber Director? I love that as I've done this over the last year, more hands go up in the room. But just to level set for anyone who is not familiar, the Office of the National Cyber Director is the newest office in the Executive Office of the President, organised to orient ourselves around national cyber policy: how do we set an agenda for the United States around cyber policy? We aim to do that in four ways. One, drive federal coherence. Two, embark on public-private partnerships, and when I say private, I mean it in the broadest sense: all non-federal partners, state and local, international partners, private sector, industry, everyone. Three, we want to align resources to aspirations, so we've got to make sure the money is there to do the work that we want to do. And four, we want to focus on future resilience. And that's really where my division sits: a focus on the future resilience of people and technology. And on the technology side, which is most germane to this conversation, one of the biggest pieces of that is supply chain security, and of course within that, software supply chain security.
And you know, at ONCD we are really focused on the connection between the supply chain and all of the priorities that we have set as a nation and in this international ecosystem. And we recognise that this has to be a global endeavour. If you take a look at the National Cybersecurity Strategy, you'll see global supply chains called out explicitly, but what you'll also notice is that supply chain security is a thread throughout all of the pillars. It's how we become more defensible. It's how we move towards that affirmative vision we set. It's a piece of the puzzle as we think about shifting market forces and all of those things. And so it's a real priority for us as an organisation. And as we think about software in particular, we've really honed in on the most atomic unit, and that's the code. So how do we make sure we have resilience in the code, that it is defensible? That raises two things. One, how are we thinking about open source software security? How are we focused on the most composable parts of the ecosystem? One of the greatest things about our software is that some really smart and generous people have decided to write code and make it available to everyone. That's something we want to incentivise, but it also means that there are inconsistent standards, that things are insecure, and we really need to focus on making sure that security is part of the model when we develop code, whether it's open source, whether it's proprietary, or whether proprietary code is built on, or includes, open source code. So when we think about that, how are we going to advance the work there? We really want to focus on collaborating with our partners, our international partners, our private sector partners, to understand the scope of the problem and the opportunities.
And you'll see in the National Cybersecurity Strategy, two of the biggest ways we do that are the open source software security initiative and software liability. I'm sure we'll get into that more as we have the conversation, but I just want to highlight those things. Good start. Okay. Over to you, Charlie. Great. Can you guys hear me all right? Super. I might actually steal your idea, Camille, because I'm expecting probably fewer people, but could you guys raise your hands if you've heard of the UK Department for Science, Innovation and Technology? Okay, actually pretty good. I don't blame you if you haven't, because DSIT, as we're known, was only actually created about six months ago. It's a new department based on a combination of what was previously the Department for Digital, Culture, Media and Sport, which is, as you can imagine, a pretty broad department. They basically took the D of DCMS and merged it with some aspects of our Department for Business, Energy and Industrial Strategy to create a new department, whose real overarching ambition is to cement the UK's position as a science and technology superpower in the world. In practice, that's still an incredibly broad remit: everything from space policy, science and innovation, artificial intelligence, but also obviously cyber security. And cyber security is split in terms of responsibilities across a few departments in the British government. Specifically, DSIT is responsible for developing the UK's cyber ecosystem and actually growing the sector, helping to secure the technologies that we use on a day-to-day basis, and also for organisational resilience when it comes to cyber security. So quite a few of those different themes I'm definitely keen to discuss today. In terms of what the UK is interested in, I imagine most of the crowd here appreciates why software security is important.
It's the absolute lifeblood of the digital economy and absolutely key to driving innovation in the tech sector, but realistically, in the modern day, it's also key to all sorts of organisations' productivity. In the UK we are one of the largest consumers of software in Europe, but we also have a really significant developer presence, so I think we're interested in it from all sides. Our work looking particularly at software supply chains stemmed from a call for views that we held in 2021, looking at supply chain risks relating to cyber in general. One of the key things that came out of that was that software supply chains weren't just the biggest risk; they were something that really needed more investigation, a proper deep dive. So in February this year we published a call for views looking specifically at the resilience and security of software used by businesses and organisations in the UK. There we set out some of the risks that we had identified across the software life cycle. We also asked for views on what industry and the sector are already doing to mitigate some of those risks, but crucially, we also asked for views on what the government should actually do about it. That closed in May, and we've spent the last few months listening to and really trying to understand the quite wide range of views that we heard through that call for views. So today I'm really keen to share some of the things that we've heard, but I'm definitely here to listen as well. As David said, I'm really keen for questions, but I would also love for anyone to just share their views on what the risks are in this space, you know, anything really. So yeah, thanks again to DEF CON for having me. Hopefully this is going to be a really great conversation. Great. You can tell that out of this panel, I'm the one that said he doesn't need a mic.
You shouldn't have a mic, I guess. He's feeling left out there, isn't he? One rule for this panel: Allan's not allowed to talk about SBOM. This causes me physical pain. But no, we won't talk about the specifics of implementation unless folks want to get into it. But when you are starting to try to build this, there are a couple of things. One, everyone in the world is now a software company, right? We all know this. We have amazing people here from avionics and automotive and medical. In DC, you say law firms are software companies and they say, don't be ridiculous, and then you say, Panama Papers, and they say, oh, right. So we have this model, and that means that the solutions we advocate have to scale for the entire ecosystem. And that started off as you start an initiative around different approaches to advancing new technology, new paths to transparency, new tools: we all start at the same starting point, so we actually can coordinate. One of our firm challenges moving forward, as we think about resilience, is that we all advance at different paces. So in the cloud native world, SBOM is kind of solved; it's boring. There's a Docker command called sbom. You know what it does? It gives you an SBOM. Now, there are several of you in this room that I've shared the stage with who hate when I use that example, because you know what, that doesn't work for a network-addressable transformer, right? The people that make OT, the people that make the stuff that affects human safety, don't really use that kind of cloud native technology yet. So we need to understand that. One of the higher-level challenges that we have in the policy space is how do we work with the full richness of the software community. And then the second piece, of course: we haven't done a good job of defining resiliency.
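The `docker sbom` command mentioned above (an experimental Docker CLI plugin backed by the Syft scanner) emits a software bill of materials in formats such as CycloneDX or SPDX JSON. As a minimal sketch of what consuming one looks like, here is a hand-written, CycloneDX-style inventory (the field names follow the CycloneDX JSON layout, but the components themselves are invented for illustration):

```python
import json

# Hypothetical minimal CycloneDX-style SBOM document. The top-level keys
# mirror the CycloneDX JSON spec; the component inventory is made up.
SBOM_JSON = """
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.4",
  "components": [
    {"type": "library", "name": "openssl", "version": "3.0.8"},
    {"type": "library", "name": "zlib", "version": "1.2.13"}
  ]
}
"""

def list_components(sbom_text: str) -> list[tuple[str, str]]:
    """Return (name, version) pairs for every component in the SBOM."""
    doc = json.loads(sbom_text)
    return [(c["name"], c["version"]) for c in doc.get("components", [])]

print(list_components(SBOM_JSON))
# [('openssl', '3.0.8'), ('zlib', '1.2.13')]
```

The point of the panel's "it's boring" remark is exactly this: in the cloud native world, getting from an image to a machine-readable component list is a one-liner plus a trivial parse.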
Part of the problem is there's a famous, very popular internet pastime that was famously defined by the US Supreme Court as: we don't know how to define it, but we know it when we see it. And resiliency, I think, fits that model. Remember when they used to let me speak in public? So part of the problem, and my approach at CISA, where we look to our friends at the White House and partner with them on the high-level strategy, is that a lot of the work we do at CISA is identifying concrete things that we can work on and build coalitions around that make a tangible difference. And so of course we'll spend a lot of time today talking about the first step, which is transparency. You've got to know what you have, you have to know what's out there, and that also creates some challenges. So, Camille talked about liability. Accountability and transparency play terribly together, but we need both of them. We need a way to encourage good actors to talk about what they're doing and have visibility, not just from government regulators, but from private contracts. If everyone is in the supply chain, most of us are in the middle. So how do we have that while still saying, you know what, if you don't do the basics, if you don't follow accepted practices, we're going to ask why, and sometimes we're going to ask why with teeth? So that's one of the other tensions that we're going to have to resolve. Okay. So I feel like we're talking solutions already, and we need to rewind a little bit and go: why are we all sat here? What is it about software resiliency that is a problem, and why are we thinking about these solutions? I'm happy for you to give examples or specifics or things that you think are of concern. So Camille's got the mic. Yeah.
Much like what Charlie talked about, we embarked on a journey in 2022 to really understand the open source software ecosystem in particular, as a whole, and see how the federal government should support the evolution towards a more secure environment and more secure software. We established something called the Open-Source Software Security Initiative, which convenes partners like CISA, and includes DARPA and OMB and a number of other partners throughout the federal government, to really think about: what is the work that we're already doing? Where does the ecosystem need support, and what are the opportunities for us in the future to add to the evolution that we're talking about? We went to London and we launched that initiative after we had done this in-depth analysis, and some of the things that we found were that, in addition to malign actors trying to leverage software vulnerabilities to wage attacks, whether those are state actors or not, there were some opportunities, right? The incentives for developers to incorporate security into their development life cycle are just not there. The transparency that Allan was talking about: it is hard to incentivise that behaviour, particularly in the open source context. We need better transparency into the code that's in the environments of every organisation, whether that is a government or a private sector organisation. We saw that in spades with Log4Shell, right? There were a number of organisations for whom the biggest part of the challenge was just figuring out what was in their code base, and folks were scrambling to figure out not only the things that they had incorporated themselves, but what their vendors had incorporated, so that they could remediate it. That's a problem. That's part of what Allan is trying to solve: how do we marry that up with the security vulnerabilities?
We need better data and better connectivity between something like an SBOM and the security vulnerabilities that present themselves, and we need to automate that and leverage it in a way that makes it actionable and easy to implement quickly. And then another thing that we have really focused on, especially in the open source context but in general, is memory safe languages. How do we advance the transition to memory safe languages? That is a huge opportunity for all of our governments to play a role in the evolution of the ecosystem. If we migrate large code bases from memory unsafe languages to memory safe languages, we remediate up to 70% of security vulnerabilities. That is not the only solution, but it is one that we can advance in a very meaningful way. So how do we do that? How do we get creative? Is it infrastructure insurance? Is it the liability structures that we're talking about? Is it some other incentive structure that encourages this evolution and change in the market and environment? So a lot of our work has been collaborating with our international partners, leveraging the insights they gain from their engagement with industry, and our own engagement with industry, like the RFI that we released this week, and the one on regulatory harmonisation. That data, that collaboration with our partners, is really key to us better understanding the space and being able to advance not only things like funding and leveraging the federal purse to shape behaviour, but also the other ways that we can support the ecosystem to drive software security and software resilience.
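The "marry the SBOM up with the vulnerability data, and automate it" idea can be sketched in a few lines. The advisory entries below mimic the shape of OSV-style data but are hand-written for illustration (in a real pipeline you would query a live feed such as the OSV API rather than a hard-coded list):

```python
# Hypothetical sketch: join an SBOM component inventory against a
# vulnerability feed and flag affected packages. All data here is a
# hand-written stand-in for real SBOM and advisory sources.

SBOM_COMPONENTS = [
    {"name": "log4j-core", "version": "2.14.1"},
    {"name": "commons-text", "version": "1.10.0"},
]

ADVISORIES = [
    {"id": "CVE-2021-44228", "package": "log4j-core",
     "affected_versions": {"2.14.0", "2.14.1"}},
]

def flag_vulnerable(components, advisories):
    """Return (component name, advisory id) pairs where an advisory matches."""
    hits = []
    for comp in components:
        for adv in advisories:
            if (comp["name"] == adv["package"]
                    and comp["version"] in adv["affected_versions"]):
                hits.append((comp["name"], adv["id"]))
    return hits

print(flag_vulnerable(SBOM_COMPONENTS, ADVISORIES))
# [('log4j-core', 'CVE-2021-44228')]
```

This is the automation the Log4Shell scramble lacked: organisations with a current component inventory could answer "are we affected?" with a query instead of a code-base archaeology project.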
So I'm going to try not to be too repetitive, and I take it as a good sign that a lot of the issues we've identified were also identified by the White House. In our call for views we set out a full risk framework, looking at risks in software development, barriers to security in open source software, risks around the distribution of software and software resellers and the unique challenges around that, and also around procurement, maintenance and use by the customer organisations who buy and use software. Somewhat unhelpfully, what we've heard is that all of those areas are important and need to be addressed, which is true. But in terms of key threats, at its core, insecurely developed software is still the most important thing. It's the thing that we've heard, from all the people we've spoken to, that they're most concerned about. A lot of the organisations we spoke to through this call for views said that the most significant threat to their organisation's resilience, if not one of the most significant, was software with either accidentally introduced vulnerabilities or intentionally introduced vulnerabilities. So there are a lot of different threats here, but to some extent the most fundamental one remains the key issue, right? We did also hear a lot of the same things that Camille was talking about: memory unsafe languages came up a lot, as did the use of secure development environments. But at its core, the issue that a lot of people identified was, as you were saying, that the market incentives just aren't there to prioritise secure development. And while this is a challenge for software vendors, the core problem is also that customers, the organisations who buy this software, fundamentally only care about two things: how well does it work, and how cheap is it? Security is often a second thought. And if securely developing your software isn't
a thing that allows you to differentiate your product in the market, then it's not going to be something which companies invest huge amounts of money in. So I think it's a really massive challenge, and yeah, we'll get onto solutions, but I think we do need to think more about how the government can influence and address that market failure. So I just want to break that down a little bit. What you're saying is about insecurely developed software, but also obviously the languages, you know, the lack of type safety and the memory safety issues, though we're starting to see the safer languages come through. But surely there's an education question for developers as well? Most of us who've done software engineering degrees know that there were no mandatory security modules, and that's true for most of the software engineers I work with, and I've done a lot of software development, even though there are plenty of secure coding books out there. One of my students, I showed him... does anybody remember Stagefright, the vulnerability on Android in 2015? Stagefright was an integer overflow issue, and I was looking at this secure coding book, written in 2003 I think, and it said, you know, here's how you solve integer overflows; this should be a thing of the past. And yet there we were in 2015. So how does education fit into this? Can I quickly jump in?
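The Stagefright-class bug mentioned above is the classic integer-overflow-to-short-buffer pattern in C: `count * item_size` wraps in 32-bit arithmetic, a tiny buffer gets allocated, and a later copy overflows the heap. Python integers do not wrap, so as an illustrative sketch we simulate 32-bit arithmetic explicitly, alongside the pre-multiplication check that the 2003-era secure coding books were already teaching (function names and values are invented for this example):

```python
UINT32_MAX = 0xFFFFFFFF

def alloc_size_c_style(count: int, item_size: int) -> int:
    """What a naive 32-bit C multiplication would compute (wraps silently)."""
    return (count * item_size) & UINT32_MAX

def alloc_size_checked(count: int, item_size: int) -> int:
    """Refuse sizes that would wrap instead of silently truncating."""
    if item_size != 0 and count > UINT32_MAX // item_size:
        raise OverflowError("allocation size would wrap")
    return count * item_size

# 0x40000001 items of 4 bytes each: the product wraps to 4 in 32-bit
# arithmetic, i.e. a 4-byte buffer for over a billion items, which is
# the setup for a heap overflow in unchecked C code.
print(alloc_size_c_style(0x40000001, 4))   # 4
```

The fix has been known for twenty years; the education gap is that the check still isn't reflexive for most developers, which is the panel's point.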
So I think this is a really important point. When preparing for this talk I was thinking about both the short-term things that the government can do to quickly address this, and also the government's role in the long-term investment piece. As you were saying, why isn't this a core part of all software development courses? I think the UK is making some inroads at trying to change that. The National Cyber Security Centre has a specific certification regime for university degrees that includes core cyber essentials in software development courses, for both undergraduate and postgraduate degrees. And the UK Cyber Security Council is working on the formal and structural professionalisation of the cyber profession, which I think will really help. But I think that long-term view is one of the most important roles that government has, because it will obviously take a long time for all those people who are being trained to make a difference in the actual market; it takes a while for those skills to get through the system. That doesn't mean that we shouldn't be doing it. I think there probably is more that governments can do to make sure that security is almost... but yeah, I think we've already done some good stuff in the UK, and I'm hoping we'll be able to expand that further. I'm so glad you brought that up. Last Monday we released the National Cyber Workforce and Education Strategy, which addresses exactly this. To your point, yes, the government has a large role in driving towards educating developers, or anyone, about cyber security and cyber in general. Our strategy really is to start at the beginning, and to make sure that every person is equipped with some foundational cyber skills that we can then build upon, because we recognise that skills-based hiring and skills-based training are going to be essential to doing exactly this. If, when you became a developer, you
didn't have security as part of your curriculum, that should not mean that your professional evolution does not include training aligned with the ecosystem and the things that we now recognise that we need. So we need to build an education and training apparatus, and a broader ecosystem, that supports you on that professional journey. It is a project for everyone, though, right? We should not only look to our governments to drive this; private sector, community organisations, all of us need to band together to really think about what those pieces and parts are. That said, on the other side, the cyber security strategy says that we should be shifting the burden from the small players to the big players. So in that education journey, we should be thinking about how we put more of the onus on the tools that developers use, building security into the life cycle, the frameworks, the tools that they use to develop code and implement it into a code base, as another avenue for advancing security outcomes. But I encourage you to read that strategy, the workforce strategy, because it talks a lot about partnering with universities and employers to better understand their needs and to drive education outcomes. It talks about accreditation bodies. It is not an easy thing to change curriculum at any level, whether it's K through 12 or at the university level, and so that is a long-term investment that we must make now. But we also, as organisations, must demand these things of our staff and prepare them for them. I mean, if we are going to do the things that we talked about in terms of secure by design and driving towards a more secure ecosystem, then that means we have to equip the people to support that work. And so we recognise that every profession works in the technology space, which is why the strategy is focused on cyber, not cyber security; it is something we articulate very clearly in the strategy, because quite frankly our engineers
are developers; they should all understand what security is and how it can be built into the things that they make, while we still have cyber security practitioners who can focus on endpoint security and some of the other things. So, since they have given the important high-level views, I will say that I know some developers. I'm friends with a few of them. I am not going to say that they are not smart, but I am also not going to say that they are paying attention, because they have so much to pay attention to: every two months there is a new framework, every three months there is a new platform. They are busy trying to keep up with tech. The next generation definitely needs to be included, but one of the things we are doing is, one, how do we make it so that they don't have to think about security? This is the famous shift-left model; this is DevSecOps. The second is: what are the tools that we can bring into the process so that you can't commit code that pulls from something that is known bad, so that the repo that you are pulling from is actually maintained by the security team as well as the dev team? There are things that we can do today as we move towards this. And then again, also shifting to better ecosystems: you can't do a buffer overflow in Go. Actually, I'm not sure; I saw someone try to figure out how, but it was really hard, like, it took a very smart person to figure it out. And that gets back to the model of what we are trying to do on the threats. There are a couple of things. One of them is just known badness, and Camille has talked a lot about vulnerabilities; we know that world, and it's not just the public ecosystems, it's not just the CVEs, there are private feeds. But there are other types of known badness. So resilience is this huge issue in open source. Last Christmas there was a great blog post by a French open source developer who said, I am not your supply chain, because from his perspective he provides software as-is; any other problems you have are your
problems. Now, that's not true for when the US government buys software: it's our problem, and it's our supplier's problem. And so that's the fun tension of thinking about resilience. What do we do about the fact that, for most people, it's a lot more fun to learn to play the ukulele than it is to maintain open source software? We need to make sure that projects won't go away if someone decides to do something a lot more fun. And then one more thing that I also want to put on the table, that's sort of in the known-badness category but a little more subtle, is tech debt. We have got to find a way of wrestling with this, and this is one of those key-for-resilience models: if you're about to buy a product or buy into an architecture, what's that going to cost you? What's your total cost of ownership? Because if what you're buying is already outdated, then you're going to have some real problems down the road, and not even just a security issue; this is a dollars-and-cents issue. Someone's going to pay for it, and who is going to end up paying for it? So this is what was springing to mind as I was listening to this, as a software developer myself. There was a recent tweet that we all saw about this demand that had come down to an open source library developer, saying, you know, there's a serious vulnerability in your library, you need to fix it now, within a matter of hours. And this guy, I think quite rightly, pointed out: I recognise this issue, I'm not saying it's not an issue, but I don't get paid for this, and you're this big tech company and you're using my stuff for free. I think this has been a perennial problem for anyone involved in open source: these companies are essentially abusing open source. Half of them aren't contributing anything back; some of them aren't declaring they're using it. And you know, it's the XKCD picture with Dave at the bottom right, maintaining this vitally critical library. Like, you know, what do we do to compensate open
source developers, or prevent big tech companies from essentially creating their own forks, maintaining them, and then abandoning them whenever they like? How do we deal with that problem? I think those are the points about building things into the tools and making it easier for developers to incorporate security from the beginning, and also the liability structure. I know that it will be complex, it will be tough, and what we are trying to do is make sure that we are doing it in partnership with developers and companies, to understand the implications of what we are trying to build. But if liability falls on organisations, not on developers, it can start to incentivise an ecosystem where they're not abusing developers, where developers are feeding into something that doesn't disincentivise them from participating and from creating these things that the ecosystem depends on. This should help with those market shifts. It won't be the only thing that drives them, but I do think software liability will need to be a tool in our toolkit if we are going to drive towards remediating that kind of abuse of developers and towards the security outcomes that we want to see. Actually, I just want to add to this. One of the other problems that we have with open source is this sort of paradox: it's available to everyone for peer review, but as we all know, the only people who peer review the security bits are usually the people who want to break it and attack it. Is there any way that we can resolve that issue in the future, or is that really part and parcel of what you're putting forward in terms of shifting the market, so that the companies take responsibility for the open source they're using and essentially pay these people? Is that right?
Yeah, I think so. What we need is a structure that makes it very clear who's accountable, and then that will drive where we see that review happening. If it's not the developers and it's the organisation, then they'll do the peer review to make sure that the code is sound before they put it into their code base; they will keep an eye on it, they'll maintain it, they'll know the vulnerabilities they're introducing into their environment. If we don't do anything about that, we'll continue to hope that developers will maintain the code they created out of the goodness of their hearts, or assume that companies will review it. The bigger ones might have the infrastructure and the staffing to be able to review code line by line, but the smaller organisations, trusting this thing that they grabbed from some developer they have no idea about, don't have the infrastructure and the support to be able to do that peer review. So how do we build out an environment that really makes clear where the accountability lies? Then you'll build the infrastructure to be able to handle that. So there are some built-in advantages in the ecosystem as well. There are a couple of dozen startups out there that say, oh, well, how are we going to test our tool and show that it scales, and identify and propose things, and so on. So that's one piece where there are some structural advantages that we can start to use, and there are innovators that are starting to engage in this. One of the challenges we're working on collectively with CISA and the White House is, and you'd be surprised, everyone thinks that government has a lot of money, and they're not wrong, but it turns out it's surprisingly hard for us to spend it in new ways. And so that's some of the work we're doing; that's one piece where we are figuring out how to prioritise. So there is great research by an old friend of mine, Frank Nagle, up at Harvard Business School, that asked: what are the
most used open source libraries? But, like a lot of research, that data came from the data that we had, which was modern applications, so we found out: right, this is what's used in modern apps. CISA cares about web applications, we do, but our real priority is saving lives; it's the fundamental risks that affect all of society. So one of our projects that I'm really excited about: believe it or not, CISA has, DHS has, an office of the chief economist, so our economists are partnering with our vulnerability experts to figure out what are the most important pieces of open source software, measured quite broadly. And that's going to allow us to say: this is where we should be focusing. I also want to flag one thing. I feel in security we don't often give ourselves a little pat on the back, so I want to acknowledge something that we've learned to do. One of the first things I tackled when I joined government, back in 2014 or something like that, was to look at vulnerability disclosure. Those of you who have been coming to DEF CON for a while remember it was not that long ago when that was a controversial topic. Those were fighting words, right? You remember 'no more free bugs'. And this is now a solved problem. Our friends in the UK have openly embraced it; our friends in the Netherlands have been way ahead of the United States on embracing the role of the legitimate security researcher. Governments from around the world and different parts of the US government are all coming to DEF CON. Why?
Because we want to embrace the security research community. So in open source, that is one of those fundamental pieces of resilience that I think we've got solved. Even the most vicious corporate lawyers have at least been yelled at by their PR team: if you sue this hacker, it will go poorly for us; even if we're right, it will not go well for us, we'll lose in the court of public opinion. And so that's one of those areas where we can build on that success, and I think we should acknowledge that that's progress that's been made. Charlie? I think we did look specifically at the barriers for the open source community to high levels of security as part of our call for views, and obviously the first thing that comes up when you speak to people about it is essentially funding, and the need for more funding. And I think one of the things we really wanted to investigate was government as an actual player in the open source ecosystem: a developer that uses open source code, but also someone who buys code which is based on open source components. That's definitely something we're thinking about moving forward: how do we lead by example as a responsible player ourselves in the open source ecosystem? And I think that's really important. The only other thing I'd add is that, while I think we have made loads of progress, and I do want to pat us on the back, I was actually quite surprised during our call for views by how many security researchers we still heard had contacted companies with vulnerabilities and been either ignored or threatened with legal action. One of the things we're really keen on doing is understanding what needs to be completely consistent practice here, and I think vulnerability disclosure is one of them. We're not at 100% yet, and I don't think we necessarily ever will be, but it's definitely something that we want to continue raising the standard on, because security researchers play an absolutely essential role, not
only in building that evidence base, but also in actually holding these companies to account, and I think that's something that we need to understand and respect. I just want to double down on the point about looking at governments and interrogating our own use; that is a big part of the work that we're doing as well, and I'm glad to hear Charlie mention it, because it's really important for us, as players in the space and as some of the largest users of open source software and software in general, to use our purchasing power to drive the market. Getting suppliers to think about these issues is a really important first step towards shifting the market and towards understanding the problem better. I mean, you'll see that in EO 14028 we talked about SBOMs: asking our suppliers to think about how they will leverage SBOMs, helping us map out how we make sure that's a part of our procurement process, and thinking about how we are leveraging our purchasing power to drive the market. So I just really wanted to underscore that that federal piece is a big, big piece, and we should be leading by example; that interrogation of our opportunities to advance this work is really important. So not only our purchasing power, but how can we spend grant money? How can we wrap this as a requirement into grants that we are already giving on relevant topics? Things like that are ways that we are also looking at shifting the market. Okay, and I mean, I've also noticed as a developer where things are shifting forward on parallel fronts. So, you know, where we are putting stuff on GitHub, you can scan that code to see whether you've introduced a vulnerability, and Google, for example, have done some good things with their IDEs to warn you that, for example, an HTTP-only connection is dangerous, or that you're calling the SMS class, that's a dangerous function. So that's on the right track, but of course, as Alan alluded to, we have so much legacy out there
in existing technical and security debt. So for all those factories that have got embedded systems, and those things that are out there globally, because, relatively speaking, we're British and American, we live in rich countries, but most of the world isn't rich, and for those we can't just rip and replace. [unintelligible] It's weird, for me it's bananas, that people sell software without knowing what's in their own software. We also want to extend that to: what's it going to do on my network? Is this ever going to use UDP? What's the maximum size of a DNS look-up that it's going to make? What ports are going to be open?
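Those questions about components and network behaviour lend themselves to a machine-readable declaration that a buyer could audit automatically. Here is a minimal sketch in Python, where the manifest layout, the field names, and the `audit` helper are all hypothetical, purely for illustration; real SBOMs use formats such as SPDX or CycloneDX:

```python
# Hypothetical per-product declaration: which components ship in the
# software, and what it is expected to do on the network.
manifest = {
    "components": [
        {"name": "openssl", "version": "3.0.12"},
        {"name": "zlib", "version": "1.2.11"},
    ],
    "network": {
        "protocols": ["tcp"],       # is this ever going to use UDP?
        "listen_ports": [443],      # what ports are going to be open?
        "max_dns_label_bytes": 63,  # bound on the DNS look-ups it makes
    },
}

def audit(manifest, known_bad):
    """Return a list of human-readable findings for a manifest."""
    findings = []
    # Check declared components against a set of known-bad (name, version) pairs.
    for comp in manifest["components"]:
        if (comp["name"], comp["version"]) in known_bad:
            findings.append(f"vulnerable component: {comp['name']} {comp['version']}")
    # Flag declared network behaviour that deserves a closer look.
    if "udp" in manifest["network"]["protocols"]:
        findings.append("declares UDP use: check amplification exposure")
    for port in manifest["network"]["listen_ports"]:
        if port not in (443,):
            findings.append(f"unexpected listening port: {port}")
    return findings

# zlib 1.2.11 is flagged purely for illustration.
known_bad = {("zlib", "1.2.11")}
print(audit(manifest, known_bad))  # → ['vulnerable component: zlib 1.2.11']
```

In practice the `known_bad` set would be driven by a vulnerability feed, and the network declaration would be checked against the product's observed behaviour rather than trusted on its own.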
[unintelligible]
As part of releasing that strategy, we recognise that the implementation is a very little bit federal government and a lot of it everybody else, to be frank. So how do we coalesce and convene everybody around that? One of the ways that we have done that is that a lot of our partners have already stood up to say, I'm ready to implement this work, I see how important this is. One such organisation is the CyberSafe Foundation, based out of Nigeria but working in seven-plus countries across Africa. And they are going to translate the ecosystem model that we've articulated for the continent of Africa, so that it can be a tool to help developers and cyber talent, as they emerge and as they continue to reskill and upskill, be able to do exactly this: to address the threats as they manifest in that ecosystem, and relevant to that ecosystem. So it might not be that we can just upgrade all the technology, but how else can they leverage the resources at their disposal, and the skills at their disposal, to mitigate the risks in front of them? I feel like there are a number of directions we could take this conversation, and I was just about to ask some questions about post-quantum, but, focusing: obviously there are strategies we can take that are not just rip and replace. We can maybe contain things that we know are sketchy, and maybe monitor them more closely, or validate the input more closely, or whatever. But just concentrating on the end-of-life question here.
So there's all this debate around right to repair and code escrow, and we see, you know, my specialism is really IoT, and you see lots of these IoT companies that were around in 2015, the next big thing, now going bust, and people are going: hang on, I've got all this stuff in my house, what do you mean I can't use it after 12 months? So what about when that becomes a bit more critical? Some of it is critical, right? Some of it may affect lives. What do we do in that space around resilience? Is there a clear answer? Is there a strategy from governments on that, or is it something that's in the too-hard box right now that we need to look at later? I don't know if that's all the way in the too-hard box, but it's in process. I think there are a couple of initiatives that seek to address things like that. The IoT labeling initiative, which is an international initiative, is seeking to make clearer to users what security features their devices have, what end of life looks like, that kind of thing. I think that's more of an awareness tool than it is going to be a solution to the problem, quite frankly, right? Because any logo on a box is going to be outdated, and the validity of that information changes rapidly. But if we can train folks to actually go beyond the logo and read the data embedded underneath, right, that'll be included with that, they'll learn a lot about the devices that are in their homes. They're probably going to do that on the front end. Will they continue to do that? Maybe not. So that can't be the only solution, but it gets people starting to think about the security of the devices in their home. And as people start to realize and see in real time how that can impact safety for them and their families, that'll be an important tool in our toolkit. The other thing I talked about in the workforce strategy is foundational cyber skills for everyone, right?
And if we posit that reading, writing, and arithmetic are the traditional tools in everyone's toolkit to be a functioning member of society, to help you advance and thrive, we say that foundational cyber skills need to be a part of that toolkit too, that everyone should have them, right? And that's not just digital literacy, which is the ability to use the device, turn it off and on. It's computational literacy: the ability to understand, when you turn a toggle on or off, what does that mean in terms of your security and privacy? How does it change the usability? What does that mean for your family? And then digital resilience: the ability to adapt to a changing technological environment, so that as there are new features and new risks, you are able to keep up. Not to make you a cybersecurity expert, but, just as you assess the risks to your physical security, I want you to assess the risks to your digital safety and security, right? So you, Charlie, you might walk down the alley because you know karate and are willing to take that risk. Bring it on, right? But for me, I don't know karate. I will not be walking down the alley. I will be walking down the well-lit road, right? I've made a risk calculus, a decision based on who I am, what skills I have, and what I know about the environment. I want people to do that in their digital lives as well. And so that's what those foundational cyber skills will equip people to do. So those are tools that we are using to help with problems like that, but they're not going to be the be-all and end-all. And they also aren't a solution that manifests in an outcome that we'd like to see today, right? Yes, I completely agree. I don't know karate, but I'd probably be running rather than anything else. So the UK is taking a very similar approach.
We have a cyber-aware campaign, but also we do lots of training in schools, in partnership with private organisations, to try to make sure, particularly through the educational process, that people are cyber-aware, and security is now being taught in schools from a foundational level. And I think that's great. Regarding the sort of end-of-life question: so, you know, the UK earlier this year, the Product Security and Telecommunications Infrastructure Act, the PSTI Act, well, it received Royal Assent, but we won't get back into the full question of what that means in the British Parliament. No, no, let's reopen the constitution today. If you do want to hear about that, we talked about it at reasonable length at BSides a couple of days ago, so I can send you a link. We got through quite a few names of monarchs, actually, didn't we? Yeah, yeah, we did. Anyway, so it received Royal Assent in April this year, and it will be coming into force in April next year. One of the three core requirements it has for consumer IoT manufacturers is that they have to communicate the lifespan of the product to consumers: they have to give an indication of how long it will receive updates for and when it will be, sort of, closed down. Which I think is a real success, and something that we are interested in looking at in other circumstances. I don't think we're at the point where we're considering legislative options yet, but I don't see why that shouldn't also be a consideration for software more widely, if it's something that consumers or organisations are relying on. But that doesn't solve all the issues. A company can tell you that they'll update the product for ten years, but if they then go bust the next year, that doesn't necessarily help. And I think that is a real problem which we are, I think, still trying to grapple with.
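That lifespan-communication requirement is easy to imagine in machine-readable form. A minimal sketch in Python, where the product record, its field names, and the helper are entirely hypothetical (the PSTI Act requires the disclosure; it does not prescribe any such format):

```python
from datetime import date

# Hypothetical product metadata of the kind a manufacturer might publish:
# a defined support period for security updates, stated as an end date.
product = {
    "name": "SmartCam 2000",  # illustrative product, not a real one
    "update_support_until": date(2027, 4, 29),
}

def still_supported(product, today):
    """True while the manufacturer still commits to shipping security updates."""
    return today <= product["update_support_until"]

print(still_supported(product, date(2025, 1, 1)))  # → True
print(still_supported(product, date(2028, 1, 1)))  # → False
```

A retailer, an insurer, or a consumer app could run exactly this kind of check at purchase time, which is the "go beyond the logo" idea from the labeling discussion above.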
So I'm sure that in the B2B case a lot of companies already require code escrow, because they're worried about partners going bust and so on, but it's the B2C space, the consumer space, where people are just left high and dry. David, I'll ask you a question about code escrow, because I had a great conversation about this over lunch. So code escrow, in case you don't know, is: before I sign a major contract, I won't demand your source code, because you're reluctant to hand that over, but you need to put it in a vault, so that if you go bust, I can access it. But here's my question, which is, for anyone who's ever looked at someone else's code: is there a successful case study anywhere? I don't know the answer, but it makes a great research question for anyone, to find successful examples of code escrow actually leading to sustained maintenance. There is one, actually. There was an IoT product that I think Sony took on, but it was the original developer, when Sony dumped the product, who stepped back in. So that's maybe a little biased-sample issue. I thought you were going to ask me a different question; I thought you were going to say cryptographic keys and certificates. Oh no, I'm not touching the tension between right to repair versus signed updates. Yeah, but not even just signed updates, I'm talking about: the thing just won't compile, it won't run, and the services aren't there to be able to make this thing live again. And nobody's looking at this problem. At least, as an IoT person, and I'll be honest, doing the code of practice, the UK code of practice for IoT, we did put that in the too-hard box, because there were more fundamental issues, like no default passwords. Does anyone in the audience have a solution to this? [unintelligible] I'm on my phone, I'm on Twitter.
So, I want to share two things, because, again, when a problem is too hard to solve at one level, you look around for what your other levers are. And to make this international, I don't speak for our friends at the Ministry of Internal Affairs and Communications in Japan, but I do want to highlight something that they did, and one of the surprising findings. So, four years ago, and this is why I love giving a global view, by the way: there are innovators everywhere on the planet, and so we should all be learning from each other. That ministry obtained the ability to scan the Japanese domestic IP space for home internet users, and their plan was: we'll find all these vulnerable devices, and we'll have a way of notifying their owners through their ISP, so it won't be the government coming, that's not what they said, but yeah, it won't be the government coming and interfering with them. But the surprising result, which was obvious in retrospect, is they found very, very few vulnerable devices. Now, Japan's a pretty connected country, so why was this? It's because we've accidentally built a really powerful and almost flawless firewall for every internet user in certain areas. Anyone know what it is? The gentleman in the front. Yes: network address translation. We failed at the IPv6 transition, we're still using IPv4, so now everyone sits behind NAT. For the safety-critical stuff, for medical devices, that's where we actually need to look, once you're directly on the internet. This is why Mirai affected security cameras, because those were connected directly to the internet. And so one of the things we need to do is look at our large-scale protection systems, the SASEs and the WAN firewalls and things like that, and ask: how do we know that they're doing their job?
Because, guess what, over the last couple of years there have been a lot of very high-profile vulnerabilities around exactly those. So let's look at the network level and the appliance level as an alternative model for this. OK, so then, resilience further up the supply chain, and I guess accountability is the next question. We don't have a global model for this, and a lot of those suppliers and those products are in, say, China, for example, or other parts of the world where we don't have any real jurisdiction. We can deal with maybe the importation, but how do we ensure accountability to create resilience? Does everybody have a view on that? I can go first on your question. I think there are lots of different aspects of accountability, and one of the things we've really been looking at in the UK is mechanisms for accountability that are actually already there but just might not be being used enough. One of the key ways in which suppliers of software can be held to account is by their customers, through the contract. But a key challenge there is that, when we're looking at organisational resilience in the United Kingdom, the significant majority of organisations are small and medium-sized businesses, often without a dedicated security person at all, let alone serious resource to dedicate to really looking into this. Often the people who are actually making these decisions and agreeing to contracts are procurement professionals, and for them cyber risk is one of a great number of risks which they need to take into consideration. So the question really is: how do you empower those businesses to actually hold these companies to account in a way that is contractually viable, as it were?
So one of the things that we've been looking at is whether you can develop standardised clauses which any company could essentially take and put into their contract, which would at least set a baseline standard of security expectations in any relationship. But I think there are key challenges here in terms of just the basic ability of procurement professionals and small businesses to interpret and understand the issues. Even small businesses are often using more than 100 different kinds of software, and that is a real challenge. So I think we do also need to think about how we, as government, can upskill small businesses, whether through funding or programmes or otherwise, in a way that actually lets them hold software vendors and software developers to account through the contract. But I do think there is another aspect here, which goes back to transparency. Even if you have a company which has the skills and the resources and the expertise to ask the right questions of their software vendors, quite a lot of the time they're just meeting a brick wall: they're not being given the information they need. And we have heard that, particularly with the larger vendors, things have been getting better in the last few years, and they've been a bit more open about their security practices and how they securely develop their software, but I think there is clearly a long way to go, particularly with smaller and medium-sized software vendors. So I think there's a big question about how you empower people to hold companies accountable, but the transparency question is also key, and I think you need both the accountability and the transparency if we're really going to raise the standard of software security across the board. I think where you leave off is where the government needs to pick up.
One of the things that I mentioned about the National Cybersecurity Strategy is that we're trying to shift the burden from the small players to the big players. It should be that our cloud service providers, that our large software vendors, are held to account to make sure that code is secure, to make sure technology and hardware is secure, and they should be thinking about how all of those things come together. And so a lot of the work that we're doing is picking up where that leaves off and driving towards how we hold those big players accountable. Some of that is by starting with our own infrastructure and the things that we procure, but that's also where liability frameworks and some of those other things come into play: holding them to account and engaging them in this process. That ongoing conversation has been a big part of shifting some of these bigger players to act in the manner that we have wanted them to. So we engaged very directly with a number of companies on the open source software question, particularly around memory-unsafe languages. And recently you probably saw that Microsoft announced that, over the next ten years or so, they'll be migrating their code base from C and C++, memory-unsafe languages, to Rust. That's a huge win, and it started just from engagement; we didn't even have to get to the point where we were building out liability frameworks and that kind of thing. And so this shift that we've got, of engaging ecosystem partners to be part of the ideation and the solution, and having these conversations, especially with the bigger players, is starting to drive and catalyze the kind of action that we want to see. The other venue I just want to highlight is international standards bodies. We really need to engage more in international standards bodies, because that's where a lot of the frameworks that folks are using start.
And that is where a lot of our engineers and developers are getting some measure of training, right? Talking to their peers in these forums. And organizations, small, medium, and large, should be encouraging their security teams or technical teams to participate in standards bodies, to both share and learn as those standards evolve. And so that's another opportunity for us. Hi, a question about international standards bodies. Standards bodies like the W3C have IPR policies where you have to covenant not to invoke your IP against people who implement standards. What about comparable covenants to say that, if you are given a right through standardisation, such as an anti-circumvention right or a tortious-interference right, that would allow you to silence a security researcher who comes forward with a bona fide claim about a defect in a product covered under the standard, then as a condition of membership the SDO can force vendors to covenant non-aggression against security researchers who make those disclosures? I wonder if that's going to happen. I was actually just thinking about the W3C, because I was involved with the W3C for a long time. And I remember Thomas Roessler, who was there, encouraging somebody to come and help out, because this guy who worked for a big security software company had highlighted all these issues with HTML5. And he said, come and join us, come and help us, because we really need more security people. And he went, 'I don't do standards.' But on that point about pulling those rules in, I think that would be really, really helpful. And actually we're just trying to get to the point where we're getting CVD specifically for standards. So, for example, 3GPP and NC have CVD schemes in place now, which is great. The one thing I'd say is, I like that idea. We had a fun discussion about how CVD is now a very common practice, not universal, still some edge cases, but we've made a lot of progress on that.
The one challenge for things like widely implemented standards, especially if it's a flaw in the standard itself, is that multi-party CVD is an increasingly common use case. There's a very good guide that came out of the Software Engineering Institute, and work at FIRST.org that we're involved in, but that's something where we want to help promote the muscle memory, and it's sometimes contentious. So, for example, a couple of years ago, let me phrase this as neutrally as possible, there was a chip-based vulnerability, and the folks that found it and the primary vendors had to figure out who to have inside the trust circle, and they made some decisions about some of their downstream customers, the ones shipping it, but not certain governments. And there was a lot of kerfuffle, and I don't know if that's an area where there's a single right answer or wrong answer, but as we think about this, and this goes to the bigger-picture resiliency, the easy ones have been solved, and so now we're dealing with these more complex issues, where we may have to make situational judgments. So, I think, if I can zoom out a little bit from that and talk about governments' involvement in international standards generally, I do think... Well, Charlie, let's just finish the CVD part, because I do want to elaborate much more on standards, but I think there's a really important point here, isn't there, about basically treating security researchers with respect, and fundamental issues of software resiliency when it comes to the standards themselves. So, there are lots of gaps; we probably both know a lot of bodies are buried in IETF and W3C, not just in this building.
So, for example, you talked about multi-party disclosure. In the mobile industry we wrote an industry-wide CVD scheme, and I think it's the only industry that has one like that, so security researchers, instead of having to go to each company individually, can come to the GSMA when it's seen as an industry-wide issue. And what we've seen is that a lot of those are actually standards-based vulnerabilities. But what we've also done there is facilitate, because those vendors and companies often didn't have a clue how to deal with security researchers, so half the time it was about creating that relationship and saying: look, don't hammer this guy, this person is coming to you in good faith. And I'll also say that's where governments can play a role. My wonderful colleague Elizabeth Cardona went off to say something much more interesting than us, but CISA has a team helping with coordination as a last resort, with a particular focus on safety-critical systems. We run the ICS work, and we work very hard on how we get things fixed as quickly as possible, and I love David's model, which is respecting the researcher but also taking the view of how we are actually protecting the humans. We can trail our CVD panel as well. The CVD panel, look it up on the schedule. So, to go back out: are we going to continue on CVD, or are we going to move on? I was just going to make two quick comments.
One was, in an effort to advance that as well, NIST is doing a lot of work to make sure that security researchers are respected in international bodies. But also, we wanted to elevate the collaboration and the visibility of the importance of security researchers, so we hosted our version of Hackers on the Hill, Hackers at the White House, and had a bunch of security researchers come in and learn more about what we're doing, so we can learn more about the challenges that they're facing in these international standards bodies and otherwise, and so we can start to ideate on how we solve that and how we can advance rules that will help support them in that environment. Cool, OK. So, Charlie, I cut you off, but I cut you off for good reason, because standards is a favourite topic at the moment, right? Everyone wants to be in standards; five years ago, nobody wanted to be involved in standards. So we have all of these standards bodies around the world. Charlie, what's the answer? It's simple. I think the UK perspective is, unsurprisingly, that these are global issues and they need global solutions, and I think standards bodies play an absolutely essential role in that. During our call for views, one of the things we did consult on was whether the UK should support the development of new standards around software development, and I think generally the view we heard was that the standards are largely out there; the problem is that people aren't actually using them. And there's that classic xkcd comic, again, a second time, there's a reference, there's always a reference, about there being 14 international standards, and someone says, we need to create a new standard that brings together all these standards, and the result is that there are 15 international standards. And that's definitely something we want to avoid.
So I think one of the real values of standards is in encouraging that sort of consistent approach that's interoperable between different countries, and any of the solutions which we're potentially looking at from the UK perspective, we would want to align wherever possible with the appropriate international standards. Which standards body you use is another question, and it's a hard one, but we're really keen to work with other governments. With the US, we've been working with the Singaporean government on this, and a few others, but really to identify which of those standards are key for getting the fundamentals right. Software development, and secure development, is the most important thing that we need to get right, and getting that international alignment is going to be essential, and I think standards are the main way of delivering that. I'll just say that that work should be complemented with the bilateral and multilateral engagement that Charlie just highlighted, because there are so many standards bodies that it can get confusing, and you don't know where your partners are aligned if you're not engaging with them directly on their direction and the things that they have prioritised, so that's just a really important complement to that standards organisation work. Sorry, I guess the thing I did want to add to that is that the role of governments is to add extra authority to those standards by telling people, you need to look at this, and that's really where that symbiotic relationship properly comes into its own. I'm going to tie things back to SBOM. One of the things, and again this is in the National Strategy Implementation Plan, is to explicitly partner with our peer governments around the world to make sure that we're all on the same page, because the last thing that people want is to say, alright, here's the American solution and here's the Japanese solution and here's the European solution. 
Also, we have a National Standards Strategy which embraces industry-led, consensus-based standards, and there are a bunch of buzzwords in it, but I do want to put a flag in there, which is that, especially for the evolving security space, we need to be careful of premature standardisation. Even in the IETF, which embraces rough consensus and running code, guess what? It's like two vendors that are driving it, and not even two vendors, it's like two very excited people in a tiny corner of a vendor. So this is where we want to say: let's show that it works, show that it can scale, and then make sure that everyone can weigh in as we fine-tune it. That's one of those areas where nothing in security is ever easy. OK, we're going to continue on standards in a minute, and I think this question leads right into that. I was going to call it something that we can't call it, but anyway, think of the standards body you most hate, and then we can come back to that. So, gentleman, I was going to say, gentleman in the grey shirt... Sure, sure. So speaking of international standards, we have the RED coming, we have EU harmonisation, which also encompasses the EU Cybersecurity Act. These dates are imminent, within months for some of them. I was wondering how you were following those policies and how they should be ingested. Was this the Cyber Resilience Act? Well, it's the EU Cybersecurity Act, and that feeds into the harmonisation work, and then also we have the RED out there, which is even more... So, basically, we've got a ton of policies that have come through from governments, and this is overwhelming industry and individuals, who have to deal with this and answer vendor questionnaires and all sorts of stuff, from Europe, from America, wherever. And what was the question? Do you realise... 
Well, from a UK perspective, in terms of the burden: in 2021 we published a higher-level strategy called the Plan for Digital Regulation, to set out our overall plan to make sure that, while there are lots of new regulations in cyber, competition and online safety, we are looking at the picture as a whole. We also need to make sure, especially because a lot of these regulations are affecting the same companies, that we're not creating unnecessary duplication, that we're not making it overly burdensome, and the international aspect of that is absolutely key. So, in all of the policies we're looking at in the UK, that question of international alignment, or at least international interoperability, is absolutely at the centre of our thinking. We don't want to create unnecessary red tape, despite the reputation that most bureaucrats have. And at its core, I think we share the same objectives: a lot of the outcomes that the EU is trying to get, a lot of the outcomes that people in the US are developing. We are going to have different ways that we get there, but from the UK perspective, one of the things we really want to prioritise is outcomes-focused regulation and outcomes-focused processes which allow vendors and any other organisations involved to take their own approach to delivering these solutions. As much as possible, that's what we're trying to do, so international alignment is a key part of that. Yeah, as we develop a lot of these standards and our path forward, quite frankly, we have made sure to not only include our private sector but include our international partners, and to share the lessons learned and best practices, in the hopes of all of us driving towards similar solutions and outcomes. So, for example, we just had the one-year anniversary of the CHIPS Act. You're seeing CHIPS Acts pop up all around the world. 
We have been in direct engagement to share what we are learning about the semiconductor space with our partners. That said, I think the point that Charlie just made is a really important one: we will have different pathways for getting there. We will choose different models in hopes of reaching the same or similar outcomes. I'm not exactly sure that any one person or government can solve for that, but that continued bilateral and multilateral engagement, and our engagement in these broader bodies, has been the best path for us to share those best practices and lessons learned and our path forward. But what tends to happen is that one government is going to move out on one thing while we're focused on another thing, and then someone gets ahead of someone else, and I don't think we'll ever get past that. But what we have been doing, and are very aware of, is talking to our partners about how we then mitigate the pain on industry, and so I think you've seen more bodies designed to bring industry into it, so that even if it's them raising, hey, I have this new obligation over here, how do we marry these things up, that's getting into the process. I'm hoping the pain is reduced, but I just want to be frank that it will take time, because we will inevitably, in certain areas, have different plans for how to achieve the same outcome. If I can just very quickly add, I think this isn't just to be supportive of industry, it's a strategically desirable outcome for us as well. The more inconsistency or duplication there is in the regulatory environment, whether domestically or internationally, the more companies can, to some extent, just pick and choose which ones they want, right? So minimising regulatory arbitrage benefits us as well as you. Yeah, really quick, I'll just say, we have a regulatory harmonisation RFI out right now. Please respond to it. 
These are our opportunities to advance a better understanding, yes, for industry, but more so for the entire ecosystem. The one other point I'd make, in terms of progress that I've seen in my short tenure in the US government: we have chunks of government whose job is to work with other countries, and for a while they were pretty stovepiped. They were diplomats, and they were only engaging at a very high level. Increasingly, they're now bringing the technical experts along with them, so when we meet with our counterparts we actually have engineers or security experts in the room who can say, okay, that's great, but here's a specific issue that you're glossing over, something like that. So that's where we're trying to make progress on the process side. And just to brag a little bit about the UK's IoT work, together with the US as well: I think the countries that worked together to basically defragment what good looks like in IoT standards set a really good model for how we can work in the future, because it is really the opposite of that XKCD comic that you mentioned, and we actually did manage to achieve a lot of harmonisation. But sorry, you've been waiting very patiently there for the question. No, that's perfectly fine, but it's a question in a completely different direction, I hope that's okay. We're happy to go in another direction. NFTs, is it? No NFTs and no AI. We're talking about resilience. As we're talking about resilience, covering from ICS all the way down to SMB: the harsh reality for most organisations is that they are being hit by open RDP. They are being hit by poor adoption of MFA. They are being hit by difficulties patching existing code. I would love to hear your perspectives on what we can do to help these organisations. This is not about asking them to meet another standard. This is not about pointing them to, as much as I love them, memory-safe languages. 
How are we helping the proverbial dentist that needs to fix RDP? This is practical, on-the-ground stuff they can do. Actually, we talked about agri-tech, and I looked at some British stuff from the National Farmers' Union, which the NCSC had written. It was basically generic advice about MFA and stuff that was too technical, I think, for any farmer that I know, and that's a mate of mine. I don't think it would achieve the desired objective. I'm kind of interested as well to see what practical stuff we can do to protect retailers and small businesses now. It's a really tough question. I think part of it needs to be industry-led solutions. We as government need to be really encouraging industry to create new ways to reduce the complexity and to make it easier for organisations to secure themselves. As part of our call for views, we consulted on whether the government should even be looking into trying to fund these industry tools, to create the market ourselves. I think we had mixed reviews on that, and we'll see where it goes. The point is that technology can often really be a helpful solution, and guidance is useful, but people need to actually read it and people need to use it. Obviously, as we talked about before, these are often small businesses operating on an absolute shoestring, none of whom are cybersecurity experts. This does go back to what Camille was talking about before, about the shifting of responsibility to companies who actually have that resource but also have the more centralised control. In the UK we really agree with the point made in the US cyber strategy that we need to be giving responsibility to the people in the ecosystem who are best placed to actually deal with the problems. 
Immediate solutions are hard, and I don't think we're going to solve them overnight, but I do think the more we can move control over that, in terms of maintenance and use, to the companies who actually develop and maintain the software, the better it'll be. Obviously there are cyber challenges which are even outside of that. You can have the most secure software ecosystem in the world, but still, in the UK, the majority of cyber attacks are phishing and simple things. You can't solve all of that, but a big part of it is training, and a big part of it needs to be empowering organisations to actually know what they need to do and to take those basic steps. So, what does CISA have in common with 90s metal fans and the Lockpick Village? We love tools. Right, so we have some lessons. We can look at the past, where things that used to be very advanced had to be described in very fluffy, non-actionable guidance: threat intelligence, static analysis. Anyone remember what static analysis used to be like ten years ago? Here's a giant list of things that are completely irrelevant to your product. Now we have the tools, and the entire marketplace has shifted to: we will help you parse which ones are important. So this isn't a perfect approach, but we do have to acknowledge that things start rarefied, being used by very resourced organisations, and then everyone looks and says, well, yeah, I can make money selling to a bank, but you know what would be even better? Selling to 10,000 medium-sized enterprises. So it's understanding what's the policy part and where's the engagement part, and, I know trickle-down is the term, but actually helping these things scale and integrate and be part of what companies are already spending money on. 
Yeah, to echo that: the shifting of burden, education and training, all of those are some of the longer-term solutions. More tactically, one of the things that I found most frustrating when I worked at the Department of Homeland Security was how few organisations understood the resources out of what was then NPPD, now CISA, to support small and medium businesses, and businesses of any size really, but small and medium businesses. There are educational materials there, but you could also get actual support to harden your networks and to remediate an attack, and that's something we have been working really hard to make sure organisations across the ecosystem and the landscape know about. There are similar resources in other countries that really are there to support all businesses, but especially the small and medium businesses that don't have the infrastructure themselves to be able to address the attacks that they're seeing. More tactically also, we have a small business cybersecurity summit in October; through the SBA they're doing a lot of work to educate small businesses; we are quite literally, as a government, on a road show to make sure that small and medium businesses feel seen, are tracking this issue in general and know where their resources are. We've been engaging philanthropy in building out tools that can be more widely available, and so we're getting more creative about how we engage all the members of the ecosystem, right? We're convening philanthropy, we're convening the non-profit sector, who often provide a number of tools for small organisations. 
How do we make sure that they're getting the support they need from the large organisations, or have the tools they need to be able to make sound, secure and defensible purchasing decisions, so that when they then build this other tool that supports all of these small and medium organisations, particularly in the non-profit sector, they are buying something that has already contemplated these issues? You'll see in the National Cyber Workforce Strategy we talk about fractional hiring: how do we share talent and resources such that an organisation that cannot afford to have a cybersecurity resource on staff all of the time can have one fourth of their time, while this other organisation has one fourth or one eighth of their time, making sure that somebody has access to the information and the skill sets they need to harden their systems. I'll just highlight those things, but we're trying to get creative with how we reach out. There are so many small businesses, and this is not their top priority, right? We have a mom-and-pop shop who is just trying to serve your neighbourhood, and you're not thinking about this. But there are a lot of great programs coming out of academic institutions where one of the ways they're training students is to go to those stores and help them, you know, implement multi-factor authentication, help them make sure they have all the security settings turned on in the programs they've deployed in their stores. Those are great opportunities not only for training and skill development for the students, the people who will soon be in our workforce, but they also support small businesses. So some of the ecosystem models that we're building out through the National Cyber Workforce and Education Strategy are designed to create an ecosystem that feeds itself and supports organisations like that: small and medium businesses, community organisations, etc. 
I have noticed, you call it trickle-down, but I'd maybe think of a different term, because I don't think it's trickle-down. It's kind of like a top-down direction on, you know, security by design and product security measures being put into all products that are sold into enterprises and end up in those small businesses. So I guess that's the trickle-down, but maybe there's this gap in time when they can't afford to replace it, and that's obviously the money issue. The who-pays issue is always the issue, the same with rip and replace in mobile networks: there's a lot of burden on industry to create resilience, and a lot of burden on small businesses and consumers, who are stretched at the moment as well. The gentleman, Peter, in the blue shirt. Great, thank you, and this may be outside the scope, but if so, just feel free to gloss over it. Since we're talking about resilience, I'd like to back up to the real world. Is anybody looking at... We're technologists, we like to have technological solutions, we like to make our technology better, we argue about different ways we can do that, but when it does fail, is anybody looking at, or trying to recommend to companies, that they have backup plans for when these things fail, for resilience? Like, literally, is there an analog solution to this digital problem, one that we used to have and we used to know how to use? Like, I Am The Cavalry did that report where the billing system went down, employee records went down, and they didn't know how to do paper records anymore, they couldn't do that. So it was already kind of non-technical: do we actually suggest non-technical resilience solutions? So some companies have had that happen, haven't they, where they had to go back to paper and then realised that maybe they should have done that exercise. Yeah, absolutely. I mean, if you look at the National Cybersecurity Strategy, it focuses on defensibility and resilience, right, the ability to bounce back, and you've probably seen this a lot in the election space, where the 
guidance that's going out, especially the best practices, is really focused on: should you focus on paper ballots, should there be a backup where folks can vote through the machine but there is also a printout so they can verify their votes, right? So yes, there are definitely efforts to focus on that resilience piece, because inevitably even systems that have been designed with security in mind from the very beginning, and privacy in mind from the very beginning, will see an attack, will be compromised, right? And so when your defences are down, how do you get them back up? Some of that is an analog solution. So how do we think about all of those things, and how do you have a plan for your organisation that includes the entire C-suite, so that everyone is engaged and involved? Those are the parts of the conversation that we're having, not only with the small and medium businesses but of course with the larger ones as well. So I'm not sure about specific rules around having an analog solution, but in the UK we have the Network and Information Systems Regulations, the NIS Regulations as they're normally known, and what those really are is rules which focus on critical national infrastructure and organisations which are foundational to the operation of the UK. Those regulations do have a stricter set of rules around how organisations, which includes all the major CNI sectors but also digital service providers, things like cloud computing, which we're looking to bring into scope very soon, have plans in place in case of everything from small outages to pretty catastrophic risks. So broadly our approach is to prioritise the areas that are most critical in that space, but I think it is a really interesting question how organisations which aren't thinking about that in the way they have to in CNI deal with it. I think it's a real challenge. Just two other data points on how this is happening today and 
how we need to expand it. One: five years ago, post the Mirai botnet, the Department of Commerce published what it referred to as the Botnet Report, on how we think about resilience against massive-scale automated malware, and one of the things it recommended is, hey, if your network is hit, either directly or indirectly, have binders: the phone numbers, these are the IPs, this is, if you have a secure facility, how you are going to reboot things, the less-used details that you need to have at your fingertips. The second piece we've seen, as more and more organisations tabletop against ransomware, is again having that explicit behaviour where the executives become aware through engaging with this. And I love that there's now a movement to bring in influence from role-playing games, which I think a lot of us enjoy, to make tabletops more interesting, more engaging and more real. Nothing like a couple of d10s to help you figure out whether your decisions actually have an effect. That also gets people thinking through: what happens if I lose this organisational capacity, how will I stay afloat, how do I prioritise? And that speaks directly to the concept of resilience, which is bend but not break. I look forward to the invite to your organisational resilience one-shot. So actually, that goes back to our previous discussion about IoT and the sort of death of those IoT companies, and, again, back to consumers as well: they're the victim of a lack of software resilience at end of life, or when that company goes bust or whatever. Do we need to be preparing users for that, imagine-a-large-cloud-service-provider-goes-bust type situation? When we wrote that IoT stuff, for example, I couldn't believe that we had to write in there that a door lock should still operate, because some of these companies are just doing this stuff on the bench and assuming, I call it the Silicon Valley problem, which is that, oh, we've always got 
internet, right? And actually, a lot of people do have power outages, repeatedly, especially if you live in Wales. But I think this speaks directly to this mission we have of secure by design, secure by default. There are some things we should keep from the 90s, the kids need to know about ska, but consumer empowerment as a primary technical policy philosophy didn't work. We've successfully taught a decent chunk of consumers to do one thing for security, which is look for the little lock in your browser, and five years ago that stopped being relevant with Let's Encrypt. It's not that Let's Encrypt is bad, but now that lock no longer means what we taught people it conveys. So if our strategy depends on teaching consumers things, we're not going to succeed, and that's why, again, as Camille's been saying, and as CISA has, we've got a lot of workshops coming up this week, and there's Bob Lord, who's pioneering this effort to say: how can we make it so that people don't have to make those decisions? I do also think we need to be careful, in part, as well. The UK has various parts of government which look at civil contingencies and catastrophic events, but I also think you don't necessarily want to start going out with campaigns telling everyone you need to be prepared for what happens when the internet goes down across the country, because ultimately that's the job of governments, that's the job of the companies who run them. While it is important people remain aware, and I'm sure many people in this room do think a lot about those kinds of things, you also don't want to be scaring people unnecessarily. Fundamentally the responsibility is on government and the organisations involved in that critical national infrastructure. Consumer empowerment, where it works and where we can, is great, but as you say, it's not the solution we can rely on. So, this is merely a thought experiment, just leading on 
from your question: what do we think will happen if, say, a large cloud provider suddenly went bust overnight? Do we think it would be like the banks, where you have to bail them out for the greater good of the users and the disruption that would ensue? And I know that's a controversial statement in itself, because that's a point of view, but have any of you thought about this, just personally? Yeah, I don't think I can commit the British government to bailing out cloud providers if they go bust. Yeah, just Google... Can you bail out Google? I mean, at the risk of repeating myself, I think part of that is why we are currently looking to update the Network and Information Systems Regulations to include more digital service providers and digital infrastructure providers, because people are reliant on cloud services now in a way that they were not 10 years ago, and we need to make sure that, and I believe, I would need to double-check, but I think that question of solvency is a part of that. You know, it's not just about having secure data centres and having everything in place, it's about sustainability, and from both a government perspective and a company perspective, making sure that the availability of those services is there in those kinds of circumstances. But yeah, I think it's an interesting question. I don't imagine most cloud companies are looking at going bust, it looks like they're probably going the other way, but you never know. Yeah, that's what I was going to say. In terms of prioritisation of the list of problems that we have discussed today, I don't know how that would happen with the major cloud providers, I mean, everybody stops using them and then the money runs out? We can contingency-plan for outages, but them going bust is not a thing that most of us have put at the top of the list of challenges to triage, nor do I think it's a realistic possibility any time soon, so I think we're okay. 
What was that? Next week? Okay, so you heard it here first. We've only got a few more minutes left, so has anybody got a burning question they would like to ask any of the panellists? Or just views you would like to share as well. I was going to say you can give the closing remarks. So we've kind of gone across a lot of the landscape, and I didn't get to what I wanted to talk about with standards, but has anybody got any ideas about software resilience, in terms of what you would like to see? It's too late in the day and you need some coffee, don't you? So, okay, what keeps you awake at night, Alan? So, for me, the question that I sort of worry about is: why haven't we seen the catastrophic attack? That's not really something that keeps me awake, but we spend a lot of time in security, and, you know, I use the analogy of the Drake equation a lot, which is: you've got so many humans, 30,000 people come to the desert every year and learn how to pop shells, many of them really are crazy, and many of those people have very strong opinions about very important institutions. And so I don't have a good answer for it, and that makes me sort of worry that it's just sheer luck that we haven't seen the very large event. So that's the high-level policy question that is way above my pay grade, but I like to raise it for people like Camille. I think the real challenge for us, that we've sort of covered here, is really a core problem of government, which is that you have a limited number of resources and an essentially infinite number of challenges. That's as true in cyber as it is everywhere else, and that prioritisation is hard, and there are always people who might lose out on that. One of the things that we're really hoping to do in the UK is listen to as many people as possible to make sure that we are making the right decisions and that the right people are 
being protected. But there are always things which don't get done, and we are really keen to make sure that, if we do make a wrong decision, we've at least been as informed as possible. Actually, it would be remiss of me not to note that we are probably publishing our response to our call for views in the next few months, so anyone who's interested in learning more about our next steps in the UK, or in sharing your views on it, keep an eye out for that, but also obviously happy to have a chat afterwards as well. The ability to mitigate risk across a number of different issues, all at the same time, at scale, at the pace we need them to be mitigated: that's what keeps me up at night, all night. But I do take heart in the progress we have made and the evolution of government and private sector collaboration. We've moved from that public-private partnership, which really was just throwing threat indicators across the fence, to true collaboration. I think you've heard in everyone's points how they've worked with a private sector organisation of one kind or another, an international organisation of one kind or another, to try to advance, and better understand the implications of, the things that we are trying to do. That's an important evolution. It will take time, so it doesn't really help my speed and scale piece, but I take heart in it. As one of the people who moves back and forth between government and the private sector, and who has seen how much harm can be done on both sides when they're not talking to each other, that institutionalisation of the collaboration model, and a true understanding of the challenges we seek to solve, I think is really important. Thank you. We're going to have to shut this down, because I want to ask the audience one thing. So, based on when you stepped into the room, and I know some people joined us on the way, like Peter, who just walked in one minute before the end. 
I hope you enjoyed that panel. So, based on when you walked into the room, do you feel now more confident or less confident about government's approach to software resilience? If you feel more confident, raise your hands now. That's all the government people, isn't it? That's the staff. I need to put my hand up. Okay, and who is less confident? Okay, not so many. Who's in the kind of don't-know category, the middle ground? Okay, hasn't changed your view. Hasn't changed. That sounds pessimistic. So, all that remains for me to say is thank you to the audience, thank you to Camille, thank you to Charlie, and thank you very much to David for his excellent chairing as well.