All right, we are going to get started then. Thank you for joining us. This session is called Moving Regulation Upstream: an increasing focus on the role of digital service providers. And the gist, basically, is that for those of us who have worked in or collaborated with cyber policymakers over recent years, there's been a lot of focus on what end-user organizations should and should not do. And I think more recently, two things have happened. One, there's been a rise in supply chain attacks. And two, there's been a greater level of acknowledgment of how hard it is to reach SMBs and those that operate below the security poverty line. And for those who are not familiar with that term, it's fairly self-explanatory: organizations that don't have the resources or capability to really build a mature and maturing security program. So as there's been an acknowledgment of those two things, I think policymakers around the world have started looking at the role of digital service providers. And then in the past two years — I don't know if you guys know, but a thing happened a couple of years ago. There was this pandemic thing. And so, oh, this is terrible. And so we all suddenly, really quickly, moved to remote. That happened very, very quickly. There was huge adoption of cloud; it accelerated massively. And so all of a sudden, that role of third-party digital service providers, particularly in the cloud, became even more important. So I think it's accelerated the way that some governments are thinking about this problem. And we've seen things like the Network and Information Systems (NIS) Directive in the EU. There has been an acknowledgment of the role and importance of cloud service providers to essential organizations for a while, but the EU is currently updating this and is looking at really strengthening those pieces around the expectations of digital service providers. 
The UK has also been looking at that role in its update to NIS. And so we thought that we would come together today and talk about this issue, because I think it is something that policymakers are thinking about today. So it's your ability as a DEF CON community to help influence those policy conversations. And so I have brought two tame live policymakers with me. I'm joined today by Adam Doble from the Australian government — and I'm gonna let him introduce himself properly in a second — and also Irfan Hamani. Is that right? How badly am I slaughtering your name? It's okay. From the UK government. So I'll hand it off to Adam to just quickly introduce himself and we'll move on. Thanks, Jen. Hi all. And it feels quite odd to be speaking at a conference here in the USA without a representative from the US government. We'll make do. I'm Adam Doble. I'm from the Department of Home Affairs, which is the principal agency in Australia responsible for cybersecurity policy and strategy. I'm based out of the embassy in Washington DC, so primarily engaging with US government agencies: Homeland Security, NSC, Department of Justice and the like. Hi, I'm Irfan Hamani. I'm from the UK's Department for Digital, Culture, Media and Sport. The digital bit, not the culture, media and sport bit. I'm not here to talk about the Commonwealth Games and all of those other fun things that are happening in that department at the moment. Over the last year and a half I've worked with various countries, businesses, individuals and organizations to look at what we need to do to adapt UK government policy and UK legislation to address these challenges. So I'm very, very happy to be hearing from people here. I'm exceptionally excited to be hearing criticisms of what we've all been doing and how we've all got things wrong, because we hear a lot of it. Careful what you wish for. 
Well, we get a lot of encouragement, and I think that's partly because not much is done in this space, so I think anything is good in some senses. But we're not gonna do this well unless we have real challenge, and that's why we were quite keen to come to DEF CON and talk about it, because if there's anywhere that people hate policymakers and government, it's DEF CON. I mean, I think that's more law enforcement than policymakers, but sure. And of course I realize that I've done a terrible job at introducing myself, so I'm Jen Ellis, I'm Rapid7's VP of — see, this is why I don't do it, because I don't know my title — the VP of Community and Public Affairs. So thank you for joining us. So just to kick us off: I wanna ask the guys some questions, but I do wanna keep the conversation as interactive as possible and give everyone a chance to jump in. So what I'll do, if it's okay with everyone here, is I'll ask a couple of foundational questions just to get us started with the guys, and then if people have questions or comments they wanna make and they are a follow-up, can you do like a two-finger, and I'll try and keep track of it and call on people. And then yeah, two fingers like this is also fine. Whatever works best for you. Thank you so much. And then if it's a new point, just a single finger — again, it can be this, I don't really care. Just one finger is good. Okay, awesome. So we're talking a lot here about resilience, and I think that's a word that we bandy around quite a lot. What do we actually mean when we say resilience? And one of the questions I'm always interested in is how prescriptive can or should governments be on the topic of resilience? Fight. Fight for it. I'll take the first bit and then maybe hand over from there. So I get the harder bit. Definitely, absolutely. So resilience is a really broad topic, being talked about well outside of cyber policy, particularly at the moment with the geopolitical tensions. But cyber resilience. 
So the UK's talked a lot about — and the US too, even Chris was talking about this this morning — it's talked about cyber power, which is often discussed in the context of offensive or destructive capability, but actually, certainly to me and to many in the UK government, it's about everything else we're doing around cyber. And a lot of that is cyber resilience. And I think often when we talk about cyber power we're thinking of the national security side of things, but actually there's a massive economic angle to this, an economic prosperity angle. And I think getting cyber resilience right means that all of the great things that happen around digital technologies can happen well, with risk minimized and people getting the most that they can out of them. And so for me, getting cyber resilience right actually means getting a country to take advantage of its digital capabilities and actually building it as a cyber power. And it's really impossible to do that now without thinking about digital supply chains. There are very few things that we rely on as much as digital supply chains. I think making sure that we've got the right balance of encouraging them to grow, encouraging people to take them up and use them, and making sure they are well protected is really important. So for me, when we talk about resilience, this is not just a national security issue, this is an economic prosperity issue. But the UK was, if rather thorough, also rather late in looking at the Huawei issue. And it seems hard to reconcile the idea of looking at cyber power and cyber supply chain risks without fundamentally trying to reorient how we deal with China. Immediately throwing some balls at you to get some stuff going. Meet Dave. I mean, I'm having to jump in and provide a bit of cover for the moment. Aw, look at this, it's quite nice, it's beautiful. I mean, Australia was quite an early mover on that front. 
I think because we did recognize that- Wait, was he providing cover or was he digging a hole? I'll let him collect his thoughts. On that: we were concerned about the transparency of these particular high-risk vendors, and the autonomy that they would have from any government. And I think that drove the decision back in — well, back to 2013, when we made our decision about our national broadband rollout, through to 2018, when we made the decision on 5G. And if you read the legislation that was put in place, it's agnostic as to country or supplier; it just wants our telco providers to offer assurances to government of the continuity of their services. So that has formed a larger discussion in Australia about our national resilience and what kind of approach we want to take to tech writ large, and that's underpinned by a few different things. It's security by design, it's transparency of suppliers, and it's also the autonomy piece, which goes back to the 5G decision, because we want to know that providers in Australia are not going to be subject to undue influence from outside parties. And we don't want to be prescriptive in this space; we'd rather take a principles-based approach. Do you want to jump in, or should we carry on? Go on. I feel like China will come up a few times. Yeah, and I will say, from my point of view, no government's just doing one thing. And so what we're talking about here specifically is the idea of moving regulation for cyber resilience upstream and looking at digital service providers. The matter of China also needs to be addressed, but it's not the same thing, and they will be handled separately and are being handled separately. So that's my take on it. So then, I think we got as far as why resilience matters, but not how prescriptive governments should be. 
Yeah, I think, like I said, in Australia we want to take a principles-based approach, which involves businesses taking decisions on relevant governance and technical standards in their sectoral context, reviewing their cyber posture regularly, and having an understanding of their supply chains, which is paramount in our view. And we've done some reviews of the cybersecurity framework in Australia and recognised that there's a plethora of different legislation that exists across what is a federal system. So we have both federal legislation and regulation as well as state regulation. And across the board there's about 51 different touch points for cybersecurity entities in Australia. So we don't want to make that environment even more complex. We really want to ensure that we're harmonising those cybersecurity frameworks with international partners. So for instance, we've introduced security of critical infrastructure legislation which expands out to 10 different sectors, with a broad range of requirements, one of which is to put in place risk management plans that are holistic in nature — so physical risk, supply chain risk and cyber risk. And within that, we were looking at: do we want to have really prescriptive cybersecurity requirements in each sector? And the feedback from industry was, look, can you harmonise as much as possible? So we've basically said, pick one of five international standards, and the NIST framework is one of those. If your board is happy to attest to the risk within your risk management plan, and that plan is underpinned by the NIST framework, for example, then we're happy. And we'll engage with you, we'll have a discussion to inform that risk management plan, but we're not gonna be very prescriptive about what it looks like. You sound like you wanna jump in. I absolutely want to jump in. So I think it's quite hard to be proportionate if you're being prescriptive. 
I think the way we've approached policy in the UK in this area is that you are looking at cyber risk in a balance of risks. And it's very difficult for government to say every different industry is facing the same risk and therefore must do the same things. And so the approach we've taken on this in the UK is a sector-by-sector approach. So the financial services regulator will make the decisions on what exactly it is that companies in that sector need to do, and likewise for energy, water and the others. So I'm not a big fan of the prescriptive approach; we've gone for a principles-based approach. But at the same time, you can't have 13 different sets of regulations for 13 different sectors, because actually a lot of those sectors will have similar needs and will need to respond in a similar way to a changing cyber threat landscape. And so what the NIS framework does is it gives regulators a set of powers and requires companies to adhere to a set of — I think 14 principles, that number might be wrong — under the cyber assessment framework, which is put together by the National Cyber Security Centre, and work towards achieving those. And that framework doesn't actually say you must do this; it says you must achieve this. And then it's up to the company, working with the regulator and getting advice from technical experts and the NCSC, to understand what needs to be done that's effective to get to that answer — avoiding a sort of checklist approach to saying, yeah, we've done good cyber security, tick. Okay, thank you. I think, just to start... Yeah. No, no, you're not interrupting, go for it, yeah. Vendors, or heavily depended-on providers — that's something specific that's come up. So, right, there's three big providers and one of them is completely wiped out by something. Can you, sorry, can you talk more to my question? 
So, in considering resilience, how does monoculture — or, you know, not quite full monopoly, but a limited number of large — oh my God, can you see my lips moving? I'm concerned about monoculture, still. It's maybe not as bad as it once was on the internet, but we have, right, customers and suppliers; there's aggregation at the very least. How does that play into your resilience concerns? Yeah. How about that? I'm asking if that's a piece of the resilience problem. And it can be a Chinese supplier or it can be a, you know, Western supplier or provider, right? Concentrated risk, a single point of potential disaster. Okay, so. Here, what if Apple's walled garden is completely compromised by something? Right, so I think what you're asking is, what are we building resilience for? Or against? Possibly, yes. Maybe I'm assuming a different resilience than you all are. Are you looking at the risk that one of the major providers goes down, and actually how do we deal with that? Is that it, basically? Sure, or it could be that a supplier's completely compromised and it's a major linchpin supplier. Right. Yeah. So I think it's a risk that, you know, I don't think we've quite got right yet. And I don't know how you get it right, because a lot of what you can do now with cloud services as an example, and what we saw before on telecoms as an example, is that scale provides opportunity, right? So you have these huge providers — the market for telecoms could only accommodate a few providers at that kind of scale. And actually that meant that a lot of companies dropped out as a result. And we're seeing a similar thing with cloud: scale provides competitiveness. I think making sure that those companies are — sorry, can everyone hear me? Oh, okay. Making sure that those companies — is that better? Oh, I can actually hear myself now. That's terrible. 
Okay, making sure that those companies are doing what they need to for resilience is important. And so the way we've tried to approach this with our new set of regulations is: if you are an important digital service provider, you must adhere to a certain set of principles and achieve cyber resilience. The other thing that's important is, even if you're not one of those big providers, if you are an important provider across critical national infrastructure — in discrete parts, you know, you supply a bit of the water network, a bit of the electricity network — you might not be individually important, but on aggregate you might be, and you must also then be in scope. So I think there's a scope question on how we make sure we're getting the right companies. Exactly where I wanted to go, but I also want to add just a quick clarification because — delightful, whoa, okay. Because I think that, regardless of which government we're talking about, when it comes to cybersecurity, there are going to be parts of the government that set policy, and then there are going to be parts of the government that are operational. And so they work together — in an ideal world, they work together; I think they do for these two governments. And so the policymakers are going to be looking at: what are the general rules that you set? How do you make sure that they're adapted to the right people in the right ways, and who are we trying to target? The operational folks are going to be building the redundancy and resilience plans. So we've got the policymakers, and they're proposing policy to make sure that all digital service providers that meet X requirements are covered by these baseline things that they should be doing. But then the operational folks are going to be like, okay, but if AWS goes down, what happens? Right, what does that mean for the UK, and how do we respond to it? 
And so I think what you want is for them to be working together in lockstep and thinking about those things. But I think the question that you're asking almost orients more towards that operational piece than that policymaking piece. But I do think one of the things that we should talk about is that the term digital service providers is extremely broad. So what does it actually mean? What are we trying to get at here? Is it everybody? Is it specific groups of digital service providers, specific profiles? You know, what is it? Yeah, I — sorry. Sorry. Thank you, Jen. If I could be very specific for a moment, as an example: I may have a policy that mandates you must have two independent service providers. Or maybe I have anti-trust, sort of anti-monopoly policy, possibly intentionally. And not to say that the principles aren't good, and that a certain criteria of provider must meet them — they could still completely fail. So that kind of redundancy question carries over. Yeah, thanks. I'm sorry, please. Yeah, so, and I think, just on finishing that, I think we also need to be looking at that sort of risk diagram on impact and frequency, right? How likely is it that that's gonna happen? And actually, what are we worried about? Are we worried about a company that has built its business model on being resilient, and therefore providing resilience to others, right? Or are we worried about the kind of 10-person operation that's done phenomenally well, is serving eight different parts of critical national infrastructure, and government hasn't quite picked up how critical they are, and, you know, they haven't quite got their firewalls fitted out properly. So I think we just need to worry about what risks we are trying to address. And also, what are the digital service providers? So, under UK law, there's the kind of cloud services, online marketplaces and online search engines, right? Yeah, so those are included. 
So I think, I mean, even the term cloud service provider is super big, right? Like, do we mean Netflix or do we — and if anyone's here from any of the companies I'm mentioning, I'm not trying to victimize you in any way, I just want to get at, like, every company is doing something in the cloud now, right? I mean, actually, quick show of hands: how many people here work for a company that they think is a digital service provider in some way? Really? Only that many? Right, I was going to say, how many people here work for the government? Right, right, I see you over there. So our Cyber Security Breaches Survey, which we released earlier, said 62% of companies think that they use a digital service provider, and I think that's a massive underestimate. Yeah, I think it's a massive underestimate as well. And I think this is the problem. I don't think people realize how dependent they are on... Because, I mean, that would suggest that, what, like, 38% don't have email? Well, exactly. Or a website. Yeah, okay, so... So, when we say cloud service providers, we mean infrastructure as a service, or more? Infrastructure as a service, parts of software as a service. Not yet quite managed service providers, though. Okay. Which will actually use cloud services as well, so you have a kind of compound effect of those. Well, and a lot of managed service providers are themselves small to medium businesses, which is a challenge. What about MSSPs? Not yet included. Okay. Sorry. Interesting. Is that Gabe? Gabe, were you trying to say something? We need to speak louder. Oh, we can't hear. Okay, so we're talking about the kinds of cloud service providers. Hello. I'm very scared of this microphone now. So is he. All right, so... There was a question. Oh, brilliant. Yes, please. Sorry. 
Governments — I'm seeing with DLT that governments like Australia or Singapore are already going into creating policy around crypto and stuff like that, or wanting to... I did tell you this would come up. Wanting to incorporate things, in terms of outcomes, like allowing us to control our own data — so things like self-sovereign identity or decentralized identity, whatnot. So to what degree, I guess, do these emerging technologies — if there's actual, what's the word I'm looking for? People start using them, right? They're not just ideas anymore, but they become, what's the word I'm looking for? Like they're picked up and everyone uses them. I can't, my brain is... So they become... Adopted, thank you. If they're widely adopted, then the outcome could be, as a policy measure, that we're saying companies shouldn't be managing our identities. If we do that, we get both security and privacy. We manage our own data. We don't have leaks. There's no central point of failure. All of these great things that we advocate for companies to do. I'm curious what the governments think — do they want to give up control, and let us manage our own data? And I hope we're not diverging from your topic. So yes, because I think that's a privacy rights thing, not a security rights thing, but I do think it's a great question. It's about integrity. It is, because you can't have one without the other, but I think the way the governments think about it, the regulation tends to stem from the privacy side of the house. I mean, I don't know, you guys... Cryptography. If you guys are... Zero-knowledge proofs, you know. Is this a thing you guys have been in? I think it's a perennial problem for governments that we're struggling to keep up with the pace of change here, but there are efforts underway in Australia — a review of the privacy act, for example. 
We're pulling together a national data security action plan, and we're working with the UK and the US on harmonisation there on data security issues. But yes, look, frankly, it's an area that, even for a pretty agile government like Australia — in which we have a joined-up legislature and executive branch, so we can pass legislation quite quickly — we're still struggling with. Yeah. I mean, look, we try to focus on cybersecurity, which is a really hard line to draw, because there are times when you can't separate cybersecurity from privacy, and you can't separate cybersecurity from other things. So, you know, the Chinese tech question that came up was partly a cybersecurity issue, was mostly not a cybersecurity issue, but falls into a similar kind of bracket. I think the other thing about what we try to do in government is, at the moment, we are trying to reduce cybersecurity risks in the economy at scale, and that means things like regulating, putting in policy for technologies to reduce that at scale, and then looking at how we make sure companies are doing the right thing to manage residual risk after that. When a technology hasn't been adopted in a wide-scale way — unless we think it's gonna form the basis of a future kind of civilization — you probably wouldn't be looking to do anything. You know, and even when technology is being taken up quite broadly, legislation is still a final — it's like the last thing that you go to. So, when we looked at — there are people in the room that have been involved in the Secure by Design stuff, or — there's no one I can see, but, sorry — the Secure by Design security principles, right, 13 principles, and the hope was actually that we wouldn't need to legislate for this. 
The hope was that this would be taken up by manufacturers, so that actually government doesn't have to get involved, and then when we saw that actually that risk wasn't reducing, we then went to legislation, and I think that's generally the principle. Now, cloud services, managed services — I think it would have been wrong to do this too early, because their uptake has provided incredible benefits for businesses, right? You do not have to have a financial accounts and systems department; think back 25 years ago, a medium-sized company would have two or three people managing those ledgers and that kind of thing. You can now run a cloud-based finance system, right? So you do not have to have all of this specific expertise in your company; you can run a business and focus on the bit that's the business. There are risks that we now need to look at. So I think what we saw was, I think it's something like 7% of companies understand their supply chain risk, or actually look at their supply chain risk, and that's not going up. And so that's the point at which we say: 7% of companies in the UK. Well, how could they know it better? Sorry. How could — sorry, this is really loud. How could companies know their risk better when companies like AWS don't actually say, here are the potential risks you might have? In fact, it's been like, here's a shared responsibility model: you're responsible for whatever you put in the cloud, we're just responsible for the security and the privacy of the cloud. So it's a liability model for contracting that everyone extols as being really good, when in reality it tells companies nothing about how they're supposed to protect data, or anything about their cybersecurity risks. Actually, AWS does a better job with security than privacy when it comes to educating, but I guess the big thing I wanna say is education is broken around privacy and security. 
Security is obviously the topic of the day, but so it's broken in that — if we wait 20 years before DLT is regulated, when the goal of DLT, or blockchain and hashgraph or whatever, is technological regulation without the need for government regulation, right? That's the whole point. So if we don't have government making sure it's safe now, so many people will be harmed, because they don't understand the risks, or the fact that DLT plugs into distributed apps that are riddled with insecure code because of the developers. And then basically network effects is what each company wants. That's what AWS wanted, Google Cloud wanted. Now each of these layer-one networks wants network effects. So they're throwing millions of dollars hand over fist to fund developers on their networks to build, without saying, hey, we need secure coding, we need to teach people how to do this so that it's safe. And so all I'm seeing is just constant breaches, constantly not advancing anything forward. Instead, it's making people fearful of using these technologies, whereas if it was an obligation that the government made, that said, look, here's a principle: you have to educate your networks on secure coding, or you have to give them tips and tools of what not to do. Otherwise, people lose their keys easily, or — there's a lot of danger. I mean, education awareness is incredibly important. The other one that came up was actually that people don't know who's in their supply chain — that transparency, and that's not just a service thing. I mean, that's something that we're seeing more and more of. We've had models built that are actually much more complex. It's quite similar to the software bill of materials conversation, right? You have this stack and it's great, but you don't know what's inside it. Similarly, you are buying in a service, or you're developing a service and selling it on. 
People don't necessarily know what's inside it. Which is where SBOMs come in, and that's really exciting. The software bill of materials, yeah. No one's asking that. I sense that we're gonna pivot to SBOMs soon. I was educated on SBOMs yesterday. Yeah, Gabe. What are the limits of government responsibility in this space as you see them, and can you give me two examples? So I think that was: what are the limits of government responsibility in this space, and can you give him some examples? Is that right? I think I'll go back to the kind of wicked policy question over here about the mono — what was the phrase used? Monoculture. Monoculture. So I'll go back to the example of the critical infrastructure reforms that have taken place in Australia. In 2018, when these reforms were implemented, we wanted to have a good understanding of the owners and operators of entities that existed in the water, ports, gas, and electricity sectors, so we could have an aggregated view of who was operating and what risks existed. We've subsequently decided we needed to expand that out to 10 sectors, which includes data processing and storage. So then we can map — we can have an aggregated view of where risks exist and do some modeling on that. What we're not going to do is then say to a data storage provider in Australia, you need to do X, Y, and Z because there is a supply chain issue that we've identified and you are spread across X amount of sectors. So that's where, from the Australian government perspective, the limit would go. We have — we're hoping to have — a pretty good understanding of the risks across our broad economy, but we're not going to be prescriptive about how you go about rectifying that, particularly in a market like Australia, which is quite small; we're a tech taker, not a tech maker. Sorry, who asked that question? And it was — where does the government's role begin again? 
Well, so the question was: where does government responsibility end? Is that what you were asking? Or where does it begin? Yeah. What are the limits? So. Okay, so what have they tried and what's not worked? Totally different question, very fun question. I can give half a dozen examples, but you go first. I was going to answer it slightly differently, in that this is absolutely the responsibility of organizations — organizations are responsible for their cyber resilience — and it is a big part of UK government policy to help make companies aware of what to do, what good looks like, that they are responsible for this, that there is a cost to not doing this well, and to factor that cost into how the board makes decisions about investments and priorities. I think there is a point where it crosses the line of being a business decision: when it starts affecting safety and security and national security. And in areas like critical national infrastructure there is a responsibility of government to say it's not just down to organizations to make this decision; government must make sure that a minimum is being done in these areas. And I think that's where government responsibility begins. I think it's really difficult to then say every organization in the economy must adhere to these certain things, because actually there are areas where companies are best placed to make decisions about their risk. And the example I use at the moment is that a lot of companies in the UK are more worried about inflation and what that does to their business than cyber risks, and they're quite right to do that. If we start telling a corner shop that they must protect their point-of-sale system and worry about that more — it's what Chris was saying this morning, I don't know if anyone caught his thing about putting cyber in its proper place, right? That's not saying it's not important, but it's saying it has an importance and you must judge. 
So I think most of this is: how do we get companies to do the right thing and give them all the tools so that they can do it? Yeah, I might jump in here as well. I think one thing we're learning in Australia is that it's government's role to also identify market failures as they're occurring. And in the Australian context, and this isn't unique to our economy, it's happening across the board, we've identified a couple of key market failures, and we need to understand what levers, policy levers, we can pull to address those. The first of the two key ones we've identified is a kind of negative externality, where there's no incentive to invest in cyber security, and that risk gets pushed down onto the consumer. You hear Chris Inglis talk about this all the time, that it's a whole-of-society issue and we need to push the risk upstream. That is one of the key things we've identified. The other one is what we are calling information asymmetry, where the sellers of products and services have a greater understanding of those products than the buyers do, and we're hearing from small and medium enterprises across the Australian economy that they just don't have the technical wherewithal to make informed decisions. That's where I think governments need to step in, and then you get into a really vexed point of what to do, because you basically get into a philosophical argument: do you have government intervention in the market, or do you go with, and this is borne out in our stakeholder engagements, do you let the market correct itself, because there's, I guess, a reputational and financial risk that will occur in the marketplace. So that's kind of where we're up to, and then we can probably go into what levers we could potentially pull, but I think that's something that we're learning in the long run. So, I love the fails question, but... have you got two? You've got one and a half.
All right, so why don't you go, and then we're going to come back to fails, because I think fails is a great question. You go ahead. So you've already kind of half been getting to the question that I was going to ask, but not quite. I was going to ask: government historically tends to be reluctant to take large risks by adopting technology early, but adoption of technology tends to lead policy. So whose responsibility is it to lead technology adoption? Is there any way to flip government adopting late and then writing policy for business, to government adopting early so that it can develop policy for business, if you're aware of any programs? So I think, yeah, government has a real... I mean, this is one of the things that government can do that businesses can't, right? It can make investments and get it wrong. And in the odd event that government gets technology right, it will have set a precedent. I think actually, certainly from what I've seen in the UK, we're starting to get better at not taking a blind punt on these things anymore. My reference point at the moment is something around digital identities in the UK, where we are actually looking at enabling that entire market by opening up government data sets and creating that market, but not doing it in a quick and rash way, taking time to actually understand: what are the security implications? What are the privacy implications? What are the market implications? Are we going to create a market with two or three players, and is that where we want this to go? Or do we want a market with 15 players? Are we focusing on a particular industry, or is this going to be across the economy?
And I think more and more, and I think part of this is as government has opened up its process to non-government actors and actually built into the process of policy-making making sure that businesses and civil society are involved, not just in consultations, but in the bit way before the consultation, actually formulating the ideas: what is it that we want to see at the end of this process? That's not going to be true across the board, but I think those models of asking what we want from this, and how we get the right answer before actually plunging in tens of millions or more, are quite important, and it seems to be getting better in some areas. Okay, all right. I'm going to jump back to fails then, because I think that was a question that both of you were like, oh, we've got answers on this. What have you tried that hasn't worked? The most immediate answer to that question, I guess, was not necessarily in the pure cybersecurity policy realm, and I'm conscious that I'm sitting next to the expert on this in the Australian government, so I'll have to defer to her on this. Thanks for calling me out. I'm calling you out. It was lawful access to encrypted data, and that was an area where we tried, and we're still trying. The framework that we have in place has worked well in terms of voluntary assistance, particularly from social media companies, but it's an area where I think we haven't quite got right, and we didn't necessarily get it right in the first go-round with our legislation. But I'll let Irfan jump in as well. Can I just say, though, I think you're very brave in the DEF CON room to proactively bring up encryption backdoors. I will say that we're a strong supporter of end-to-end encryption; that's never changed.
I guess the fundamental issue was about ensuring that service providers enforce their terms of service on their networks and work with law enforcement cooperatively, so we're not against encryption by any means. I just want to go on the record and say we're not against encryption. Well, it keeps coming up at every conference I go to, so I'll put that marker down. Thank you. In terms of cyber, does anyone across the table want to give an example of when the UK government has put out a policy question and got it completely wrong? Sitting side-on to this discussion... So first, let me credit the UK. I've observed the consultations process. I'm in the US, so it doesn't really affect me, but in the US we have requests for comments and various things as well, and it's great to see governments actually asking their citizens what they think. And there's been a bunch of cyber stuff out of the UK, which I've had some friends, policy friends, help me notice is happening. Yeah, the government certification and qualification of cybersecurity professionals, which, it's easy, I mean, it's a dunk, but in a way I certainly can appreciate the desire for that. We have lots of professional societies where you have a credential and you need to keep it up to date and it matters. And I would perhaps suggest that cybersecurity is just too fluid and new for that. So I appreciate the effort, but no. Yeah, yeah, and that's what, I mean, I'm glad you didn't say Brexit. Can I say that? Is that a cybersecurity topic? But you're absolutely right. I mean, we put a consultation out that said, should cyber professionals be licensed? Should they have a license to practice? It's an important question, and it's not the last time this question is gonna be asked. And in the future, the answer might be different, right?
I think there's a lot of pressure now to get the right people on board to actually understand what's going on and have some kind of way to make sure that they know what's happening. Licensing professionals was also seen as a way that maybe we could improve the quality of cyber professionals in the UK. We have two issues. One is that it's very difficult to tell what credentials a cyber professional has and how qualified they are. And the other one is we have a gap of 14,000 people, 14,000 cyber professionals a year. Yeah, so more barriers and roadblocks to them coming into the profession seem like a great idea. Well, it solves one and not the other, right? It can help solve the quality question but not the quantity question. And actually, we don't know what good looks like in that space, because we're still developing the frameworks of what a cyber pathway is for the 16 different specialisms that have been determined by the UK Cyber Security Council. If we don't know what those pathways look like yet, how do you actually license someone? So it was a question that we asked, looking at what was happening around the world in different places, you know, some countries do have license requirements for cyber professionals. The answer came back as no, and we scrapped the idea for the moment. Yeah, at the risk of sounding like I'm being horribly patriotic here, the DCMS did come back with a response that kind of said, yeah, we heard you, and we're gonna take it seriously and we're gonna not proceed on this. And similarly with Secure by Design: you originally were like, hey, how about a label? And people went, how about not? And you went, yeah, okay. And I think, you know, the thing that I would say for people in the room who are interested in public policy, how it develops and how they can impact it, is that in democratic nations they tend to run these processes, right? As they're building policy, they tend to have consultations.
The US runs them, Australia runs them. And actually, through this amazing thing that we have called the internet, anybody can reply. You don't actually have to be in the country. And the government will take it seriously. They're not gonna look at it and be like, wow, you're not here, so we don't really care. And I would just say the examples that Irfan just shared do sort of highlight why the consultation process is so important and why you should absolutely get involved with it. You know, the UK government just ran one on app store security and privacy. And I think, as with many of these things, what's typical is you see the people who potentially will get regulated respond a lot, and they will get together, and they have trade associations. And so they respond individually and as trade associations, and maybe they create some new trade associations just to really amp up the noise. And they're all like, no, regulation is a terrible idea, we don't need it, it's fine. And actually, the people who care about security are the people who come to these events. You guys are on the front lines of why security matters so much. And so your voices... and maybe you don't agree with regulation, maybe you think actually it's government overreach and it shouldn't be regulated. But the fact that you have that perspective from the front line of security is so important, and you can really help shape outcomes. The other thing I'll say, just the last thing on this while I'm on my soapbox for why you should participate: one thing about DCMS, and I don't know if this is true of every government, is they really, really like to have a survey. And it's because they like to have both a quantitative response and a qualitative response. And it can be a little annoying, let's be honest. Sorry. I'm washing you down. Right, that's fine. Trust me, I think I've put this in writing already. You don't have to stick with that.
There will be somewhere on the website that says, here's an email address where you can send us your thoughts. So you can do the survey or not do the survey, depending on what you wanna do. You can also just write a letter that says, hey, here's what I think. And it doesn't completely conform to your survey questions, but this is my view on what you're suggesting and why I think it matters. And you can email it to them, and they will read it. They will consider it and they will take it seriously. So I highly advise you to get involved if you see topics come up that you care about. Amit. Oh, can you, can we, can we? Thank you. See, teamwork makes the dream work. Oh yeah. Can you hear me? Yeah, I'll be brief. Just a quick plug and then a word of thanks. I think one of the most amazing things that we have seen, especially in the UK, Australia and other jurisdictions as well, is a summary of comments, where you will be able as a community to also see which comments came in, from what type of entities, how many from individuals, how many from trade associations, and from which industries. This is, by the way, also true for the DMCA proceedings that we talked about earlier today. So not only do you get a summary of the consultation and the government response, and these are really illuminating if you look at the connected devices consultation, for example, the comments there and how impactful they were to the shaping of the recent legislation, the PSTI. Not only are you able to access that summary of the consultation, which can give you ideas of where you wanna engage and which questions can most benefit from the tremendous input of this community, you can also read the letters and the comments themselves that were submitted. And that is really helpful to understand the shape and the type of arguments that are being made, the type of technical argumentation as well.
So I just wanted to quickly put that plug in, and I would agree. Also in the EU we see a lot of surveys that are coming in, and these are very detailed, but there is always an opportunity to engage. Often it's a think tank that creates the consultation for the European Commission, and you can provide comments directly, or comments in the shape of a letter. Thank you. Yeah, really interesting, because on that consultation we were talking about before, organizations wanted individuals to have licenses, but individuals didn't want to get licensed. So it's really important that we understand the distinction of who's responding. Go ahead, Dave. So I just wanted, I thought I'd throw another fastball, aimed basically just at the UK in this particular case. So you can just chill. We're gonna get you a new LinkedIn picture, by the way. It's too spooky right now, you need at least something deepfaked. But so, if we wanna talk about failures from the UK in the past 10 years in cyber policy... What, you're talking about Brexit, is that it? Cause I'm ready, let's go. What are you ready for? Brexit. No, I was gonna talk about export control, right? So the UK has the biggest and best penetration testing company in the world headquartered in the UK, NCC Group. Did the UK communicate with NCC Group on how export control was gonna work? No, right? Like, they didn't bother to ask Ollie Whitehouse if he wanted this to happen or if it was a good idea. And even now, it's sort of like they're ignoring his input on whether or not it's working. So I feel like... One person! How dare you ignore that one person. Right, I just feel like it's sort of, we did what we wanted to do, and now it's up to industry to deal with the issues. And we're not gonna even say if it's working or doing anything. I'm gonna push back on that. And you and I, we've talked about Wassenaar at length.
I think the UK has actually been pretty receptive to feedback, and actually was supportive of going back to Wassenaar and saying, hey, we need to look at what the impact is on researchers and look at the language. So I will push back on that a little bit. Go ahead. Sorry, I don't wanna halt the catharsis, shall we say. I think there is a really important broader point on how we do this stuff responsibly. What is it that we want to put out there in the world? What kind of practices do we want people to be undertaking, and what do we wanna proliferate, right? I think we need to be very careful about how we conduct ourselves in cyberspace. I think it doesn't go unnoticed when we do something. It goes more unnoticed than when the US does something, so the US has to be very careful in how it does things and recognize that when people see something it does, that gives other people a license to do the same thing. I think the UK has to have a set of principles by which it decides this is okay and this is not okay. And I think, and I don't know specifically, the export control question came under that. What is it okay for a cyber practitioner to do, and what is it not? I think Ollie Whitehouse is an incredible individual. He does actually talk to DCMS as well. I really like him. He inputs on pretty much everything we do, along with a number of other people. And, you know, it's a shame that this one didn't go that way for him. I think the UK government was very surprised by the reaction. And my sense from having talked to a lot of people who were involved is that there were some learnings from the process. But I also think a lot of it wasn't driven out of the traditional policy-making units. I think that was part of it. I mean, it wasn't a DCMS thing. It was a broader kind of, what do we do as the UK to make sure that the wrong things don't get out to the wrong people?
I don't think it was a problem to go down that route. So you listen to the technical people when it's convenient, is that it? No, but I don't think this was a technical question. I think this was a question on principles and on law. So for example, with the app store and app security code of practice that we put out recently, we had quite a lot of interest from a number of different companies and individuals on what those principles were. We put out six principles: these are the things that app developers and app stores should adhere to. Similar to the conversation we're having now: how do you reduce cyber risk upstream? What are the interventions we need to do up here, so the person that's downloading apps to their device doesn't have to worry about security, because they don't anyway, right? And technical respondents told us things that we weren't necessarily keen on. They told us, you know, we need to be looking at enterprise app stores; we weren't sure how important that was gonna be to start off with. They told us that we can't ask operators to say what good looks like without actually stating what good looks like. I don't think it's that we listen to technical experts when it's convenient. I think we listen to technical experts when there's a technical answer needed, but not everything in cyber has a technical answer. I think a lot of it is around governance, a lot of it is around international norms. A lot of it is around stuff that is way outside of cyber technical expertise. Yeah, maybe if we just take a step back. Okay, so I'm gonna use DCMS as an example, but I think this is actually true of a lot of governments that I've seen.
So DCMS has expert advisory groups, right, which are people from various different communities who are asked to be part of an ongoing advisory group, and they will basically bring together a mixture of technical, business, strategic, and international diplomacy expertise, as you said. So they're getting all the different points of view and expertise, and they meet regularly and go, here's a bunch of stuff we're thinking about, what do you think? And they get that feedback, and then they process it, and they'll then have follow-ups with little working groups or sub-conversations, and that's an ongoing effort. Then when they get a topic that they're actually pursuing further, that's when you get the open consultations, and anyone can respond to them, that's why they're called open. And then as they're going through the open consultation process, they'll also proactively try to seek responses from specific communities. So for example, a number of times they've come to me and said, hey, you know a bunch of people who work in security, can you help us talk to them? And so I'll try and plug them into various people who I think are relevant in different ways, like Ollie. And, I mean, I know you just picked Ollie because he's, you know, he's Ollie, but he's very engaged with lots of these groups. And then on top of that, they talk to existing associations and groups that have the ear of technical people. So I actually do think, and I don't think that's just unique to the UK, I think the US does the same thing, I think Australia does the same thing, that there is a huge effort from governments to talk to technical people. I think the problem is that, as with anything, and I am laughing at myself saying this because we're in this room, it's an echo chamber. And so your trouble is you only often speak to the people you already know. You run an open consultation and you hope you'll get to meet new people. You come to an event like this.
I mean, these guys flew here to come talk to you because they wanted to meet people who are in the security community and build those relationships and those networks, but you can't force people to participate in the process, and everyone's busy. And so, you know, if you have an app security consultation... I mean, the guy Ed, who has been running the app security consultation, has done everything he can think of to hear from the technical people in the security community, to the extent where, I introduced him to some people at OWASP, and he went along to an OWASP event with no knowledge of what was gonna happen, or if he was gonna speak, or what to expect, because he was like, I just wanna hear from security people who care about app security. And so I think the desire is really there all the time to keep it going. But one, yes, it has to be tailored to the specific topic. And two, it has to be met on the other side by willingness from the community. And the problem we all have all the time is that we're drowning in noise. Like, how do you guys keep up with what governments are working on? Because I have a hard time with it, and it's my job. And so for people for whom it's not their day job, I think it's really hard to do. And we actually talked about this. I said, DCMS covers digital, culture, media and sport. Is there a way to do an RSS feed on the website where you just say, I'm only interested in cybersecurity? I don't wanna hear about all the press releases for sport. And you were like, that's a good idea. Right? So I think there's a noise problem, a signal-to-noise problem, is part of it. The other thing I'll say, and this is UK specific, although again in Australia you have the ACSC: in the UK, DCMS works very closely with the NCSC. And I will fight anyone who says that they don't have good technical experts. Because they do, you know, it's their job.
So I think you do get quite a lot of technical input. It's just that sometimes things don't go exactly how you thought they would. And Wassenaar, I think we can all agree, was definitely an example of that. Yeah, go ahead, Nicole. How does technology companies' lobbying play into this? Because I look at someone like... I look at myself, and I work for myself now, but I don't have time to respond to all the potential requests for information, and no one's paying me to do so. But if I worked... I've done public policy at Visa and other places, not government relations per se, but they have huge teams of people that are responding. And so, you know, how do you get to the people who are paid to do so by a company? So, this echo chamber that we were talking about, right? Generally, when we put out consultations, the response, like I said at the beginning of the session, is: yes, this is the right thing to do, go ahead. And a lot of what we get is from the cybersecurity industry. And that's great. It's really, really important that the cybersecurity industry is on board with what we're doing. The other issue is that, way before stuff goes public, we have been talking to tech groups, and particularly big tech. On something like app stores, we absolutely have to speak to the people that are running app stores. Otherwise, this code of practice is dead in the water before it even comes out, right? I think we have to be very careful about the technical advice that we get from these companies. And there's not a particular company I'm thinking about, honestly, there are several, because this is across the board. Security has become one of these things that tech companies use as a differentiator in the market. And when I talked earlier about how we have to make sure that cyber risks are placed alongside other risks...
What we see now, and the US is doing a lot about this and the UK is trying to do stuff about this, is we definitely need, in certain areas, more competition in the tech sector. We need to do a lot more around making sure consumers are protected in the tech sector. Those are economic harms that sit alongside cybersecurity. Now, what we're finding is, because all tech policy is relatively new compared to things like health and safety policy, on the maturity curve we are really right at the beginning. And cybersecurity can sometimes be a little bit ahead of other policy issues in the tech space, particularly around competition, for example. And so what we find is that companies will be jumping on cybersecurity policy questions as a way to say, yes, you should do that. And actually, we need to be careful that it doesn't entrench companies in a way that predetermines future digital policy questions. And so I think we need to be very careful. We need to listen to the technical side of things, but we need to understand that the answer that comes out isn't necessarily just going to solve that technical problem, because that technical issue sits in a much bigger landscape of tech policy, and we can't do one bit without the other. But we might have been guilty of doing that a little bit in the past. Right, right. So this isn't a question, just to round that up. Exactly: points about liability, for instance, or risk, who bears the brunt, or insurance, all of those things have nothing to do with cybersecurity, right? But there are still lobbyists who will be pushing for what's gonna be the best outcome for the company, as opposed to human beings and users, whatever you wanna call them. Yeah, I'll jump in on that, because I think I sit in a weird space: I work for a security vendor and I talk to governments a lot, and so, the way that most people understand the term lobbyist, I could potentially satisfy that term, right? And...
It's a neutral term, right? It's just how you use it. And actually, I'm British, so it's not as loaded for me as it is for other people. Plus, I'm talking about cybersecurity, not guns. So, which is good. So, one of the things that I observe often is that a lot of the conversations I'm in are staffed by policy people, people whose job it is to work on government relations and public policy. And the job of a really good policy person is to be a translator: to go and talk to the experts in their organization, build a position based on that expertise, and then translate that to the governments and back again, right? And keep going back and forth. But there is a time and a place where the governments need to talk to the technical people themselves, exactly as Dave said. And so, one of the things that I've always worked on... I mean, a story that I was telling Adam the other night, and I think I've probably told you before, like a broken record: when I started doing policy, I started purely because I got super indignant about the Computer Fraud and Abuse Act, and an angry Jen is a bad Jen, and I went off to DC to, like, fix it. And here we are, it's still not fixed. But the DMCA has improved. When I went, I met with staffers on the Hill, and this is going back nine years, and they said, oh, you're the first person we've ever spoken to from the security community. And I went, oh. Because they were actively legislating on cybersecurity, and so I was like, how is that possible? And they said, oh, we talk to lobbyists from the defense contractors and the big banks and the big tech companies, and we hear from the Chamber of Commerce, who don't want anyone to be regulated. We never actually talk to people who work in security. And I was horrified.
And so I made it a mission after that to really try and figure out a way to plug people who are working in security into these conversations, which is why I do things like force these nice people who work for governments to come to events like this. And I think those touch points are super, super, super important. I think we have to create the opportunity for that as much as we can. And I also think it works in the other direction, right? Like it's one thing to sort of capture policy makers and then cage them and bring them here. But we also have to look at what the events are that other people have that have nothing to do with cybersecurity and think about how do we plug that cybersecurity reference and knowledge into those events? Like how do we go to the forum, the fora, that they're already participating in that we don't know about because all we think about is cybersecurity all day long? And I actually think, like, this is one of the biggest challenges I think cybersecurity faces, whether you're talking about it from policy point of view or anything to do with adoption, is that we live in an echo chamber and we do a shitty job of breaking out of it and we spent 30 years basically saying to people, well, it's really technical, you wouldn't understand. And now we're like, oh, but you should understand. So actually, I think that's, to an extent, part of why we need to work with governments because we have successfully got to a point where there's such a level of apathy around cybersecurity that unless governments intercede in some way, we are not gonna see change as quickly as we need to. And so it sort of falls to us, it behooves us to actually engage governments and have these conversations with them as much as we can. And so, yeah, again, I will be on my soapbox and say to you, please participate, because you're obviously interested, because you've sat here for an hour very patiently. Amit, yeah. Can you... 
Just a really quick follow-up. We see with connected devices, but also with the U.S. executive order, where most of the deliverables show it, that the nature of cybersecurity regulation is becoming more prescriptive and more reliant on technical standards, especially as we go into more granular regulation that also needs to have a sectoral lens. And you talked about the critical infrastructure sectors and their different needs. I think that just merits a lot of deep technical expertise informing consultations on the actual technical requirements and the standards in place. That presents a tremendous opportunity for more technical experts to engage, but it also underlines the importance of global engagement, because, again, going to the point about standards, a lot of the connectivity and the interoperability is coming from things like ISO/IEC standards that are common to many of the approaches underlying different regulations, whether it's vulnerability disclosure or connected devices. So I'm just joining Jen's call to action. Oh, yeah. Getting more technical expertise in. Yeah, I mean, I think it's an issue in Australia. We're struggling to get the data points in place. So we've been running a process for almost two years now, a best practice cyber regulation task force. And frankly, if you look at the raw numbers, we had 770 stakeholder consultations and 140 submissions. And I guess one of the key things we wanted to get out of it was that consumer viewpoint, particularly on issues like smart devices and the like. And I will put a plug in for the task force report, which is on incentives and possible regulations, and should be coming out soon. And then we'll be engaging, again, through consultative processes, on the action plans for those. So keep an ear out for anything that comes out of Australia.
Because we're just cognizant that, like I said before, Australia is a tech taker, and we really need external views, particularly from the US, to shape our policy frameworks. So, I mean, we have 50 minutes left and we are very happy to talk about aspects of digital service providers, but it seems as though questions in the room are drying up on that. I might just go back to — I saw a couple of nods of the head when I was talking about some of the market failures in incentivizing cybersecurity. We're thinking about health checks for small and medium enterprises, trust markers and the like, so I'd be keen to just get unadulterated views on those, because I think we've got some hands. Yeah, can you say a little bit more about it, and then we can see if people have feedback for you? Sure. So one of the information asymmetries that exists, like I said before, is that small and medium enterprises are coming to us and saying, can you put a trust marker in place for MSPs, for example, because we're unsure of who we should go with and where the risks lie. So that's one example, and the other is just on governance standards that can apply across the Australian economy. Do we want to be prescriptive with those, or do we want to have a standardized approach across the board? And bear in mind — because I know probably a lot of people in the room are based in the US — something that policymakers do strive for is consistency internationally. Not to the detriment of their own jurisdiction's priorities and needs, but certainly attempting to create some alignment internationally, so that legislation isn't creating an overburden of complexity for those who are covered by it.
So, you know, while Adam's asking about stuff that's US specific — sorry, Australia specific — it could well have an impact for the US in the long run. So it is definitely worth paying attention to. Hi, I'm Valerie, just to circle back to our talk about bringing more technical expertise into the government consultation process. Your story really rang true to me. I run a national security cyber program at a think tank now, but until last year, and for about a decade before that, I was a Hill staffer, and I'm deeply, personally, painfully aware of how little they know. And it's better than it was nine years ago, I would say. Absolutely. And it was not a criticism of the Hill staffers, right? They're resource constrained beyond belief. Yeah, yeah. So one of my goals coming here for the first time was to try to figure out who those people in this community are that would be best suited to put in front of policymakers directly — not just the written back and forth. Absolutely. And my current theory is that it's not sufficient just to have the technical expertise; they also need to be fluent enough in the policy world and really be self-advocates for it. Like, they care. They want this outcome for the public good, right? So if you have any thoughts on who those people might be. Yeah, I mean, we can definitely talk about it afterwards. I think, to your point, that's something that's important to the engagement: you have to recognize that when you come to the table as a security expert, your expertise is security. The person on the other side of the table, their expertise is policy.
And so you've got to come to the table, yes, with your expertise and your opinions, ready to go, ready to talk about it, but you've also got to respect the expertise of the person across the table, who knows a lot more than you do about how policy works and how the lawmaking process works, and actually be willing to collaborate and work with them — not just go, well, this is my opinion, and if you don't take it then you suck. It has to be an iterative, collaborative process, for sure. Yeah, I was gonna say, they need to have the patience to try to explain the same thing over and over again, and fail over and over again. Right. Also, just following up on your point about being a translator — that's how I describe my job all the time, right? I speak Congress. Like, I'm not a tech expert, but I think I understand enough to explain enough to members of Congress. Yeah. Two very quick things, I know we're running short on time. One comment: I think it's important for us to remember as a community that tech expertise is not a monolithic thing, right? Not everybody in the cybersecurity community agrees on what's best, and I think it's important for policymakers to understand that as well — just because somebody who has a great reputation is telling them something, that doesn't mean other people agree. And so that gets to the point: we need more community involvement. It's not just about lobbyists, it's about who has a strong voice and is out there, right? So to your point, Adam — you were talking about MSPs, I think, is where you were leaning, right? I think more attention needs to be paid to MSPs, 100%. I see it all the time in my risk management job. For small companies, finding the right MSP — finding an MSP that understands cybersecurity — makes a huge difference. I think what we need to do is target more of our guidance to MSPs. We've got tons of guidance out there around how organizations should secure themselves.
I do not think we have enough guidance educating MSPs on how to understand cyber risk for their clients — what the things are that they need to be doing, and how, for their clients — which is a nuanced difference from how you go about doing it yourself. So I 100% think that more attention needs to be paid there. There's a lot of discussion about that here in the US, trying to figure out what that looks like, not just from a policy perspective but at the community level, right? At the risk manager level. I think you get a pretty good indication of the appetite in the government sphere based on the output in May of this year by the Five Eyes operational agencies on security guidance for MSPs. So I'd encourage you to have a look at that. It's difficult enough to get the five to publish something, and something as vexed as the MSP issue was pretty significant, I think. And that was — I'm cognizant I'm on camera — but it was about giving signals to customers that they should be demanding more from MSPs, frankly. So I'd encourage you to have a look at that as well. This microphone isn't great. There you go. My question is about a shift towards micro-businesses. People who are in the service industry especially are very reliant on tech platforms for their following and marketing, and they get a lot of business that way. And it seems like we kind of push all of the responsibility for security onto the platforms they're using, who might not treat them as a business — they treat them as an individual. And I was just wondering if that market, and that kind of shift in what "business" means, is something you guys are discussing at all. So I think it falls into our kind of broader question of how you get smaller companies to take this stuff seriously, right? And so we're trying to... I mean, there's the MSP guidance that was mentioned, and there's the kind of MSP best practice guidance for the customer.
If you are a two- or three-person operation, you're filing your taxes, you're doing your accounts, you're paying your payroll, you're opening and closing, trying to find customers — throwing in cyber stuff doesn't land. And so that's why we need to — oh, good title — move this upstream, because those people are not gonna... We've put out all the guidance in the world. We actually even have a certification program for small businesses called Cyber Essentials. People won't do it. The uptake is small. It needs to be much bigger. But it's really difficult to get small businesses to do this, and quite rightly — they're time limited. So the more that we can get those... But then when you're a slightly bigger business, your risk profile is different, and it's not just about what those, you know, 10 companies that are providing you with services are. It's the 50 companies, and how you manage that internally, and dealing with your kind of internal environment if you have one. And actually, I think at that point there's probably more of an appetite to start understanding this, and that's where the question of how you drive good behaviors and good guidance comes into place. Thank you, just on the back of that. I'm a researcher in Scotland, and I did some work for the Scottish government on the back of the Cyber Essentials campaign, trying to get small and medium businesses — but more particularly small charities — to take up Cyber Essentials. Even with an incentive grant of £1,000 to help upskill them, the charities still didn't have the capacity to submit the grant applications to actually get the money to do it. And I wondered how we fit the voices of those types of organisations — where ultimately their primary agenda is providing services to their service users as a charity — and fit their everyday routine and the contours of their everyday running into this conversation as well.
Yeah, I think that's something we're also considering in Australia. And I mean, you're right on when it comes to the contours of their daily operations — we want to make it as simple as possible for them. So is it through a vector for information coming through their bank, their financial provider, or their insurer? There's a cacophony of information out there that they can access via government websites, but they're not going hunting for that information. We want to streamline that process and push information out through the trusted networks that they already have in the day-to-day operation of their business. That's something we're considering. I mean, the thing that people sometimes forget is that charities sometimes — often — deliver critical services: care services, end-of-life services, getting people food. But they don't fall under any critical infrastructure laws. And it's a gap, but not one you can fix by sticking charities into critical national infrastructure, because the things that they would have to do would mean they would shut down. And so it's about putting cyber in its place, right? Which means not getting rid of it, but making sure that it sits in the right part of the risk register. It's gonna be a challenge. It's gonna continue to be a challenge. Quick thing on the MSPs that I forgot to mention, that I think is an interesting idea. I found one for a client, a small company that I work with. This MSP will not take customers that will not implement multi-factor authentication, that will not implement certain patch practices. They'll just say, no, you cannot be a customer if you don't do these things. And so while it's important that we educate SMBs on the questions to ask their MSPs, I think it's really important to get the MSPs to start saying, no, you have to do these things, or you can't be a customer of ours.
Now, that seems like tough love in a way. But a lot of MSPs are worried about customers being the weak point for them, rather than customers worrying that the MSPs are the weak spot for them, right? It's a customer with poor cybersecurity actually leading to a breach for other customers. Hugely important. So this may not apply to the MSP piece of the puzzle, but — right, small and medium enterprise, micro, the half-a-part-time-person business. I think the charity angle is something I hadn't considered, right? There could be a large one — like, the Red Cross is a medium-to-large organization. I feel pretty strongly that the only option for these classes is that the providers and vendors have to do pretty much all of the security for them. It's great to give them guidance, and it's great to give them grants, but at scale that's just never gonna work. And maybe that's different for a more mature organization — obviously as you get a bigger company, they can afford their own internal IT and security and they can select MSPs carefully, so that's a little bit of a different beast. And it's the same for consumers, right? I cannot — where's my phone, anyway — like, I can't be a complete Android security expert and analyze every app that I get. We need the app store, my service provider, and the OS provider to just give me secure stuff: push patches, secure defaults. So I think that's the only at-scale hope, which now gets into the talk, right? And this becomes sort of a responsibility and liability-ish question. So I think that's where the at-scale answer really is, for anyone who doesn't have their own well-funded internal capability. I mean, I can't remember who it was, if it's Microsoft or Google, but one of them is enforcing two-factor by default now as part of their cloud service. I can't remember which one, but they've said that they're gonna do it.
I'd be really interested to understand what impact that has on global cybersecurity, because it will have a big one, I am sure. And it's — what are those other interventions that we can find that will do those things? Okay, let me take this one first and then come back to you. So I work for the US government, and one of the policies that is rolling out right now that's affecting a lot of different agencies, or at least affecting different agencies in different ways, is called Section 889 — of the NDAA, I believe. It basically prevents federal contractors from obtaining technology services or products from anybody that is affiliated with Huawei, or any of their affiliates. And what does that list look like? Who knows — it's hundreds and hundreds and hundreds of companies, and if you're a federal contractor, you have to figure out if your supply chain includes any of this. So anyway, it's just interesting to think about how that level of sophistication is pushed onto companies, and at a certain point, from a regulatory position, what's your thought in terms of situations where, yes, within the UK or within Australia you have full control of the situation to some extent — but if there's a company that needs to run a separate network or system, or purchase from a different vendor overseas for their overseas operations, how does that impact them as well? So I don't know, it's kind of a big question. Look, huge question, and I think it goes back to having an appreciation of what the risks are in terms of any particular sector. Like I said before, we're not gonna be prescribing a certain technical solution in any sector, because we don't necessarily have the expertise to do so. So yeah — a vexed, wicked policy question. Yeah, I mean, I think we need to understand what risks we're calculating for, right? Are they cybersecurity risks? Are we worried about privacy?
Are we worried about human rights? I think, yeah, it's a big one, and I don't know. I don't think we have something similar in that sense. I think in the UK we are at the stage of defining what high risk is and why it's high risk, and that will then define who should be using them and who should not be, right? The UK, I think, didn't say that Huawei should not be used at all. It said there are instances when it should and shouldn't be used, and it should make up a certain amount, and it was trying to correct a market failure issue in that sense as well. And so I think there were a number of different risks. I don't think it was as straightforward as "that's a company to not use." There will be instances where the UK government might say that's a company to not use, and then it will depend on who you are, and I think there are some absolutes, right? So even with the MSP question, for example: when we're talking about small MSPs, we're still trying to figure out the answer. If you are a small MSP but you supply a critical amount of critical national infrastructure, do we still make the same exemptions for you, so that you don't have to undergo certain requirements? And I think the answer is no, right? If you are supplying into critical national infrastructure and you're a significant part of that — if you're a small or medium enterprise and you're doing that, I mean, you've done really well, but you have obligations for national security reasons. The United States is about to get its first cyber ambassador. What should be number one on his agenda? That's the first time we've had a rush to the mic. I mean, I think the new ambassador has pretty much outlined his forward agenda in the CFR paper. I don't know if you've had a look at that. I thought it was a pretty balanced piece of work, and I didn't catch his testimony — has he testified yet before the... yeah, okay.
I didn't catch it, so I'm sure he outlined it there, but we're certainly very excited for the position to be in place, and see it as the kind of final piece in the cyber leadership for the U.S., which I think has been operating quite well, and it'll be interesting to see how he slots into the leadership structure that's already in place. So, I don't work for our foreign affairs department, so I'm a bit of a bull in a china shop on this one — not always so diplomatic. So I'll say what they can't: Russia, China. There you go. So I'd say something really different to that. I think that the U.S., the U.K., Australia — countries that can do this well — need to promote how countries do cyber resilience well. So much of the global conversation on cyber is around this kind of cyber warfare, or around norms and laws, and those quite big questions are quite rightly at the front of the agenda, but I don't think we've hit the right answer to them yet. I think a big part of the answer, which we don't take advantage of, is this: doing cyber resilience well is expensive for a country, because it's a cost for people and it's a cost for companies. And a lot of countries actually take this down a slightly different route and say government will do more so companies have to spend less on it. I think that is a false economy, because actually it's not cheaper for governments to do it — companies are still unprotected — and then what you have is a change in the dynamic of who is policing the internet in that country. And so the more we can put cyber resilience at the front of the agenda, as the answer to how cyber should be governed in the world, the better it will be — both for those countries to remain resilient and for the global conversation on how cyber is governed. So if the cyber ambassador is listening: this is not coming from the UK government, this is coming from... I'll just add one more point to that.
If you look at the way that certain state-owned enterprises are operating in the Indo-Pacific — Southeast Asia in particular — they're investing a lot of money into training and workforce development, and essentially undergirding government systems with their enterprises, their solutions I should say. And we're very keen for countries like the UK, the US, and Australia to lean in, in that respect, as well, because particularly in Southeast Asia we're seeing certain entities do that quite well, and there's been a bit of a gap in terms of our diplomacy, capacity building, and partnership in that region. So that's one other thing I would add. Do you want to move in towards the microphone? Sorry, Emily. To try to add some dimension to this — something we spent a lot of time on in the last two days at BSides Las Vegas, and a lot of my time at CISA: whether it's the carriers' opportunity to do things at scale — for the things that can be handled at the network, that we handle them at the network — and advice for countries who don't have enough infrastructure or money to do the cybers, or cyber resilience, well. I think it's not always about adding cybersecurity. I think it's about removing dependence, or proportional dependence, and I talk a lot about proportionality. There's a great cost to connectivity, and most of these critical infrastructure providers who have had cyber attacks in the last two years are the target rich, cyber poor. So we can try to give them money and training and staff that we don't have over the next several years, and/or we can right-size how connected we are, proportional to their ability to operate that responsibly. So in the carrier discussion, I think instead of maybe doing this everywhere for everyone — the MS-ISAC in the US, during the elections, offered a free malicious domain blocking service for election authorities. They also did protective DNS services.
So these are the kinds of things I think could be useful for hospitals, water treatment facilities, the cyber poor, and they would scale quite nicely. Other ones will get quite messy, quite fast. Similarly, one of the first things I said that very first time I went into Fort Meade — one of them knew I was coming from the hacker community and asked, what do the hackers think of our capabilities? And my answer was: I assume we're really, really good at offense. We're really, really bad at defense. We probably take comfort in the assumption that the same is true for adversaries, but we forget that as the most connected nation we have much, much, much more to lose. And that's my big fear here: it's not so much that we should be helping people defend smart connected water treatment. It's that maybe we should have less smart connected water until a country or a region is capable of operating it in a responsible fashion. So you can add more cybers, yes. Add more defense and resiliency, yes. Add more SBOMs and transparency, yes. But in the meantime, the fastest path to resilience or safety is to disconnect. And yeah — so, I mean, the extreme of that is we go back to not being connected, right? That's the extreme — and I know that's not what you're suggesting at all. But — and this is where the... you missed my speech on proportionality as well. And it's really important, therefore, that those decisions get made on a sector-by-sector basis. I think there are certain sectors — I'm thinking civil nuclear, for example — whose entire operating model is based on being able to shut down safely, whatever the scenario. I think what we don't want to do is start putting cyber as a — I mean, I'm talking slightly out of my depth here, and on some quite critical issues. But actually, I think there are things already in place to prevent catastrophic incidents happening, whether it's a cyber incident or not.
And I think there are things that are changing that calculus by being slightly too connected, which is what you're saying. And actually, it is then up to the industry, and the government behind it, to say there are certain things that you need to do to prevent this, right? So, as an example: absolutely everything that is connected to an operational environment should be air-gapped. Maybe that's the thing that we go with. But I think there are also industries where that's completely inappropriate, and there's that balance then between security and innovation. We don't know what smart water or whatever might do, but actually the benefits might far outweigh the risks. I mean, most of us know air-gaps almost never are, even if you think they are. But more deliberately: if you go to the World Economic Forum, or you go to a lot of the digital transformation summits, they're very confidently adding 12, 16, 20 deliberate perforations to that air-gap for predictive maintenance and efficiency and data science. And they're doing it because none of us told them the cost of doing that. Doesn't mean they wouldn't still do it, but they have not baked in the true cost of that project. So not only have we had exposures beyond what we thought — we're aggressively and deliberately adding more exposure. Okay, so we have 25 minutes left of our allotted time. I guess my question to the room is: do people want to spend more time on digital service providers, or — we have already kind of been doing this, but — do you want to take the last 25 minutes as basically an AMA for the Australian and UK gubbies? I'm losing my voice now, unfortunately. I mean, we've already gone off topic pretty badly, but — so, you're sort of just volunteering us for this? But if people have other questions that are relevant to other areas of policy and you want to ask these guys about it, we do have 25 minutes we can use for that. I thought a hand might go up pretty quick then.
All right, so Rebecca, maybe you could just move to that microphone. Thanks, sorry. And I apologize if you talked about this before I got here, but I still want to hear the answer. What are your governments thinking about when it comes to spyware and surveillance software? That's not come up. They're all good. Are you firmly in favour? Again, another vexed issue, I guess, and we're in a position in Australia where there's not necessarily this huge uptake — in terms of, well, entities that are in Australia and their operations as well — but we're not naive in that respect. We certainly have to monitor that space, and we watch what has happened with NSO, for example, with great interest. And we're certainly interested in the way that the possible acquisition by an American entity was shut down, the way in which that was done by the administration, and the clear signal that sent. So I thought that was somewhat surprising, but I'm keen to hear my friend's views. Yeah, I think the UK tries to take a leadership role in global cyber governance issues, and it's difficult to do that if we don't have an active voice on the proliferation of cyber tools. We know that these things can go really, really wrong. We know that they are a risk to human rights in many ways. We've seen it — we've actually seen them be used. I don't think it's limited to things like spyware, though. I think we need to be very careful about how commercial cyber tools are managed. In fact, this is a question that came up slightly earlier on export controls: how do we manage this stuff? So, you know, I think we need to be very careful about being active and vocal on this, because otherwise we really lose our ability to stand up in front of a crowd and say, this is wrong. And not just that — these are the kinds of tools that are used against... actually, I think spyware tools have been used against UK government personnel, certainly against many governments' personnel, and I think the UK is one of them.
And I think the UK government has been quite vocal about tools used, whether in the UK or elsewhere, and that they shouldn't be commercially available. Yeah, and I think the other point to add there is that we continue, as democratic nations, to articulate the limits of what we're prepared to do in cyberspace. The operations that are undertaken by our agencies are obviously very sensitive, but we need to keep articulating where the lines are and what responsible behavior in cyberspace looks like. So we're obviously very mindful of that. And frankly, of the need to engage through the UN process on these issues as well. Any other questions or comments, feedback? Can I go back to regulation for a second? Yeah, please. Hopefully it's not bad form now that we've kind of moved away from it. That's great, that's what we're here for. Good. So, I heard you say something earlier and I wanna make sure I heard it correctly, and maybe get some follow-up thoughts on it. There clearly is a balance between regulation and sort of market forces, right? And sort of letting those things figure themselves out. I think you started to describe, at one point, that there are a lot of good things happening with digital service providers, where doing good security and doing good privacy has become a market differentiator. I'm wondering if we have a sense of where the threshold is where we say, you know what, they're actually doing pretty well, and doing it pretty consistently. So let's just take a step back from regulation — maybe we don't need to do it, because they actually seem to be doing what we want them to do. Any further thoughts on that? And that's open to everybody. So — we love a survey in DCMS, as Jen said, right? So we have a number of different things that we keep an eye on. And one that we'll be keeping an eye on is how actively organizations monitor, or have a grasp on, their digital supply chain.
I think we will be looking at how legislation — I mean, the legislation hasn't come into place, it hasn't even gone to parliament yet; it's just proposals that we've consulted on — we will be looking to see how that changes over the years. You know, if that 7% goes to 97%, we will pull back, because it clearly wouldn't be appropriate for us to legislate if all companies are aware of their cybersecurity and — sorry, the next bit is — they're doing it well, it's working, and supply chain attacks are reducing. It's really difficult to get that nuance of information, right? And then the other thing, which we don't have but should find a way to get, is the relative costs. What is the relative cost of poor cybersecurity in digital service providers, versus the positives of allowing those digital service providers to continue without regulation and grow in the ways that they have been? You know, we don't have one number versus another number, because we're quite immature in our policy development for cybersecurity. But as that technology policy space develops, those kinds of valuations should be done and will be done, and it won't just be around cybersecurity — it will be around the costs of data governance, for example, the costs of competition policy, and those kinds of things. So I don't know that we have the answer to all of those yet, but some of it will be around breaches, some of it will be around user awareness, and, yeah.
No, I would just add that through the work we're doing with the best practice cyber regulation task force in Australia, I think we've got pretty clear feedback that there were areas — particularly on smart devices — where the consensus view was that we need to move on regulation, in line with what the UK is seeking to do. And frankly, as we've discussed, we're very careful in the process of consultation, identifying issues and then moving through to legislation. That is probably one of the key areas where I think we're gonna potentially move. The rest of it, I think, we're still in a process where we need to go back out and consult pretty thoroughly. So we have mandatory incident reporting for critical national infrastructure. As does Australia, so. It's difficult to get at the data, because it sits with the regulators, and that obligation — if I'm right, which is about 50% of the time — doesn't currently sit with digital service providers. They are regulated in a slightly different way. So I think it's a really interesting tool, but as with a lot of policy, the devil's in the detail. Yeah, and I think one of the interesting facets of the cyber incident reporting regime that was just legislated in this country was the transparency that is included, which I think most people didn't pick up on initially. But in talking to the staff of the committee who pulled it together, it was quite deliberate in ensuring that there was transparency of the information, to get an aggregated view of what incidents look like and whether there's a maturation process there.
So I'd be very interested to see how that's digested by other countries, and whether we look at something similar, because it's certainly not something we'd necessarily thought of in the Australian context. But talking to the folks on the Hill who drafted it, they wanted that consistent feedback from CISA on what they're seeing, on the trajectory of things and whether or not it's working, and, like I said before, to get an aggregated view of what's happening. So, yeah. If I can build off that a little bit. I said this to some of you yesterday: the voluntary-only NIST Cybersecurity Framework was in some ways a stall tactic to not regulate, right? We didn't want a heavy hand of government before we knew more. And if you were involved in those conversations, one of the promises was that, with this framework stimulating conversation, we could also look at which types of controls are and are not adopted, and what the correlation is between those sets of controls and breaches, and things like that. But I had to testify in May about whether we should make a NIST Cybersecurity Framework overlay for healthcare, and I said, you don't even know who's using it yet, because there was an Office of Inspector General report finding that, I think, a single-digit percentage of the healthcare sector was even attempting to use it, ten years later. So, the new mandatory breach reporting and transparency: great. But there are three and a half years of rulemaking before we'll even see it start to kick in. And I continue to argue that the adversaries have set the pace and the tempo, and our very good-faith desire to move slowly and judiciously, to not be rash, to not over-regulate, has cost us. I would argue we have under-regulated, and even the voluntary approach hasn't delivered.
So I've been advocating for tying any sort of safe harbor, any sort of forgiveness, to your ability to produce attestations of your posture against something like the NIST Cybersecurity Framework, or to show that you weren't compromised through something I would consider negligent, like the CISA.gov bad practices. There has to be a middle ground between "they're doing well enough, so let's not regulate them" and "let's do voluntary for 10 to 15 years." If we really want to be a learning organization, or a learning country, or a learning sector, we should set some baseline performance objectives and measurements and then refine them over time. I love that, because as part of the new national cyber strategy there is a performance framework that sits alongside it: a baseline of what is currently happening and what we would like to see after one, two, three and five years, which we will then need to measure. It's things like: are critical national infrastructure companies complying with the Cyber Assessment Framework, which is mandated for them? What is the uptake, and is the rate of uptake going up or down? So there's a question under the strategy of whether we are achieving what we set out to achieve, which is improving all of these practices, and whether all of that stuff is still right. Have we missed something? Do we need to move in a different direction? In terms of under-regulation, the only thing I'd say is that in the UK those regulations for most of critical national infrastructure came in in 2018, so companies have had three or four years now to come to terms with them. We still think that's early. We still think four years is just the beginning, and companies are only starting to get through their first cycle of board reporting on what's been happening.
The next five years will be very interesting, and after this UK strategy, which was just published in December, the next one will very much depend on how successful our initiatives have been. Do we go mandatory on a lot of things, or has this approach actually worked? Only about 60% of it. Have you watched The Undeclared War in the UK? I don't think they've had it here yet. But I mean, how are we going to get there? Thresholds, we need thresholds. At what point does it get bad enough that we might say it's pretty bad? A quick announcement: if you would like, after the session's over, we have tickets and you can go next door and discuss some more of this over a beer. What? Drinks, sorry. Well, so I think we have a session next door, don't we? Oh wait, the gavel battle's in here, I thought it was next door. Sorry, so the gavel battle's in here? Okay, so yeah, we can wrap up, so... Just on Michael's question, a very quick answer. One of the key points is going to be public appetite, because in the UK, we can say certain things as civil servants, but politicians drive the legislative agenda. A lot of this is how public opinion shifts, so that ministers say this is a priority for the next legislative agenda. I don't know what it would take, but that public opinion piece, the political appetite, is one of the key drivers. Yeah, I'd agree with that, and I'd also add that if the scales tilt in terms of the costs for businesses of operating in Australia, that's where we're going to have an appetite to really legislate, but we're not necessarily there yet. Good. I'm just double-checking where it is. Okay, so it seems like it might be in both, which is really weird.
So for those who are interested in policy but want something a little lighter than this, from six to seven we have a session called Chaotic Gavel Battles, where we will be challenging four security, I'm gonna say experts, because one of them is sitting just there, to battle it out in micro-debates on policy-related topics. The audience will vote on the winners, and the losers will drink. So it's gonna be messy, it's gonna be ridiculous, and it's possibly next door or in here, but somewhere; the schedule doesn't say. I think it's next door. All right, thanks very much. Thank you to our speakers. Thank you, everyone. Thank you.