Hey, guys. This is now the 12 o'clock talk. Sorry for the little problem beforehand, but we're dealing with Zoom today. So the track is basically what we call DHS, Rebooting Critical Infrastructure Protection. We have a panel right here, and we have somebody on Zoom on the left-hand side. And we're ready to start. So here we go. Hello, can you guys hear me? Yes, I'm on. Okay. Hi, I'm Joe Marks. I'm a reporter with the Washington Post. I write the Cybersecurity 202 newsletter. This is my surveillance team, for anyone who was watching the last panel. They are Perry Adams, a security researcher who advises government agencies and others. Alex Klimburg, director of the Hague Centre for Strategic Studies, and also the policy lead for DEF CON. Amélie Koran, senior technology advocate at Splunk. Faye Francy, executive director of the Auto-ISAC. And Danny McPherson, chief security officer of Verisign. And we are joined via Zoom, you can't see him but we can, by Clayton Romans, an official from CISA standing by for us. Oh, he's up. Yes, he's up. Hey, Clayton. Okay, everyone. So I want to start by posing a couple of things. It seems as if cyber protection for critical infrastructure is a bigger and scarier issue now than it has been at any point in the past. Biden went to meet with Putin with a list of 16 critical infrastructure sectors, saying steer clear of these, right? The White House, with the help of CISA, is drawing up cyber standards for all of critical infrastructure. They are voluntary right now. They could become mandatory if Congress gives the executive branch the authority to do that. That wasn't happening in 2015, 2016, 2017. So I want to start by just level setting here and asking, and Clayton probably can't answer this question, but the folks on the panel here can: is the answer to this more government regulation and cyber requirements across critical infrastructure, or no? So, just hands up: more regulation.
Do we get any more? We got like half a hand. I think it's nuanced. Okay, so we have no hands fully up. But we'll start with Perry. Explain what nuanced means. Where should we be? So I think when it comes to regulations, right, you have so many disparate industries that have very bespoke, very specific needs when it comes to ICS security. And so there's not really a one-size-fits-all solution. And so I think in some respects we do need regulation, because some industries especially are not going to regulate themselves effectively. But the regulation has to actually achieve its goals. It can't just be there to do something. And so I think that we run the risk of creating an overly broad and generic solution that imposes these sort of onerous requirements that don't actually make sense for specific industries. So yes, in some cases, but it has to be smart. And maybe government can do that, maybe they can't. Sure. Amélie. Yeah, I'd say generally, too, one of the things from policymaking at the executive branch level is, while, as she mentioned, it's generally generic, and every industry is somewhat bespoke, there is a need for some mechanics, because not every organization is equal. So they don't necessarily know where to start. They'll just buy anything that a vendor may say. I'm not speaking as a vendor, but that's typically what happens. But also, you know, just the workforce, the hiring required, you know, kind of measures and levels. So that's one challenge I've always found with regulation: it's there, it's set out, but there's no help with the mechanics of getting there. So there are always aborted starts, or just no starts at all. And Clayton, I want to jump to you now because you're on the government side here and our government representative. So CISA right now is working with the White House on these voluntary standards in some sectors.
Can you give us just some level setting on that in just two, three minutes? Sure, I'd be happy to, and thanks for having me. It's this afternoon for you guys, a little bit later for me here in DC. I want to touch on two of the points that have already been made, and I'll try to baseline where we're at. You know, from a CISA perspective, what's most important is that we are providing as much partnership and capacity and capability and tools to our critical infrastructure owners and operators to protect themselves. And I think we have to get creative. I mean, you mentioned at the beginning, we had a different approach in the first part of the century, we had a different approach seven years ago. We're pivoting now to address this, you know, through potential regulatory means, but we're also looking at creative partnership models that aren't entirely outside of the voluntary space that we've always occupied. And I think that's the unique, you know, value proposition and place that CISA fits, between industry and government, you know, from a U.S. government perspective. You mentioned the voluntary standards that we're looking at for industrial control systems. We're looking at incident reporting guidelines, both in partnership with some of our other DHS subcomponents, but also with the White House. And I think the key, and Amélie mentioned this a minute ago, is that it's not just about compliance. It's not just about telling critical infrastructure or other entities what they need to tell us. It's providing frameworks and tools and, I think, guidelines for how the government can help work with industry to protect themselves and increase resiliency. I think one of the things that we have to be very careful about is what this implementation looks like. It can't just be, here are some standards, you know, we've been doing this for years, take these standards and good luck and get back to us.
We need to truly have a collaboration between government and industry to implement those standards, to learn from them, and potentially evolve them as we work through the process. And I think from our perspective, we plan on doing that closely with some of our sector risk management agency partners across the U.S. government. And we're certainly interested in working outside of the government space to build new models for taking on the same, you know, threat and vulnerability landscape without any of the new regulatory measures that we're considering. Our director yesterday previewed a new capability that we're establishing. My team has been leading this for CISA. It's called the Joint Cyber Defense Collaborative. We're really excited about that. And I really want to emphasize that here, because this is a whole-of-solution type of approach to addressing cyber risk. We need guidelines. We may need new regulatory approaches. But we certainly need to lean into and evolve the partnership model that we have in place to work together in this space. So thank you very much for that. And I wanted to throw out to the group again: why hasn't it worked? Because partnerships between government and industry have been going on for quite a while. And this is not on CISA; CISA is only two years old. But why has the voluntary model not gotten to a place where we're not having, you know, major ransomware attacks against Colonial Pipeline and, you know, major meat processors? Danny? It's a fair question. No, I'm kidding. I think that a lot of the collaboration today is sharing with the government, and you usually don't hear anything back, and that's been a frustration. And I think that, you know, receiving more proactive intel, proactive controls, actual collaboration is an important thing. And so I'm optimistic this will help with some of that.
At the same time, I think that, you know, it's another name for some of these initiatives, and I hope that they can learn from some of the problems of the past and work through some of the regulatory barriers that may exist to help share more information with the private sector, to be more proactive in these areas. Other thoughts on that? Why hasn't the voluntary model worked yet? Why are we still where we are? I think it's that people do the minimum viable required, right? I think, you know, that's the bottom line. There's a lot of liability out there, and, you know, 95% of critical infrastructure is in private sector hands, and it's just some of them that get their arms around that. And I think there probably ought to be a bar to clear if you want to operate that infrastructure. So I'm a fan of knowing what good looks like and aiming at that. And if you don't do that, then there might be some liability associated with that, right? But you also need help, because, you know, the private sector is not alone in, you know, experiencing cybersecurity incidents. Yeah, go ahead, and then we'll move over here. I think that's a strong point to be made. The culture in the US tends to be excessively legalistic, in particular regarding information exchange and other types of activities that people really need to do. And from a private-sector point of view, you're often worried about liability issues and data protection issues. And it's kind of funny, because in Europe, who are definitely big friends of regulation, that particular issue, information sharing, for instance, is actually managed in a way that is sometimes a bit more efficient, because people are not so worried about data protection issues, even though the Europeans are much more hyper about data protection than most American entities are. And I think it comes down to basically being a little bit more flexible with your frameworks.
And that's sometimes quite difficult when you have, of course, you know, a really large system like in the US to deal with, and not like in Europe, where the member states tend to be pretty small and you're able to effectively work things out in a more informal manner. Oh, I was just going to say that it's my understanding that NERC CIP is actually a step in the right direction, at least as an industry group establishing standards for the electricity sector. But I mean, to his point, it's all about smarter regulations, stuff like that. But if you want an example of something that a private industry group has done that has made some improvements, that would be one of them. Yeah, and if I could jump in, just a thought to build on what you're saying, Perry. I think it's about smart regulations. I think it's about smart collaboration. The EU is a really interesting model that we can learn from. You have member states of varying sophistication all coming together and needing similar types of services, and a model that can bring, you know, different levels of partners together. From a CISA perspective, what we're really trying to do as we stand up this JCDC capability is level set for the US government. There are several entry points. We have very sophisticated government entities with different types of capabilities. We have very sophisticated industry partners with an enormous amount of visibility across the US and across the world. But we're also trying to help our state and local partners. We're helping, you know, small businesses that don't have this level of capability. And I think making and building a model that addresses different types of collaboration needs is one of the solutions that we have to find. We have to make it easy for a really large, sophisticated company that can bring a wealth of information to the US government. How do we action that effectively?
And how do we take a coordinated approach to addressing prioritized risks? You mentioned the president's remarks in Geneva and protecting our critical infrastructure. Well, let's take that to the next level. Let's decompose what our critical infrastructure looks like. Let's focus on where we see the greatest risk and where we can partner smartly with other people and other entities to address that risk. Yeah, and just coming from somebody who's had to implement this in various organizations, including critical infrastructure, you know, as part of a security engineering team, I think the biggest challenge, as we kind of lean towards standards or regulations, is that folks who are in leadership positions, and not the individual contributors doing this stuff, see these regulations pretty much as the ceiling and not the waterline. So they go to that and they only go to that. And, you know, getting to the waterline gets you to where you're just floating, you're neutrally buoyant in your security controls. And your resiliency is actually being above that waterline, being a boat, you know, having a hovercraft, or a parasail. So trying to engineer or operate in such a fashion that if you do get hit by ransomware or whatever new threat is coming out there, you can take that punch and still be above that waterline, still operate and still be resilient. Unfortunately, a lot of regulations just get you to the point where everyone's at that even level. But for you to succeed, you need to be above that. And I think that's the next step beyond this. It's, you know, obviously getting organizations who may be small, maybe underfunded, you know, this whole security poverty line, like Wendy and Andy wrote about many years ago, and, you know, getting them to at least that standards point, getting them beyond the ceiling, getting them above that waterline.
I think that's going to be the next challenge to all of this. But, you know, obviously, as you mentioned about CISA helping with some of those tools, it's using that as a foundation and building from it. Okay, Faye. Yeah, so as an ISAC, an Information Sharing and Analysis Center, we've had a public-private partnership with DHS for many years, since 1998. And I will say there have been several successes there, so I don't want to throw that completely away. There have been incredible learnings, working together, collaboration, and really the takedown of some serious activities that would not have occurred without that public partnership. Having said that, that doesn't mean the model is right, right? The model is not necessarily right for what the current times are. As we've seen, the threat actors have gotten much more sophisticated, much quicker than, say, the private sector has. And that is one of the challenges that we have. The private sector wants to do this, right? It's not that they don't want to do it. It's just that they have not been able to keep up with the sophistication that many of you are aware of, right? They just haven't. And that's part of the training and needs that the private sector has. One thing that seems complicated about this is that CISA's work for the last two years has been very much focused on collaboration and working with industry: we will keep your breach secret, we won't disclose any information. And I think there's a lot of anxiety about pivoting from that to being a regulator and enforcing requirements, and that it will be tough to do both. Faye, as someone who's worked with both regulators and CISA, do you have thoughts on that? Yeah, of course. And if you want to say anything, Clayton, also. So one of the things that agitates my members the most is regulation, of course. And for good reasons.
So regulations, as we've already talked about here on stage, can be very onerous, and cannot necessarily get you above that waterline. And so they have to be smart regulations, right? What does that look like in a partnership? It's a little difficult to say, right? So I think there needs to be some real good thought put into that. On the other hand, we know that without regulation sometimes nothing happens, right? We understand that. But I will say that, having been in two major industries, both aviation and automotive, I've seen that both industries really care, right? They don't want their customers to be hacked. They don't want anybody to be hurt. They don't want anybody to die because of something that's happening out there. And so they do work to understand it, but they don't understand the asymmetric threat that many of you do and how sophisticated this threat has become. And so there is a need for more training and awareness and education around this topic, cyber workforce development. We've seen what Jen Easterly just came out with yesterday. Many of these topics are really important to the private sector. And I can tell you that the private sector works very hard to mitigate these risks. They're just not at the same level as the threat. And that is, I think, the real big problem. Clayton, did you want to say something about that rough balance between partnership and the whip? I do. And I think, you know, it goes back to what Perry said at the beginning about nuance. It is a balance. I don't think that partnership and regulations or guidelines are mutually exclusive. I think we've tried different approaches, and we're learning that part of this is about empowering owners and operators to take the right steps to protect themselves. From a CISA perspective, we need the best visibility into threat activity and incidents as they occur if we're going to protect the broad national community. And so we're looking for new ways to do that.
But there's obviously a balance we have to strike. Incident reporting is part of it. Baseline standards are part of it. You know, as we increase the compliance requirements there, we have to be very careful about what that does to our voluntary approach and how we work with industry to actually define what those requirements are. So I think it's all about the balance, and at the end of the day, we think we're striking that now. And the ultimate goal that we have, though, is to bolster the work that we're already doing in the partnership model that we have. I don't think they're mutually exclusive. So one critical infrastructure sector I think about a lot is elections, because I spent, you know, the last two years focused on it. And that's an area that is as big and complicated and unprepared for this as any you can imagine. And over the course of four years, they made really huge strides. And it seems like that happened partially because of a great deal of terror and concern about Russian involvement in 2016, a lot of government pressure, a lot of public and media focus, and a sort of genuine general push that forced them into that situation. Is that possible in other critical infrastructure sectors? I mean, could pipelines become scared enough by Colonial, agricultural companies scared enough by JBS? Is there enough of a drive to make that happen, to raise that level so they're at the level of the threat, or are we not that terrified yet? Alex. That's a general question. Again, I'm going to give a European example here, because I think there's also just a bit of a different general view that has also been tried in part in the US. And I think it's not bad to kind of think about going back to it a little bit. And that starts with really basic concepts like duty of care.
So you really impose a certain amount of duty of care on, like, critical infrastructure providers, or in Europe that includes digital service providers as well, which is an interesting concept. Then the next step is, their job is to avoid gross negligence, right? What does that mean? Like, really screwing up. And if you're going to talk about gross negligence, you have to apply basically basic standards. That's the next step. All of this, by the way, is in the Network and Information Security Directive. And if you have a company in the US and sell services to Europe, then basically you are hit by the Network and Information Security Directive, so you should really find out what it means. Because you have, number four, a duty to report incidents, anything that's substantial. And you have to do that to the member state government, the CSIRT that's responsible for you. And if you don't, you get to the last point, which is punishment. So basically, if you end up being grossly negligent and you don't report breaches, then it used to be that the member states, the European governments, could inflict their own punishments on you. But now, under the new Network and Information Security Directive, it will be up to two percent of your revenue. And they will do it. So this stuff is a slowly increasing level of regulation, where it takes time. It's not all at once. It's based on basic fundamental principles, and it gives the industry time to sort itself out. So the minimum baseline protection always involves best practices in your particular industry. They're never prescribed. They don't say you have to do this. It will always be, say, ISO 27001 in our industry, but there are subcomponents of that, et cetera, et cetera. So I think that's something to keep in mind. The U.S. tried, I think, for a while to do mandatory breach notification requirements across the board, and moved off that, and went on to it in specific sectors.
But it's a really helpful step in this avenue of slowly increased smart regulation, rather than trying to do everything at once. Just one second to follow that, and then we'll get back to the bigger question with elections. Go ahead, go ahead. I was just going to say, in order to do breach notification, you have to actually be able to identify the breaches when they happen, and many companies are just more than happy to be in the dark on that one. So back to the broader question, though, Perry: is it going to take a Russian-interference-in-the-2016-election-level event to force enough concern from other critical infrastructure sectors that they self-regulate? Given the way that ransomware attacks have been increasing, and the industry is only becoming more sophisticated and more profitable, I expect that that may serve as a powerful impetus for this. But I do think that it's going to come with smarter regulations. I don't think the industries are just going to do this on their own. And just a quick point: obviously elections was not actually its own critical infrastructure sector before 2016. So, you know, that wasn't on the radar as such. So it's not a question of necessarily having everything spelled out in advance, but having principles that we could use to slowly address new issues as they come up. Oh, do we have a hand? No, I didn't have a hand up, but I can say something to that. I do worry with some of this that, you know, when you have that threat of regulation, you're doing more compliance-based security versus risk-based. And you could have had every compliance control and every product on the show floor at RSA deployed and operating properly, and SolarWinds would have happened. Yeah. Right? Or ransomware in some environment, maybe, right? So, I think to your point, though, wasn't ransomware, you know, one of the leading headlines in the impetus for the JCDC, right?
Isn't that, you know, one of the things we've got to collaborate on? And so I think that's an example. SolarWinds is an example. So it's happening. It's the same as the Russian stuff: here's the threat, how do we deal with that? How do we collaborate to be more effective to prevent it in the future? So. Go ahead. Oh, I just want to go back to the question of the European system. Why is the U.S., in your opinion, not as good at doing regulation and doing it smartly? The French would love that question. Yeah. What's wrong with America? Our regulations don't have a sexy accent. That would be it. So President Macron has been saying this for a couple of years now: that Europe does not want a Silicon Valley Internet, it does not want a Beijing Internet, it wants a European Internet. So the whole idea is, they've discovered, Europe thinks, that regulation is its superpower. That's its value added, which is always slightly amusing in many circumstances, but it actually also provides some kind of risk management to corporations when they're trying to do business. So I think one thing they have going for them is that, since the EU itself is even less powerful than the federal government in the US, they have to do a lot more negotiation with the member states. And their biggest challenge in Europe has always been getting everybody up to a basic level that they all can work from. I mean, they had to start with building GovCERTs and CERTs basically to even have points of contact. That was in the first Network and Information Security Directive, right? But now, to give you an example of how far they've come in five, six years, I mean, all of you know the General Data Protection Regulation, GDPR, which is a regulation people love to hate, but again, it gives people some predictability. It's been used everywhere.
It's the kind of thing that, basically, yeah, maybe it's the worst thing around, but at least it's something, and people work with it. So now the thing that's going to be really important is the Cybersecurity Act implementation. The Cybersecurity Act actually sets out certification for ICT products and services. And that is a really, really big topic. And this is perhaps hard to imagine, but none of it is mandatory. And it probably won't be mandatory for at least a decade or more. So it's very, very slow, step by step, where they're starting with, for instance, certification for cloud services. And they're going to move up to security services, et cetera, et cetera. So the whole idea is to do this slowly, to start it on a voluntary level, to connect it with principles like duty of care and similar. And basically, it's bottom up. It's always bottom up in this case. And then there's always the understanding in industry that at one point the regulations are going to be turned up. So you can't only have a voluntary model in that space; that's their belief. But they want to find systems that are going to be acceptable to, effectively, the regulators. So therefore they plan ahead, and the system more or less results in smoother implementation. So in the last five years, they've gotten really far. Not necessarily everyone will think that's a good idea, but that's basically how it's turned out. I would say that Europe versus the US versus Asia is not that different, right? We are a global, you know, community. And what we're doing in the Auto-ISAC is, we had a global task force that actually looked at the European region, the Japanese region. Why? Because most of my members are from those regions. And so we are actually working very closely on standing up a European office, and the Japan Auto-ISAC has already stood up. We're partnering with them.
DHS does a lot of this collaboration with their counterparts in those regions as well. And I can tell you that every one of my member companies wanted us to be a global ISAC, because they have operations in all those different regions. So, and I know maybe Clayton can talk a little bit about what DHS does in collaborating with their counterparts in those regions. Sure, I'd be happy to. And I think it's a really good point you made, in terms of, you know, there are certainly nuances across different regions of the world in how we approach regulation and how we implement it. I think there are things we can learn from our European counterparts. And I think there are things that don't quite scale or fit in the same way in a U.S. ecosystem. And some of that we have learned from the way that we've built our information-sharing channels and collaboration with our respective national counterparts across the world. So, Japan is a really good example. One of our closest bilateral partners; we work with, you know, their computer security incident response teams, their CERTs, their national CERTs. They have a couple of different entities that we work with. We work very closely with many of our European counterparts, with our Five Eyes partners, with some others in the East Asia-Pacific region, in Singapore. And the key across all of these partnerships is that each of our respective countries has a different regulatory framework and a slightly different approach to how they approach critical infrastructure protection. But ultimately, however we're getting the information, and that applies to the U.S. as well, we're all in the business of computer network defense. We're trying to protect critical infrastructure. We're using information in the same way in this community. It's a trust-based partnership. It's how we have to work domestically. It is how we have to work internationally.
So, you know, Japan, for instance, has a more complex, but certainly different, approach to regulation than the U.S. does. So too Singapore, another really close partner in the region. But ultimately, as we come together to work with them to counter some sophisticated nation-state actor activity in their part of the world, we're taking the information that they have and the information we have, and we're fusing it, and we're using it in the same way, and it's for protection. Amélie, and then Danny. Yeah. So I think one of the things that we tend to not touch on at this point, obviously talking about regulation as a way to encourage better behavior, is that a lot of the time the consequence for going against regulations is a fine. And unfortunately, what we've not seen is that financial leverage used to actually encourage better behavior. The fines are usually too low. You know, compare that to what you're going to pay out to recover from a ransomware attack. I mean, you know, obviously, like with the pipeline, I made a joke about this on Twitter: you suddenly found four and a half million dollars to pay the ransom, but your security team only needed two to actually secure it. Why didn't this happen? And, you know, with that, it's this weird perverse incentive. Are you going to pay for the recovery and all the things there and take the hit there? Or are you worried about, you know, a hundred-thousand-dollar regulatory fine? Are you going to keep doing the same thing you're doing? And I think also, like, cyber insurance itself, which I consider kind of a bad word now and then. That also, you know, has been used as a mitigation tactic, but it's not necessarily been encouraging better behavior either. I think there was a story a couple years ago about a bank that got hit twice. And, you know, they went to go back and get an insurance policy again, and the insurer finally said no.
But there are other places that will just be willing to take your money, and you can keep having that bad behavior regardless of whatever regulations you have on your head. Danny? Yeah, I agree with most of that. So, yeah, Alex, I prefer a global internet, by the way. I do think one of the other things to consider is sort of the patchwork of regulations that exist, right? Be they privacy or cybersecurity or breach-disclosure related, right? I mean, to operate in certain countries today, you obviously have to commit to disclosing certain types of incidents at a certain time, and what constitutes a breach and what doesn't, versus an incident that was contained, and so on, versus the patchwork of privacy regulations as well. So I think that as we find our way through that, it's going to be a little problematic for a while. A lot of the cycles and resources of good security people and GRC people and the like are going to be working on that, as opposed to more security-related stuff. So I think there's a double-edged sword there. It's a really important point that you just made, and I want to take it one step further and share that, for instance, in standards, we've seen, I call it the hair dryer effect, right? So when I used to go to Europe, and I have long hair, right, the hair dryer that I carry doesn't work over there, right? Because the plugs aren't the same. A very simple example. With standards now, in automotive in particular, we've worked on a standard here in the US for cybersecurity for automotive. But what happened is we said, wait a minute, let's make sure that we are consistent with what's going on in Europe and in Japan. So now it's an ISO/SAE standard, right? So, really important. I see that shift happening. I think that's really good news. Faye, you said earlier that the companies care a lot. They're getting better. They just aren't up to the level of the threat.
Yeah, because the threat keeps increasing. Is there a way, just in your narrow sector, what would it take to get them up to the level of the threat? And what should government do? What does industry have to do? How do you get there? Oh, I think there's a multitude of activities, a lot of which we've already discussed here, but clearly education and awareness is job one. Understanding what it means and what you can do about it takes skills, and those skills may not actually reside in those companies. And we all know that we have a workforce challenge, right? So workforce development has to be an element in there. And then it's also having it as a scorecard on the CEO. We talked a lot about that in the past. I think they get it now, right? There have been enough activities and enough fallout from many other incidents where people in corporations, of course, have lost their jobs because of poor cybersecurity. Yet we still can't blame them as being the victim only. We also have to recognize there is an additional investment that we must make to help them get over the hurdles of keeping up with what I call a very asymmetric and emerging threat. And I would just also say that building up that institutional knowledge and the know-how to secure these networks is non-trivial, right? Right now we have quite a bit of institutional knowledge on, say, IT security. But ICS security is usually much different and can also be very, very sector-specific. Some sectors have better ideas of how to secure their systems than others, and some are still very nascent. And that takes time to build up. You also need people who know how to do those things and can actually implement them effectively. So to your point earlier, like, oh, it could just cost you $2 million to secure your networks. That's probably true, but you also need the people who know how to do that. And if it's a business network, you can find those people.
ICS networks, more complicated. Excellent point. I would probably plug one more. I think some of the work like the NIST Cybersecurity Framework and the CIS control sets, understanding what's relevant in your environment, is important. But then you've got to operationalize it and put sustainable capability in place to keep that going. And then the other things like the NIST work, or the NTIA work on software bills of materials, which I know a few folks up here have been involved with, for supply chain integrity, that kind of thing. I think those are all moving the needle, but I'm not sure it's quite fast enough. Yeah, and I think if any of you out there are individual contributors doing threat intel and so forth, you definitely know that when you've finally gotten a feed or whatever that has proven to be absolutely useless, you lose trust in it. So anything regarding information sharing requires a consistent quality, a trust model that's built up around that, and I don't think we've gotten anything that we can consistently rely on. Then we get into these informal networks, which then require more and more coordination. And I think that's our challenge: to have high-quality, trusted information that is consistently reliable to help with those emerging threats, as well as to be able to understand where we stand in our landscape, where we are in time, and so forth. How is government doing on information sharing? There was a law in 2015; they were going to share all this information. How did that go? I'll say they do pretty good, but... Well, as somebody who did the markup on that when I was at the White House, one of the problems was that when the original draft in 2014 came down from the Hill, it really didn't reflect the kind of mechanics required for information sharing. It still kind of doesn't.
There's still some holes in the law that I've pointed out to DHS lawyers in the past regarding liability, even though that's supposed to be one of its big strengths. But, you know, again, it's a loose framework. The weird thing was that I eventually moved over to Treasury, and they couldn't really convince agencies to become part of it, because they just didn't have a standard, they didn't have a really easy, followable method, and none of the agencies trusted one another, which is really bizarre within the federal government. You know, just totally unheard of. Who would think? Who would think? No, that's not bizarre. Just on the information sharing side, everyone here probably realizes that there's quite a lot of informal information exchange that we all rely upon, and the challenge has been, in 2015 with the Cybersecurity Information Sharing Act and trying to establish the ISAOs and similar, trying to find a way to basically protect those networks while at the same time not killing them. And I don't know how far or how successful that effort has been in the U.S. One of the challenges has always been, for instance, in the U.S., how do you share classified information with non-cleared individuals, and how do you get out of that bubble easily, especially in an emergency? And the flexibility that some of the European countries have in dealing with those issues is a real advantage, probably because they're so much smaller, right? For instance, in the Netherlands, 15 million people, they have a cybersecurity council that basically includes CEOs of some of the major companies, universities, technical researchers, and in a major emergency they can advise the national crisis management, but they're also an open access point for hackers. Literally, people who have no connection to anything can basically say, hey, I found this really great exploit. Hey, I have something you really need right now.
And they can basically use these informal networks in a formal way. And that kind of challenge, how do you really take the benefit of these informal networks and not kill them with undue legal obligations or scrutiny or whatnot, I think is a big challenge that everyone faces. Yeah, we don't have a national break-glass-in-case-of-emergency kind of protocol. More or less, it's spin up whoever you know and get on the phone or email or Discord or wherever. So, I mean, sorry to pile on, but I think some of the sharing, if I were grading it, I'd probably frame it as less bad. Some of the targeted intel, some of the threat indicators, some of the primitives on controls that you can deploy to mitigate, say, propagating ransomware in an environment or something like that. They're putting a lot more information out there than they did traditionally. And I think there's still work to be done, and there's certainly a long, long tail on the capabilities of people to consume that, even in critical infrastructure. But I do think that it's less bad. If I could jump in too, and I apologize, I lost audio for a minute here, so I'm assuming it was just tons of compliments about how great the government has been for the last decade and more. But from my perspective, ISACs are a creative solution. There's no one-size-fits-all. Certainly one thing the government can continue to do better is how we operationalize sometimes very classified information into mitigation guidance and useful information that non-cleared partners can use. A lot of what we're seeing, you know, we understand threat activity better with classified information, but you don't need that classified nuance to take action. And that's really the role that we play at CISA, one that we're trying to continue to optimize over time.
And with the establishment of this new capability, the JCDC, that's really intended to address the gap that we're all talking about. What's the break-glass protocol? What's the planning we have in place when another SolarWinds happens or a Microsoft Exchange vulnerability occurs, or, you know, one of the many things that have happened this year, the ransomware incidents that continue to multiply over time? So we have some of that methodology and doctrine in place from a U.S. government perspective, but there's opportunity to grow it and systematize it and standardize it and make it easier for industry and others to work as part of that framework. So let's tie together some of the policy that exists from the White House, let's explore potential new regulatory approaches, and let's implement a unified way to do planning to address these types of scenarios. So when we have the next big cyber emergency, or even just as a baseline for dealing with regular threat activity, how do we optimize and implement the lessons learned that we're all talking about? Clayton, can you run through for us in a little more detail what the JCDC is going to do? Are you guys going to all get together once a week, once a month? Is it going to be a chat room? How does this work? I'm certainly hoping for at least a coffee bar, but we're still working out those details. You know, the JCDC is really kind of the culmination of a lot of the public-private collaboration that we've already had underway at CISA for several years, and it really brings together two different public-private collaboration pieces. One is this systematic approach to addressing risk that will be informed by all of our partner sets through joint cyber planning, and that's kind of what I was just talking about. That's something that we've done through elections, as you mentioned earlier.
That's one of the reasons we found so much success there, and then it's fusing it with our real-time operational collaboration and information sharing. So we have a number of channels in place with different partners. How do we enrich all of our partnership work, bringing together those channels with this whole-of-government approach to different national critical functions, for example? So the JCDC provides a framework so that we can work closely with the intelligence community and our Department of Defense partners; sector risk management agencies will all be part of this, bringing in industry, bringing in state and locals, and bringing in international partners, all into a common and transparent understanding of the priorities that the U.S. government has when it comes to cyber risk. So I'm going to call for audience questions in just a minute, but you wanted to talk, Amelie, about the cyber NTSB? Yeah, so kind of what Clayton mentioned a little bit on the coordination stuff, and this was mentioned in the executive order as well. I forget the actual term they've used, but there was some work done with NSF and the Belfer Center in regards to a cyber NTSB, basically like your National Transportation Safety Board after-action reports. A lot of the discussions there, especially when we're talking about collaboration, is a learning model. Like how Boeing would learn from their mistakes, or Airbus or whomever, with a crash or some other mechanical issue, but on the cyber side of the house, the goal is to make it so that people are more willing to share that background information, the more detailed information, and then help the community or the industry or the sector as a whole. One of the big discussions among the working group was to not have this be punitive. A lot of times, if you're going to have it run by a particular government agency, you're afraid that they're going to knock at the door. It's going to be tied to a regulation. They're going to fine you.
You're not going to be willing to share information. The cyber NTSB was kind of the idea of: just please share it, so that the rising tide raises all boats with our knowledge and awareness. And that was one of the really cool things I thought was coming out of that work, and I definitely think it could be piloted a lot by the critical infrastructure sectors and ISACs. Just an idea, because I wanted to give another example of something we did in aviation along those lines. We did something on safety. We worked with MITRE as a third-party anonymizer of the safety breaches that folks would see just in a normal operations day. And that allowed both folks like Boeing as well as the airlines to address those safety issues without penalty. And with that came a reduction, literally a reduction, of safety incidents in the industry. So it really makes a difference. And you asked at the beginning why there are still ransomware attacks. I think that if our approach is "no more," like there will never be any more ransomware attacks in the near term, that's just completely unrealistic and sets up a sort of Sisyphean challenge. And you want people to feel as though they can make reasonable gains in this area, and that it's okay if they get hacked; there are processes to handle that. So an iterative approach, I think, is more necessary. You don't want CISA to face a Sisyphean challenge? Pun intended. Audience questions, anyone? Straight back there. And I'll repeat it. Yeah, you, you, you. [Audience question, partially inaudible, about how fines tied to revenue would be laid out.] Okay, I think I get it. So the question is: if we're going to have a model like the Europeans do, where the bad things you do equate to the fines you pay, how do we get into that?
And are we looking at the amount of money you lost, the amount of data you lost, whether it affected people's lives, et cetera? Alex? You want to take a couple? Or... all right. Fine, I'll just do one. Yeah, just really quickly. Two percent of revenue is the maximum fine that you would pay. But the basic notion is simply gross negligence. So it doesn't actually say how good you are. It just says you were terrible. If you were terrible, then you have to pay. No one's going to go, at this stage, and say you're good, you're medium good. The next step in this journey was to basically talk about assurance levels, and they are now defining three assurance levels. So what counts as terrible depends upon the assurance level that you're in for that service, right? And the step after that will be even more interesting. The second point you raised is how damage is sometimes calculated. All CIP programs that I've looked at, and I've looked at the ones in the U.S. and most of the ones in Europe, they all have that in terms of what is bad, you know, the cost in terms of utilities and damage and loss. Those might play a role in trying to figure out the punishment, or how badly the company has performed. The problem is a lot of that information, including in the U.S., as far as I know, is classified. So they probably wouldn't be able to use it in a court setting when trying to figure out how much, exactly, you screwed up. Okay, so, um, right nearest me with the big hand, high up, make it real quick. Clayton, do you want to take that? You're not here, although there's a good reason this year, so we won't blame you for it. I can't hear the question, otherwise. Oh, yeah, I'm sorry about that. So the issue is there are good things the government puts together: the GridEx exercise, the Cyber Storm exercise, things like that.
A lot of people don't know about them, the right people aren't there, et cetera. How do you get industry and government actually meeting together effectively? Is that fair? I think it's a huge challenge. I mean, honestly, it's forums like this. One of the things we have to do is make sure that we're talking to the right people who are talking to the right people, getting in the right communities of trust and leveraging the right channels. Again, not to get another plug in for the JCDC, but that is really what we're attempting to do with this new capability: fuse together a lot of what the U.S. government has in different pockets in a holistic way. So that's a messaging opportunity and challenge for us, but we really want to work with and through industry to bring in partners who we might not be reaching yet. I wanted to jump back quickly to something Amelie brought up: cyber insurance. Why didn't that work? Back in 2014, when I was first covering this stuff, everyone said, oh, eventually the insurers will develop models and they'll have all of their metrics and their data, and they will require you to have sufficient protections before they write the policy, and it'll be just like fire insurance and it'll be great. And it seems that hasn't happened. What went wrong? Asymmetric emerging threat. Okay. You know, as part of the cyber NTSB work, we did have some underwriters on, and it is extremely complex. As you mentioned, with the emerging threats you're constantly having to update your actuarial models and so forth, which is much what Perry mentioned. Every industry has a little bit of boutiqueness to itself in there. So it's really challenging to write, you know, a new insurance policy for your business or organization, because it changes all the time. So yeah, it can work. It's just, you can't keep up with the speed.
It takes time. I mean, if you look at the fire industry, that's where insurance started, right? Chicago, whatever it was, the beginning of fires and so forth. And initially they didn't have all the fire regulation that they have today, right? But now we all have to have the fire extinguishers in place. That took a hundred years. So, you know. And fire doesn't really change. It's fire. It burns things. Exactly. Depends on the fuel, though. CISA can take on a Promethean challenge, I suppose. And on that, thank you everyone for joining. Thanks to the panel. Thank you.